Saturday 12 November 2011

History of algorithmic trading

Computerization of the order flow in financial markets began in the early 1970s. Landmarks included the introduction of the New York Stock Exchange’s “designated order turnaround” system (DOT, and later SuperDOT), which routed orders electronically to the proper trading post to be executed manually, and the “opening automated reporting system” (OARS), which aided the specialist in determining the market-clearing opening price. Such systems were early steps toward what is now called smart order routing (SOR).

Program trading is defined by the New York Stock Exchange as an order to buy or sell 15 or more stocks valued at over US$1 million in total. In practice this means that all program trades are entered with the aid of a computer. In the 1980s, program trading became widely used in trading between the S&P 500 equity and futures markets.

In stock index arbitrage a trader buys (or sells) a stock index futures contract such as the S&P 500 futures and sells (or buys) a portfolio of up to 500 stocks (which can be a much smaller representative subset) at the NYSE, matched against the futures trade. The program trade at the NYSE would be pre-programmed into a computer to enter the order automatically into the NYSE’s electronic order-routing system whenever the futures price and the stock index were far enough apart to make a profit.
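The core of such a strategy is the comparison between the traded futures price and a theoretical fair value. A minimal sketch using the standard cost-of-carry formula follows; the function names, the cost band, and all parameter values are assumptions for illustration, not any particular firm's system:

```python
import math

def futures_fair_value(spot_index, r, dividend_yield, t_years):
    """Cost-of-carry fair value: F = S * exp((r - q) * T)."""
    return spot_index * math.exp((r - dividend_yield) * t_years)

def index_arb_signal(spot_index, futures_price, r, dividend_yield,
                     t_years, cost_band):
    """Signal a trade only when the mispricing exceeds the cost band."""
    fair = futures_fair_value(spot_index, r, dividend_yield, t_years)
    mispricing = futures_price - fair
    if mispricing > cost_band:
        # Futures rich: sell futures, buy the stock basket.
        return "sell_futures"
    if mispricing < -cost_band:
        # Futures cheap: buy futures, sell the stock basket.
        return "buy_futures"
    return "no_trade"
```

The cost band stands in for commissions, market impact, and the tracking error of a reduced stock basket; in practice it is what decides whether an observed gap is actually tradable.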

At about the same time, portfolio insurance was designed to create a synthetic put option on a stock portfolio by dynamically trading stock index futures according to a computer model based on the Black–Scholes option pricing model.
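A minimal sketch of the computation behind such a synthetic put, using the closed-form Black–Scholes put delta: the hedge is to short futures in proportion to the delta's magnitude. Helper names and all parameter values are illustrative; a real portfolio-insurance program would also handle rebalancing frequency and transaction costs.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_put_delta(spot, strike, r, sigma, t_years):
    """Black-Scholes delta of a European put: N(d1) - 1, in [-1, 0]."""
    d1 = (math.log(spot / strike) + (r + 0.5 * sigma ** 2) * t_years) \
         / (sigma * math.sqrt(t_years))
    return norm_cdf(d1) - 1.0

def futures_to_sell(portfolio_value, spot, strike, r, sigma, t_years,
                    futures_multiplier):
    """Number of index futures contracts to short to replicate the put hedge."""
    delta = bs_put_delta(spot, strike, r, sigma, t_years)
    notional_to_short = -delta * portfolio_value  # fraction of portfolio hedged
    return notional_to_short / (spot * futures_multiplier)
```

As the market falls, the put delta moves toward -1 and the model calls for shorting more futures; this "sell into a falling market" dynamic is exactly the feedback mechanism the next paragraph's critics pointed to.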

Both strategies, often simply lumped together as "program trading", were blamed by many (for example by the Brady report) for exacerbating or even starting the 1987 stock market crash. Yet the impact of computer-driven trading on stock market crashes remains unclear and is widely debated in the academic community.

Financial markets with fully electronic execution and similar electronic communication networks developed in the late 1980s and 1990s. In the U.S., decimalization, which changed the minimum tick size from 1/16 of a dollar (US$0.0625) to US$0.01 per share, may have encouraged algorithmic trading as it changed the market microstructure by permitting smaller differences between the bid and offer prices, decreasing the market-makers' trading advantage, thus increasing market liquidity.

This increased market liquidity led institutional traders to split up orders according to computer algorithms in order to execute them at a better average price. These average-price benchmarks are measured and calculated by computers, either as the time-weighted average price (TWAP, i.e. the unweighted mean) or, more usually, as the volume-weighted average price (VWAP).
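Both benchmarks are simple to compute from a stream of trades; a minimal sketch (function names are illustrative):

```python
def twap(prices):
    """Time-weighted (i.e. unweighted) average of evenly spaced price observations."""
    return sum(prices) / len(prices)

def vwap(trades):
    """Volume-weighted average price from (price, volume) pairs:
    sum(p * v) / sum(v), so large trades pull the benchmark harder."""
    total_volume = sum(v for _, v in trades)
    return sum(p * v for p, v in trades) / total_volume
```

An execution algorithm targeting VWAP will typically slice a parent order so that its child orders track the market's expected intraday volume profile, rather than trading at a constant rate as a TWAP-targeting algorithm would.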

As more electronic markets opened, other algorithmic trading strategies were introduced. These strategies are more easily implemented by computers because machines can react more rapidly to temporary mispricing and can examine prices from several markets simultaneously. Examples include Stealth (developed by Deutsche Bank), Sniper and Guerilla (developed by Credit Suisse), as well as arbitrage, statistical arbitrage, trend following, and mean reversion.
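As one concrete example of the last category, a generic textbook mean-reversion rule (not the logic of any of the named proprietary systems) trades when the latest price sits far from its recent average, measured in standard deviations; the window and threshold below are arbitrary illustrative choices:

```python
from statistics import mean, stdev

def mean_reversion_signal(prices, window=20, z_entry=2.0):
    """Sell when price is z_entry std devs above its rolling mean,
    buy when it is z_entry below, otherwise hold."""
    recent = prices[-window:]
    mu, sigma = mean(recent), stdev(recent)
    if sigma == 0:
        return "hold"  # no dispersion, z-score undefined
    z = (prices[-1] - mu) / sigma
    if z > z_entry:
        return "sell"  # price rich vs recent history: bet on reversion down
    if z < -z_entry:
        return "buy"   # price cheap vs recent history: bet on reversion up
    return "hold"
```

The bet embedded in the rule is that the deviation is temporary mispricing; the same machinery with the signs flipped gives a simple trend-following rule instead.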

This type of trading is what is driving the new demand for low-latency proximity hosting and global exchange connectivity. It is imperative to understand what latency is when putting together a strategy for electronic trading. Latency refers to the delay between the transmission of information from a source and its reception at a destination. The lower bound on latency is set by the speed of light in vacuum, about 3.3 milliseconds per 1,000 kilometres; in optical fibre, where light propagates at roughly two-thirds of that speed, the figure is closer to 5 milliseconds per 1,000 kilometres. Any signal-regenerating or routing equipment introduces additional latency on top of this baseline.
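The propagation bound follows directly from the definition of the speed of light; a small sketch, taking a refractive index of about 1.47 as a typical value for single-mode fibre:

```python
# Speed of light in vacuum, expressed in km per millisecond (~299.8 km/ms).
C_VACUUM_KM_PER_MS = 299_792.458 / 1000.0

def propagation_delay_ms(distance_km, refractive_index=1.0):
    """One-way propagation delay in milliseconds.
    refractive_index=1.0 gives the vacuum lower bound;
    ~1.47 models light slowed inside optical fibre."""
    return distance_km * refractive_index / C_VACUUM_KM_PER_MS
```

For a 1,000 km path this gives about 3.3 ms in vacuum and about 4.9 ms in fibre, one way; round-trip figures double, and real routes are longer than the geodesic, which is why microwave links and co-location both exist as latency-reduction strategies.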
