Algorithmic Trading and Transaction Volume Implications
Algorithmic trading harnesses the power of computer science and quantitative finance to trade large volumes of securities in a way that is, ideally, imperceptible to the wider market. As The Times Online of London reports in the article “Traders turn to stealth to stalk world markets” by Nick Hasell, the best way to remain anonymous is to slice large orders into many small ones and drip-feed them into the market's liquidity at intervals governed by a software program: “The most frequently used analogy is of dropping hundreds of pebbles into a lake rather than a boulder, creating faint ripples rather than a big splash.”
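The order-slicing idea the article describes can be illustrated with a short sketch. This is not any firm's actual algorithm; the function name, the child-order size range, and the randomization are all illustrative assumptions chosen to show the "pebbles rather than a boulder" principle:

```python
import random

def slice_order(total_shares, min_child=100, max_child=500):
    """Split a large parent order into many small child orders.

    Returns a list of child-order sizes that sum exactly to total_shares.
    Sizes are randomized (a hypothetical choice here) so the stream of
    small orders is harder for other market participants to detect.
    """
    children = []
    remaining = total_shares
    while remaining > 0:
        size = min(remaining, random.randint(min_child, max_child))
        children.append(size)
        remaining -= size
    return children

# A 100,000-share parent order becomes hundreds of small "pebbles"
child_orders = slice_order(100_000)
assert sum(child_orders) == 100_000
```

In practice the slicing engine would also schedule when each child order is released, typically pacing them against observed market volume, but the sizing step above captures why average trade sizes shrink while trade counts soar.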
This breaking up of larger orders into many smaller ones has caused the average stock trade in both America and London to shrink in value while the number of transactions has soared. As the article states, “Investors are trading smaller blocks of stock more frequently. The average trade size on SETS, the London Stock Exchange’s electronic order book, has fallen by two thirds over the past five years, from £61,760 to £20,145. Over the same period, the annual volume has multiplied more than fivefold, from 8.6 million trades to 46.9 million.” (For American readers: the screen-based SETS replaced the telephone-based, quote-driven SEAQ system in 1997.)
Peter Sheridan, head of algorithmic trading for Europe at Goldman Sachs, is quoted as saying, “We want to make ourselves as invisible as possible.” As a result, “some algorithmic trading programs will buy back quantities of the shares that they have just sold — or vice versa — to cover their tracks. Such techniques, pioneered on Wall Street, have been embraced in London. Goldman Sachs, Credit Suisse First Boston, HSBC and Morgan Stanley all have dedicated algorithmic trading teams in London… Mr. Sheridan highlights two reasons for algorithmic trading. First, fund managers must focus on transaction costs after the Myners report for pension funds said that stockbrokers should “unbundle” the cost of trading from that of investment research within their fee structures. Second, the European Union’s Markets in Financial Instruments Directive (MiFID) is due to come in next year and its “best execution” tenet means that stockbrokers must show that they have achieved the best price for their clients.”
The article also references The Tabb Group, the Boston-area financial consultancy, which estimates that “11 per cent of American share trading in 2005 was handled by algorithms and predicts compound annual growth of 34 per cent over the next two years. Estimates for London suggest that, while the equivalent figure may be 5 per cent or less, the rate of growth should at least match that in the United States.”
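To see what the Tabb Group's figures imply, compounding the 2005 share forward at the predicted rate is straightforward arithmetic (the projection below is our extrapolation of the quoted numbers, not a figure from the article):

```python
def project_share(initial_share, cagr, years):
    """Compound a market share forward at a constant annual growth rate."""
    return initial_share * (1 + cagr) ** years

# Tabb Group figures: 11% of US share trading in 2005,
# growing at 34% per year for two years
us_2007 = project_share(0.11, 0.34, 2)
# 0.11 * 1.34^2 is roughly 0.197, i.e. nearly 20% of US trading by 2007
```

On the same assumptions, London's 5 per cent share growing at the same rate would still be below 10 per cent in 2007, which is why the absolute gap with the United States persists even if growth rates match.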
Toomre Capital Markets likewise expects algorithmic trading to grow in all markets in which there are liquidity pools. Hence, one should expect further growth in stock markets around the world, the fixed-income markets and the foreign exchange markets. As the volume of the resulting trades increases, there will be even more pressure on legacy systems and a greater need to process transaction data in near real time. In such an environment, what do you think is needed in the future of real-time enterprise risk management? We have written a monograph that expands on our views and welcome your comments.