Algorithmic Trading On-Going Expense Surprises Some Dealers
On September 25, 2006, The Wall Street Journal published a (subscription-only) article entitled Algorithmic Trading Inflates Costs, written by reporter Leah McGrath Goodman. The thrust of the article is that global investment banks are underestimating the ongoing expense of maintaining and further enhancing these expensive algorithmic trading systems. The article quotes Daniel Clayden, a senior-level executive who moved from developing algorithmic trading systems to derivatives at two top banks: "Eventually, the investment banks will get pushed out of algorithmic trading," Mr. Clayden predicts, asking that his employers' names be withheld. "A lot of people thought, 'We'll build these machines, and then it will be done.' But they didn't realize the ongoing expensiveness of it, that it was going to get bigger and bigger as the market got more and more complex."
The creation of a quantitative algorithm which strives to complete a trade execution relative to some price benchmark is "relatively" easy (at least for those with technical engineering or mathematical educations). This price benchmark may be a constant, like a “closing price”, the “decision price” (price when trade decision is made) or the “arrival price” (price when order is initiated). Alternatively, it may be some calculated amount, like volume-weighted average price (“VWAP”) or time-weighted average price (“TWAP”). VWAP and TWAP algorithms are very popular as moving-average prices are intuitive and execution performance is easy to gauge. Arrival price execution algorithms, which incorporate more quantitative market risk measures, are growing in popularity because they are more reflective of the investment decision process.
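For concreteness, the two moving-average benchmarks can be sketched in a few lines. The fill prices and sizes below are invented for illustration (they are not from the article), and the TWAP shown assumes equally spaced fills:

```python
# Hypothetical trade tape for one security: (price, shares) per fill.
fills = [(100.10, 500), (100.25, 2000), (99.90, 1500), (100.05, 1000)]

def twap(fills):
    """Time-weighted average price: a simple mean of fill prices,
    assuming the fills are equally spaced in time."""
    return sum(price for price, _ in fills) / len(fills)

def vwap(fills):
    """Volume-weighted average price: each fill price weighted
    by the number of shares traded at that price."""
    notional = sum(price * shares for price, shares in fills)
    volume = sum(shares for _, shares in fills)
    return notional / volume

print(round(twap(fills), 4))   # average of the four prices
print(round(vwap(fills), 4))   # total notional / total volume
```

An execution desk would then compare its own average fill price for the day against the market's VWAP to gauge performance against that benchmark.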
Implementation shortfall quantitative algorithms are much more complex. In short, they weigh the risk of moving the security price against the urgency of filling an order. Consider needing to sell one million shares of some liquid technology stock like IBM (Symbol: IBM), Hewlett-Packard (Symbol: HPQ) or Advanced Micro Devices (Symbol: AMD). If the entire order is presented to the market makers at one time, the information that a "large" seller is in the market is likely to depress the stock price, and thus the transaction execution efficiency [as measured by Transaction Cost Analysis ("TCA")] is likely to be poor. Alternatively, one thousand individual trades of one thousand shares apiece could be slowly fed into the marketplace under a rule such as that sale volume shall not exceed 1% of all trading volume in that stock. If considerable time elapses as a result, the average trade execution price may deviate significantly from the decision price at which the liquidation process was initiated.
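The slicing rule described above can be sketched as follows. This is a deliberately naive version: the function name, the fixed per-interval volume estimate, and the 1% cap exposed as a parameter are all illustrative assumptions, since a real engine paces itself against realized rather than expected volume:

```python
def slice_order(total_shares, expected_interval_volume, max_participation=0.01):
    """Split a parent order into child orders, each capped at
    max_participation (e.g. 1%) of the expected market volume
    per trading interval. A simplified sketch only."""
    child_size = max(1, int(expected_interval_volume * max_participation))
    slices = []
    remaining = total_shares
    while remaining > 0:
        quantity = min(child_size, remaining)
        slices.append(quantity)
        remaining -= quantity
    return slices

# Selling 1,000,000 shares when ~100,000 shares trade per interval:
children = slice_order(1_000_000, 100_000)
```

With these illustrative numbers the parent order becomes one thousand child orders of one thousand shares each, which is exactly the schedule described in the paragraph above, and exactly the schedule whose elapsed time creates the opportunity-cost problem.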
Implementation shortfall algorithms use risk management techniques to balance liquidity impact against opportunity cost. Should, for instance, the simple rule of no more than 1% of market volume be "broken" in order to quickly feed several 10,000-share orders into the market? Quantitatively answering such a question usually results in a very helpful "it depends." For instance, how liquid is the security, and with what volatility does it trade? Answers to these key questions, which of course differ for each and every security at a particular point in time, will bias an implementation shortfall algorithm to one side or the other.
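The "it depends" answer can be caricatured with a stylized cost function, far simpler than what production algorithms actually use: temporary market impact falls as the order is spread over more intervals, while opportunity cost (price risk) grows roughly with the square root of elapsed time. All coefficients and the functional forms here are invented for illustration:

```python
import math

def total_cost(horizon, impact_coeff, volatility, risk_aversion):
    """Stylized trade-off: impact cost shrinks as the trading horizon
    lengthens, while risk-penalized opportunity cost grows with the
    square root of time. Purely illustrative functional forms."""
    impact = impact_coeff / horizon
    opportunity = risk_aversion * volatility * math.sqrt(horizon)
    return impact + opportunity

def best_horizon(impact_coeff, volatility, risk_aversion, max_horizon=500):
    """Scan candidate horizons (in intervals) for the cheapest schedule."""
    return min(range(1, max_horizon + 1),
               key=lambda t: total_cost(t, impact_coeff, volatility, risk_aversion))
```

Even this toy model reproduces the intuition in the paragraph above: for a more volatile stock, the optimal horizon shortens, biasing the algorithm toward breaking the participation cap and trading faster; for a calmer, liquid stock, patience wins.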
The really ornery and difficult problem comes when one expands these types of quantitative trading algorithms from an individual security to a basket of securities. Imagine, for instance, wanting to rebalance, with minimal slippage, an index portfolio of say $1 billion in market value from the Dow Jones Industrial Average benchmark to the S&P 500 benchmark. Some amount of Wal-Mart, Microsoft and General Electric will be in each portfolio. Which names, and in what order, should the relative sells and buys be made? What effect do the other smaller, but nonetheless important, buys and sells in less liquid securities have on transaction execution efficiency?
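A first step toward the basket problem is at least mechanical: net the overlap between the two index portfolios so that only the incremental buys and sells ever reach the market. The tickers and share counts below are purely illustrative, and sequencing the resulting trade list is exactly the hard part this sketch does not attempt:

```python
def rebalance_trades(current, target):
    """Net share differences between two portfolios, keyed by ticker.
    Positive = shares to buy, negative = shares to sell; names held
    in equal size in both portfolios generate no trade at all."""
    tickers = set(current) | set(target)
    return {t: target.get(t, 0) - current.get(t, 0)
            for t in tickers
            if target.get(t, 0) != current.get(t, 0)}

# Illustrative positions only (not actual index weights):
current = {"WMT": 12000, "MSFT": 30000, "GE": 25000, "MMM": 4000}
target  = {"WMT": 9000,  "MSFT": 33000, "GE": 25000, "XOM": 7000}

trades = rebalance_trades(current, target)
# GE nets to zero and drops out; WMT is a partial sell, XOM a new buy.
```

Note that the overlapping names (here GE) trade not at all, and the cross-held names (WMT, MSFT) trade only their increments; the open questions of ordering and of impact in the less liquid names remain.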
Perhaps one might now appreciate why the WSJ article includes the following:
Algorithmic trading systems have become a must-have fixture for Wall Street banks and financial institutions. Using advanced mathematical models and computer systems to execute transactions, trades are split to make them smaller and, depending on market conditions, timed to secure the best value and reduce big swings in markets.
But assembling and maintaining the computer systems that craft and execute trades in a matter of milliseconds has proved to be more of a cash burn on banks and brokerage houses than first anticipated. Upkeep can exceed $5 million a year in addition to start-up costs of as much as $10 million. This may mean a change in business strategy for many, as companies realize that while algorithmic trading may help them make money on the trading desk, the technology often fails to earn its keep as a client-side service.
Algorithmic-trading strategies have existed since the 1980s. But conditions in European markets have only recently become ripe enough to take advantage of them, helped by a shift in favor of electronic trading, increased liquidity and improved access to market data.
Customer demand for algorithmic trading has reached such a fever pitch that brokers can no longer afford to admit they don't offer it, said Frederic Ponzo, managing director of NET2S Group, a London capital-markets technology consulting firm that works with 18 of the world's top 20 investment banks, as well as brokers and funds.
"I've seen brokerages tell customers they're providing algo trading even when all they have is some guy with a calculator doing all the trades himself," he said. "Ultimately, this transition is going to be very Darwinian: Businesses will either keep pace and evolve, or they will die."
Though banks likely will opt to maintain specialized algorithmic-trading capabilities at their proprietary trading desks, where they trade with their own money, Mr. Clayden estimates trade-execution services for external customers "probably have a shelf life of about four to five years at best."
The interesting question to Toomre Capital Markets LLC is when banks will admit that the ongoing cost of cutting-edge algorithmic trading is just too expensive for them to further bear. When will they acknowledge that the cost of all of the mathematical types and computers does not satisfy the risk/return equation? When will they acknowledge that having the newest computational toy does not really add much to true Economic Value Added? Your comments and thoughts are welcome.