Can the Market's Systems Keep Up With Electronic Trading?
The Wall Street & Technology blog asks Can the Market's Systems Keep Up With Electronic Trading? That is exactly the same question that Toomre Capital Markets LLC ("TCM") has been asking for several months now, ahead of the implementation of Regulation NMS on March 5th 2007. And the answer on both February 27th and 28th 2007 at the New York Stock Exchange was a resounding NO!!!
Many people do not fully appreciate just how the shifts in United States equity market structure are straining the technology behind market-making systems. While the average size of a completed transaction on the NYSE has shrunk from roughly 3,000 shares in 2000 to perhaps under 400 shares today, the total volume of shares traded has continued to grow.
This has meant that the actual number of transactions has expanded greatly. TCM has not gone back to look up the actual statistics, but recalls that there were something close to ten times as many transactions in NYSE-traded stocks in 2005 as there were in 2000.
This great expansion in transaction count (as opposed to number of shares traded) has in large part been a result of algorithmic trading programs taking an institutional order for say 10,000 shares and breaking it down into many smaller orders of say 200 shares here, another 300 shares there, and so on. The quantitative programs track the execution instructions to ensure that the desired number of shares is actually transacted (or the order is pulled by human intervention). The result for this example 10,000 share sell order can be as many as one hundred individual transactions.
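Order slicing of this kind can be sketched in a few lines. The function name, child-order sizes, and randomized schedule below are illustrative assumptions, not any particular firm's algorithm:

```python
import random

def slice_order(total_shares, child_sizes=(200, 300)):
    """Break a large parent order into small child orders.

    A toy sketch only; real algorithms use VWAP/TWAP schedules,
    participation rates, and live market feedback.
    """
    children = []
    remaining = total_shares
    while remaining > 0:
        # Never slice more than what remains of the parent order.
        size = min(remaining, random.choice(child_sizes))
        children.append(size)
        remaining -= size
    return children

children = slice_order(10_000)
assert sum(children) == 10_000  # every share of the parent is accounted for
print(f"{len(children)} child orders, first few: {children[:4]}")
```

Even this naive slicer turns one parent order into dozens of child executions, each of which must be routed, matched, and reported separately.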
Each of those transactions involves a number of execution steps to actually complete a trade with a counter-party. The number of steps varies by whom one references, but at a minimum the trade process involves checking whether the indicated bid or offer is still there in sufficient size and quantity, confirmation that the trade and quantity are indeed desired, the actual matching of buyer and seller, the reporting of the trade back to each party to the transaction, and generation of the trade confirmation. Each of these steps involves the processing of an electronic message (or as some might say an "event").
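The minimal steps above can be enumerated to see how quickly messages multiply. The step names and the one-message-per-step accounting are simplifying assumptions:

```python
from enum import Enum, auto

class Step(Enum):
    QUOTE_CHECK = auto()   # is the indicated bid/offer still there in size?
    CONFIRM = auto()       # confirm that the trade and quantity are desired
    MATCH = auto()         # match buyer with seller
    REPORT = auto()        # report the trade back to each party
    CONFIRMATION = auto()  # generate the trade confirmation

# At one electronic message ("event") per step, the one hundred child
# executions of a single sliced 10,000-share order already generate:
print(100 * len(Step))  # → 500
```

Five hundred messages for one institutional order, and real venues send more than one message per step.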
On Tuesday February 27th 2007, the Dow Jones computers calculating the Dow Jones Industrial Average backed up as the message queue of completed trade information apparently filled faster than the front of the queue could be processed. The sharp 200 point drop in the DJIA that occurred around 3 PM Eastern Time apparently happened when a backup system recalculated the DJIA from the most recently completed prices all at once, rather than updating the average as each individual component changed with sequential trade information.
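A toy queue simulation illustrates the mechanism. This is not a model of the actual Dow Jones systems; the arrival and processing rates are made up to show how a calculator that drains its queue too slowly ends up publishing stale values:

```python
from collections import deque

queue = deque()
published = None
price = 100.0

ARRIVALS_PER_SLICE = 5    # trade messages arriving each time slice
PROCESSED_PER_SLICE = 2   # messages the calculator can drain per slice

for _ in range(10):
    for _ in range(ARRIVALS_PER_SLICE):
        price -= 0.5              # market falling steadily
        queue.append(price)
    for _ in range(PROCESSED_PER_SLICE):
        published = queue.popleft()

# The published value lags the market while the backlog keeps growing.
print(f"latest trade {price:.1f}, published {published:.1f}, "
      f"backlog {len(queue)} messages")
# → latest trade 75.0, published 90.0, backlog 30 messages
```

When a backup system then jumps straight to the latest prices, the published average appears to plunge all at once, much as the DJIA did.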
The NYSE trade processing systems had similar message queue problems. The number of completed transactions simply could not be processed fast enough. Apparently as a result, some trade instructions were rerouted to other trading venues, including NYSE Arca, and there were reports of broker/dealers receiving back wrong or incomplete trade reporting and confirmation information.
Many are puzzled by how this could happen, especially as the securities industry has spent two years and millions of dollars preparing for the great Regulation NMS environment. The key problem is that message data rates have exploded. Take, for example, the messages associated with equity options trading. In March 2000, the Option Price Reporting Authority was processing 3,000 messages a second. In 2006, it was up to about 150,000 per second and is projected to expand to 450,000 per second in 2007 and more than 700,000 per second in 2008.
Part of this expansion stems from the various algorithmic trading systems posting bids and offers in one-cent increments. Of course, not all price quotes are executed against; the electronically submitted bids and offers are frequently canceled or adjusted. Even so, one can gain some glimmer of why the price quote message queues have swelled so.
The problem that arises in many trade processing systems at such high data rates is the time it takes to physically write the trade information out to disk. Many legacy systems are designed to first write each new message to some type of database such as VSAM or DB2 so that it is recorded for posterity. The systems then read the information back from disk, along with other linked data, to do something with the message, such as generating a trade report that gets sent to the buyer and the seller. Moving the physical read/write head of a disk drive takes time. Often this movement is a fraction of a second, but it is considerably slower than the speed of messages traveling down a fiber optic communication cable.
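The cost gap is easy to demonstrate. The micro-benchmark below is a sketch, not a measurement of any real trading system: the message format is arbitrary and absolute timings vary by machine, but the disk path will always lose:

```python
import os
import tempfile
import time

N = 500
messages = [f"trade,{i},IBM,{100 + i % 5},300\n".encode() for i in range(N)]

# Legacy pattern: force each message to disk before doing anything with it.
t0 = time.perf_counter()
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
    for msg in messages:
        f.write(msg)
        f.flush()
        os.fsync(f.fileno())  # wait for the physical write, as a durable store must
disk_time = time.perf_counter() - t0
os.unlink(path)

# Streaming pattern: act on each message in memory, never touching disk.
t0 = time.perf_counter()
matched = sum(1 for msg in messages if b"IBM" in msg)
mem_time = time.perf_counter() - t0

print(f"disk path: {disk_time:.4f}s, in-memory path: {mem_time:.4f}s")
```

Each `fsync` waits on physical storage, so the write-first design imposes a floor on per-message latency that no amount of downstream tuning can remove.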
This physical movement problem is the reason that many leading financial organizations are turning to a concept called Complex Event Processing ("CEP"). The leading firm in this particular field is another client of Toomre Capital Markets LLC, Streambase Systems of Lexington, Massachusetts. TCM highly recommends Streambase for today's complex world of electronic trading and the even more complex one to come.
Streambase Systems has built upon the groundbreaking work of M.I.T. computer science professor Michael Stonebraker (Streambase's founder and Chief Technology Officer) to deliver a remarkable new approach to processing and analyzing real-time streaming data. Rather than first writing the data to disk, the Streambase processing engine allows applications, in essence, to pick key pieces of information out of the streaming message queue and act on that event information immediately.
Using a very flexible extension to the standard SQL language called StreamSQL, organizations are able to build structured queries that calculate on-the-fly values such as the VWAP (Volume Weighted Average Price) of all transactions completed in a particular stock since the initial 10,000 share order was received. There is no first writing of the message to disk. Hence, organizations can gain a significant competitive advantage in processing streaming data such as market price data and transaction reporting.
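StreamSQL itself is proprietary, but the underlying idea, updating an aggregate per event with nothing written to disk, can be sketched in plain Python (the class and method names here are our own, not Streambase's API):

```python
class StreamingVWAP:
    """Running volume-weighted average price, updated per trade event.

    A plain-Python sketch of what a StreamSQL aggregate computes on
    the fly; no message is written to disk first.
    """

    def __init__(self):
        self.notional = 0.0  # running sum of price * shares
        self.volume = 0      # running sum of shares

    def on_trade(self, price, shares):
        self.notional += price * shares
        self.volume += shares
        return self.notional / self.volume  # VWAP as of this event

vwap = StreamingVWAP()
for price, shares in [(50.10, 300), (50.05, 200), (50.00, 500)]:
    current = vwap.on_trade(price, shares)
print(f"VWAP after 1,000 shares: {current:.4f}")  # → 50.0400
```

Each incoming trade updates two running sums and yields the current VWAP in constant time; the disk never enters the critical path.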
Is it any wonder that In-Q-Tel, the private venture capital firm founded and funded by the C.I.A., recently completed an investment in Streambase Systems? Perhaps the intelligence community might be using CEP to make sense of all of the streaming messages and information flowing around, say, the Internet? Perhaps?
Toomre Capital Markets suggests interested readers check out Streambase Systems. This news event highlights how many custom-coded electronic trading systems are not equipped to process the increasing quantities of real-time data crossing the wire. Will your financial firm be ready for the next big day of extreme market movement? Surely that day is coming… The only question is how soon. Reader thoughts and comments are welcome.