Algorithmic trading, also sometimes referred to as black-box trading, automated trading or algo trading, has developed from its beginnings as the computerization of order flow in the 1970s to a potentially crucial tool for brokers and traders. Theoretically algorithmic trading platforms can create returns at a speed and frequency that human traders can't match. High-frequency trading (HFT) has become the most pervasive use of algorithmic trading platforms, where complex analysis of multiple markets enables a machine to execute orders based on pre-configured conditions.
In 2014, more than 75% of shares traded on US-based stock exchanges originated from automated systems. According to survey data from The Trade, the main motivation for algo usage among the buy side is consistency of execution performance. Long-only firms engage with between two and three algo providers on average, rising to around four for those with more than $50bn in assets under management, and 57% of respondents to the survey used either a single system or five.
While algorithmic trading brings tangible benefits, such as precise entry, exit and money management rules and the removal of emotion from the trading process, the use of these platforms does not come without downsides. Deliberations from the US Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC) laid much of the blame for the 2010 Flash Crash at the door of automated trading systems.
Advantages of algorithmic trading
Quantitative finance models work from the assumption that market prices and returns evolve over time, often at random. Most quantitative models treat the returns of a given security as driven by one or more randomised risk factors. A diversified portfolio might span exchange rates, short-term interest rates and stock returns, on the assumption that such a portfolio is less sensitive to any single market movement and has a greater chance of a positive return.
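The idea can be sketched as a simple linear factor model: each security's return is a weighted sum of shared risk factors, and a portfolio averages across securities with different exposures. The factor values, exposures and weights below are illustrative numbers, not data from any real market.

```python
def factor_model_return(exposures, factor_returns, idiosyncratic=0.0):
    """Return of one security as a weighted sum of randomised risk
    factors plus an idiosyncratic term (a standard linear factor model)."""
    return sum(b * f for b, f in zip(exposures, factor_returns)) + idiosyncratic

def portfolio_return(weights, security_returns):
    """Weighted average return across the instruments in the portfolio."""
    return sum(w * r for w, r in zip(weights, security_returns))

# Three hypothetical factors: an FX move, a short-rate move, an equity index return.
factors = [0.002, -0.001, 0.015]

# Two securities with different factor exposures; splitting weight between
# them dampens the portfolio's sensitivity to any single factor.
stock = factor_model_return([0.1, -0.5, 1.2], factors)
bond = factor_model_return([0.3, 2.0, 0.1], factors)
diversified = portfolio_return([0.5, 0.5], [stock, bond])
```

Because the stock is dominated by the equity factor and the bond by the rates factor, the combined position moves less than either leg when one factor swings.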
Algorithmic and automated trading allows a trader to spread themselves across numerous accounts, strategies or markets at once, faster and on a greater scale than could be done manually. This diversification spreads risk across a number of instruments and hedges against losing positions. Moving across different channels simultaneously is a near-impossible task for a human, while a computer can accomplish it in a fraction of a second.
Human traders, however experienced, can lose their discipline when facing losses or chasing fractional increases in profit. This comes to the fore in volatile markets, where a trader will attempt to second-guess the market to execute trades at the best possible moment, or sell when they believe a large loss is imminent. An automated system prevents these knee-jerk responses by following its programmed plan.
Fat-finger and pilot errors are also largely eliminated through the use of an automated system. A panicked trader may accidentally enter a sell order for 1,000 shares instead of 100. An algorithm pre-programmed to sell set amounts at set market indicators would not make that mistake.
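A minimal sketch of the kind of pre-trade check that catches such errors might look like the following. The size caps here are illustrative assumptions, not figures from any exchange or regulator.

```python
def validate_order(quantity, max_order_size=500, avg_daily_volume=100_000):
    """Reject orders whose size is implausible -- the kind of pre-trade
    sanity check that stops a panicked '1,000 instead of 100' entry.
    The thresholds are illustrative, not exchange-mandated."""
    if quantity <= 0:
        return False, "quantity must be positive"
    if quantity > max_order_size:
        return False, f"order of {quantity} exceeds per-order cap of {max_order_size}"
    if quantity > 0.01 * avg_daily_volume:
        return False, "order is over 1% of average daily volume"
    return True, "ok"

ok, _ = validate_order(100)         # the intended order passes
bad, reason = validate_order(1000)  # the fat-fingered order is rejected
```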
Fast order entry
A computerised platform can react quickly to market changes and generate orders as soon as certain criteria have been met. Being able to get into or out of a trade a fraction of a second before competitors can greatly affect the outcome for the broker. There is no risk with an algorithmic system that a trader will miss a chance due to being busy on a call, out at lunch or caught short chatting to colleagues. Algorithmic trading systems can process an order in 10 milliseconds or less. For comparison, it takes the human eye around 300 milliseconds to blink.
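The mechanism behind this speed is an event-driven check that runs on every market update and emits an order the instant a pre-configured condition is met. The price thresholds below are hypothetical examples:

```python
def on_tick(price, position, entry_below=99.50, exit_above=101.00):
    """Check pre-configured conditions on each market tick and emit an
    order signal the instant one is met. Thresholds are illustrative."""
    if position == 0 and price <= entry_below:
        return "BUY"
    if position > 0 and price >= exit_above:
        return "SELL"
    return None  # no criteria met on this tick

signal = on_tick(99.40, position=0)  # price has crossed the entry level
```

A check like this is a few comparisons, which is why the round trip from market event to order can sit in the millisecond range or below.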
Back testing strategies
Computer models must be trained on sets of data, with rules that leave no room for interpretation. An algorithm can't go with its gut or switch strategies on the fly. Whereas a human trader's new trade plan might have to be tested in real-world conditions, an algorithm can be run against historical data before going live. This allows traders to fine-tune their system and estimate the success rate their platform is likely to achieve in live trading.
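A back test in its simplest form replays a rule over historical prices and measures how often it would have won. The moving-average crossover below is a stand-in strategy chosen for brevity, and the price series is invented:

```python
def sma(series, n, i):
    """Simple moving average of the n values ending at index i."""
    return sum(series[i - n + 1:i + 1]) / n

def backtest(prices, fast=3, slow=5):
    """Replay a moving-average crossover over historical prices and
    report its hit rate before any live deployment. The crossover rule
    is a stand-in for whatever strategy is being evaluated."""
    wins = trades = 0
    entry = None
    for i in range(slow - 1, len(prices) - 1):
        if entry is None and sma(prices, fast, i) > sma(prices, slow, i):
            entry = prices[i + 1]            # enter at the next bar's price
        elif entry is not None and sma(prices, fast, i) < sma(prices, slow, i):
            trades += 1
            wins += prices[i + 1] > entry    # exit at the next bar's price
            entry = None
    return wins / trades if trades else 0.0

rate = backtest([1, 2, 3, 4, 5, 6, 3, 2, 1])  # hypothetical price history
```

The same harness can be re-run with different parameters, which is exactly the tailoring the text describes.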
Disadvantages of algorithmic trading
Due to the rapid-fire nature of trades occurring through automated systems, market shocks can be transmitted across markets at a much faster rate. During the Flash Crash of May 2010, major US equity indices plunged around 5% and rebounded in a matter of minutes, with the Dow Jones falling nearly 1,000 points intraday. Some 20,000 trades in 300 securities were reported to have executed at prices more than 60% away from their values just moments before.
While a CFTC report from 2014 concluded that high-frequency traders and automated systems did not directly cause the Flash Crash, it found that they contributed to it by demanding immediacy ahead of other market participants and by reacting instantly to the market shock, where a human trader may have paused. Conversely, some have argued that high-frequency trading actually helped to mitigate the crash's impact.
While back testing can be used to assess an algorithmic trading platform's abilities prior to live trading, there remains a risk that the program becomes over-fitted to particular trends. A platform trained on historical data may produce exceptional results against the trends in that data, then struggle to repeat them in a live market that differs from the data sets it was trained on. Traders can be drawn into creating what they believe is a foolproof trading plan, one that generates returns only under very specific market conditions that may never recur, and conclude their system has failed when those returns are not replicated in the real world.
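Over-fitting can be shown in miniature with a deliberately extreme strategy that simply memorises its training data. The price series below are invented; the point is only the gap between in-sample and out-of-sample performance:

```python
def fit_lookup_strategy(history):
    """'Train' by memorising, for each price seen, whether the next move
    was up -- an extreme case of fitting a model to one historical data set."""
    return {today: tomorrow > today
            for today, tomorrow in zip(history, history[1:])}

def hit_rate(rules, data):
    """Fraction of next-move predictions the memorised rules get right."""
    hits = total = 0
    for today, tomorrow in zip(data, data[1:]):
        if today in rules:                # only act on patterns seen in training
            total += 1
            hits += rules[today] == (tomorrow > today)
    return hits / total if total else 0.0

train = [100, 101, 99, 102, 98, 103]     # hypothetical training history
live = [100, 99, 101, 98, 97, 103]       # hypothetical live market

in_sample = hit_rate(fit_lookup_strategy(train), train)   # perfect on its own history
out_of_sample = hit_rate(fit_lookup_strategy(train), live)  # far worse on new data
```

A real strategy over-fits more subtly, but the symptom is the same: stellar back-test results that collapse on data the model has never seen, which is why traders hold out unseen data for validation.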
An algorithmic trading platform depends on its hardware remaining operational while trades execute. Dedicated computers, servers and connections are needed to ensure the system runs correctly. While automated trading systems may seem to be “fire and forget” platforms that a trader can set up and leave to run without supervision, in reality things are more complicated. Some platforms do not operate entirely via internet connections and data servers, and instead store pending trade orders on the client-side computer. In the event of a lost connection, those orders may never reach the market.
Intermittent power outages will likewise leave trades unexecuted. Crashes and glitches on exchange-side servers can also adversely affect the platforms. The 2012 IPO of social media giant Facebook was hit by one such glitch, resulting in a halt to all electronic trading at Nasdaq, while Wall Street trading firm Knight Capital suffered a software glitch in its proprietary trading systems that caused its algorithms to trade erratically, losing the firm $440 million in a single session.
Due to the risk of errors, glitches and power losses, automated trading systems require monitoring. Nasdaq recommends that traders set up monitoring and surveillance teams trained to use both visual and audible alerts. It also suggests establishing a committee, with representatives from the trading, client coverage, compliance, risk and credit teams, charged with reviewing practices and control levels on a regular basis. Implementing such a strategy may be beyond smaller trading firms looking to automate without taking on these risks.
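One building block of such monitoring is a heartbeat check: each component reports in periodically, and anything that goes quiet triggers an alert. The component names and the five-second timeout below are illustrative assumptions:

```python
def stale_components(last_heartbeat, now, timeout_s=5.0):
    """Return every component whose last heartbeat is older than the
    timeout -- the kind of automated check a monitoring team would wire
    to a visual or audible alert. Names and timeout are illustrative."""
    return [name for name, ts in last_heartbeat.items() if now - ts > timeout_s]

# Timestamps in seconds; the order gateway last reported 7 seconds ago.
heartbeats = {"market_data": 99.0, "order_gateway": 93.0, "strategy_engine": 98.5}
alerts = stale_components(heartbeats, now=100.0)
```

In production the timestamps would come from a monotonic clock and the alert list would feed a dashboard or pager, but the detection logic is this simple comparison.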