Systematic trading is increasingly quantitative, and the shift toward mathematically driven buying and selling of securities has put a premium on seamless technology to support the business.
“Trading groups are becoming much more quantitative and much more sensitive to the all-in-the-box approach,” said Joe Signorelli, managing director at Lime Brokerage, whose Strategy Studio software suite allows for end-to-end strategy development. “They need the ability to back-test data and have reliable live data, so they can really just do their quantitative work and go out to the market with their strategy.”
“The business has changed quite a bit from the 1990s and 2000s,” Signorelli told Markets Media. “It’s different because you have to really focus on the microstructure of the market and the quantitative approach to win in the business right now.”
Systematic trading can be defined most simply as trading that follows an organized and fixed set of rules; the idea is to remove any emotional element from trade decisions and minimize the risk of human error. A quantitative overlay entails computer models based on technical analysis of market data or fundamental economic indicators.
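To make the idea concrete, here is a minimal, hypothetical illustration of such a fixed rule set (not drawn from any firm mentioned in this article): a moving-average crossover, where the buy/sell decision is purely mechanical and no discretionary judgment enters.

```python
# Hypothetical example of a fixed, rule-based trading signal: a
# moving-average crossover. The rules are mechanical, so the same
# price history always produces the same decision.

def sma(prices, window):
    """Simple moving average over the trailing `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, fast=3, slow=5):
    """Return 'buy' when the fast average is above the slow one,
    'sell' when below, and 'hold' otherwise."""
    if len(prices) < slow:
        return "hold"  # not enough history yet
    fast_avg = sma(prices, fast)
    slow_avg = sma(prices, slow)
    if fast_avg > slow_avg:
        return "buy"
    if fast_avg < slow_avg:
        return "sell"
    return "hold"

# Rising prices: the fast average sits above the slow one.
print(crossover_signal([10, 11, 12, 13, 14]))  # → buy
```

Real systematic models are, of course, far richer, but the defining property is the same: every input maps to a decision through fixed rules, with no emotional override.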
Trading firms that deploy systematic quant strategies have substantial technology and infrastructure needs spanning the trade-life cycle. These needs include development and back-testing across asset classes and trading platforms, deployment of automated intraday trading strategies through any data center, optimization and scaling of strategies, and performance monitoring and risk management.
Once a firm’s trading moves into the realm of statistical models and algorithms, “you need data, and you need the tools and technology to allow you to do data analysis,” said Louis Lovas, director of solutions at OneMarketData.
Regarding the state of the art in systematic trading, Lovas highlighted three key aspects: (1) the capture, storage and normalization of historical data; (2) building and back-testing strategies; and (3) transaction cost analysis.
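The first of those aspects, normalization, can be sketched with a small hypothetical example (the venue formats and field names below are invented for illustration): two data sources report the same trade in different shapes, and a common schema makes them comparable for research.

```python
# Hypothetical sketch of tick-data normalization: two venues report
# trades in different formats; mapping both into one schema is a
# precondition for clean historical analysis.
from datetime import datetime, timezone

def normalize_venue_a(rec):
    # Venue A (invented format): epoch milliseconds, price as a string in dollars
    return {
        "symbol": rec["sym"],
        "ts": datetime.fromtimestamp(rec["epoch_ms"] / 1000, tz=timezone.utc),
        "price": float(rec["px"]),
        "size": int(rec["qty"]),
    }

def normalize_venue_b(rec):
    # Venue B (invented format): ISO-8601 timestamp, price in cents
    return {
        "symbol": rec["ticker"],
        "ts": datetime.fromisoformat(rec["time"]),
        "price": rec["price_cents"] / 100.0,
        "size": int(rec["shares"]),
    }

a = normalize_venue_a({"sym": "XYZ", "epoch_ms": 1700000000000,
                       "px": "101.25", "qty": 200})
b = normalize_venue_b({"ticker": "XYZ", "time": "2023-11-14T22:13:20+00:00",
                       "price_cents": 10125, "shares": 300})
# After normalization the two records agree on timestamp and price.
print(a["price"] == b["price"], a["ts"] == b["ts"])
```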
“You need to back-test individual strategies, but large-scale back-testing is equally important,” as that includes stress-testing and P&L measurement for trading, Lovas said. “Large-scale back-testing means you do P&L measurements not only across years of history, but across various incarnations or instances of those strategies, varying all kinds of parameters. You can eventually condense and summarize the results into what is making money and what isn’t making money.”
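The parameter-varying approach Lovas describes can be sketched in a few lines (a toy model, not an actual production back-tester): run one strategy across a grid of parameter values, record P&L per instance, then condense the results into what is making money.

```python
# Hypothetical sketch of large-scale back-testing: sweep a strategy's
# parameters, measure P&L for each instance, and summarize the winners.
from itertools import product

def backtest_pnl(prices, fast, slow):
    """Toy P&L: hold one unit long whenever the fast SMA is above the slow SMA."""
    pnl = 0.0
    for i in range(slow, len(prices)):
        fast_avg = sum(prices[i - fast:i]) / fast
        slow_avg = sum(prices[i - slow:i]) / slow
        if fast_avg > slow_avg:          # in the market for this bar
            pnl += prices[i] - prices[i - 1]
    return pnl

prices = [100, 101, 103, 102, 104, 107, 106, 109, 111, 110, 113]

# Vary the parameters across a grid -- one "instance" per combination.
results = {
    (fast, slow): backtest_pnl(prices, fast, slow)
    for fast, slow in product([2, 3], [4, 5])
    if fast < slow
}

# Condense the sweep into what is making money and what isn't.
winners = {k: v for k, v in results.items() if v > 0}
print(sorted(winners, key=winners.get, reverse=True))  # best parameters first
```

In practice the sweep runs over years of history and many more dimensions, but the structure, instances of one strategy differing only in parameters, each scored by P&L, is the same.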
Practitioners of systematic quant trading can opt for an end-to-end, pre-trade through post-trade technology system, or choose a la carte offerings. “Probably the biggest benefit of end-to-end is that there is that single point of contact back to the vendor when you need help or enhancements, or when things go wrong,” Lovas said. “With the a la carte model, you essentially assume more of the responsibility of individual components, which may or may not work together. The effort and risk are on you to make them work together.”
Lovas noted that an advantage of the a la carte model is the ability to choose ‘best-of-breed’ single-product offerings.
Top-shelf end-to-end trading technology does the plumbing work of merging order-book data from multiple sources and providing highly accurate trading simulation, according to David Don, managing director at Lime.
“For a long time, vendor solutions weren’t quite there in terms of speed and efficiency,” Don said. “But with solutions like Lime Strategy Studio that are single-digit microsecond latency and have a lot of depth in handling market data and simulation, that’s changing. Clients are more able to offload a lot of the proprietary work they do in terms of plumbing infrastructure, and focus exclusively on the logic of their strategy and how to more effectively leverage their development resources.”
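The book-merging “plumbing” Don refers to can be illustrated with a stripped-down sketch (venue names and quotes are invented): combining top-of-book quotes from several sources into a single consolidated best bid and offer.

```python
# Hypothetical sketch of merging order-book data from multiple venues
# into a consolidated best bid/offer (the kind of plumbing a vendor
# platform handles so the client can focus on strategy logic).

def consolidated_bbo(books):
    """books maps venue name -> (best_bid, best_ask).
    Returns the consolidated bid/ask and the venues quoting them."""
    bid_venue, (best_bid, _) = max(books.items(), key=lambda kv: kv[1][0])
    ask_venue, (_, best_ask) = min(books.items(), key=lambda kv: kv[1][1])
    return {
        "bid": best_bid, "bid_venue": bid_venue,
        "ask": best_ask, "ask_venue": ask_venue,
    }

books = {
    "VENUE_A": (99.98, 100.02),
    "VENUE_B": (99.99, 100.03),
    "VENUE_C": (99.97, 100.01),
}
bbo = consolidated_bbo(books)
print(bbo)  # highest bid comes from VENUE_B, lowest ask from VENUE_C
```

A production feed handler must also handle full depth, sequence gaps, and venue-specific quirks at microsecond latencies, which is precisely the work firms are offloading to vendors.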