Why Big Data changed dealing for good.
Over the last twenty years, automation has driven a genuine evolution in the structure of the modern trading floor. Some commentators have expressed concern that this ‘changing of the guard’ – in which voice-based traders have been largely displaced by legions of quantitative analysts building sophisticated models to manage risk – leaves liquidity providers exposed. Consequently, questions have been asked as to whether, when monetary policy does start to normalise and sustained volatility rebounds with it, the rocket scientists will find themselves out of their depth, unable to outsmart counterparties and at risk of precipitating market-wide, systemic failure. Artur Deliergiev, Algorithmic Quantitative Trading Manager at CMC Markets, looks at what’s changed – and why he believes it’s all very much for the greater good.
Q: So the trading floor has changed – and irreversibly?
A: Yes, that’s absolutely the case. The idea of a dealer barking into multiple phones – or simply directly at his colleagues – whilst staring at an array of screens has, aside from one or two exceptions, now been consigned to history. Looking back, the inefficiency was startling and, as with any human-based process, the risk of error was significant, too. It’s worth pointing out, however, that this didn’t happen overnight; it’s the culmination of a shift that has been under way for decades. But automation – along with advances in technology – means we are now capable of accurately handling significantly higher transaction volumes than would ever have been possible in a voice-based world.
Q: How is your team constructed to ensure you take advantage of all the data available?
A: The key point here is that historically the trading desk was supported by the analysts. Traders called the shots, and the analysts’ job was to interrogate the available data as best they could. That situation has changed fundamentally: we now have Quant traders – traders who can code – supported in turn by a larger team of Quant researchers. Because they share a common language, the two sides understand one another, work together incredibly well and can resolve problems far faster than used to be the case.
We’re constantly working with three inputs: the significant historical data bank we have at our disposal, live pricing – we ingest more than two billion ticks per day – and the outstanding risk book. We use this information to continuously refine the risk management strategies we’re deploying, producing the best outcome for all stakeholders. Achieving that goal still requires a degree of traditional relationship management, but that now comes at the end of the process rather than at the beginning.
Q: What happens when monetary policy normalises, and the market jumps back to life?
A: There are definitely some out there who are of the belief that the intricate models designed by the quants will struggle when volatility does accelerate, but there are two key reasons why we can confidently say these ideas are misguided.
Firstly, the underlying technology is so advanced that it allows us to have pricing relationships with many different parties, far more than could ever be maintained in real time by individual dealers. We’ve spent years fine-tuning these models to ensure they can adapt when market conditions change.
Secondly, the automation aspect allows us not only to look across the underlying cash market but also to synthesise prices from futures and options in real time, and then decide what sort of liquidity we are willing to make available. This has already been tested by the threatened dislocation in the gold market in March 2020, when a number of the legacy liquidity providers were unable to make a price. A small pool of tech-driven non-bank liquidity providers (NBLPs) – including ourselves – did, however, have access to price sources that reflected the physical market, allowing pricing to be maintained.
Comparing that threatened dislocation to the equivalent performance during the removal of the EUR/CHF peg back in 2015 again illustrates how far this market has advanced in recent years.
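To illustrate the principle of synthesising a cash price from a related futures quote, here is a minimal sketch. It assumes a simple cost-of-carry relationship (F = S·e^(rT)); the function names, instruments and numbers are purely illustrative, not CMC Markets’ production logic.

```python
import math

def implied_spot_from_future(future_price: float, rate: float,
                             time_to_expiry: float) -> float:
    """Back out an implied spot price from a futures quote using a
    simple cost-of-carry model: F = S * exp(r * T)  =>  S = F * exp(-r * T)."""
    return future_price * math.exp(-rate * time_to_expiry)

def synthetic_mid(quotes: list[float]) -> float:
    """Blend several implied prices into one synthetic mid.
    A real system would weight sources by quality and staleness."""
    return sum(quotes) / len(quotes)

# Illustrative numbers: a gold future at 1975.0, 3 months to expiry, 2% carry,
# blended with two other (hypothetical) implied prices.
spot_estimate = implied_spot_from_future(1975.0, 0.02, 0.25)
mid = synthetic_mid([spot_estimate, 1965.2, 1965.8])
```

When one venue stops quoting, a price derived this way from a still-liquid related market can keep a usable mid alive – which is the essence of the March 2020 example above.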
Q: Given this exceptionally long period of low volatility, how can you be confident that the models you’ve devised will cope?
A: We have a significant degree of confidence across the industry that the work undertaken to date will be able to accommodate erratic market movements. This is all about deciding how to manage the risk on our book and, ultimately, automation doesn’t care if a move is 25 pips or 250 pips. There’s no risk of the technology being startled; it will simply assess the situation and respond accordingly within the pre-assigned parameters. It’s also important to remember that we have spent years fine-tuning these models – at CMC Markets we’ve been building tick-by-tick data sets across every instrument for the last decade – so by constantly monitoring our pricing performance and using it to add to our knowledge base, we truly have a robust system.
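The point that an automated response applies mechanically within pre-assigned parameters, regardless of whether the move is 25 pips or 250, can be sketched as a toy hedging rule. Everything here – the function name, the fixed-fraction rule, the limits – is a hypothetical illustration, not CMC Markets’ actual risk logic.

```python
def hedge_decision(net_exposure: float, max_exposure: float,
                   hedge_fraction: float = 0.5) -> float:
    """Return the quantity to hedge. The rule is identical for small and
    large moves: if absolute exposure breaches the pre-assigned limit,
    hedge a fixed fraction of the excess; otherwise do nothing."""
    excess = abs(net_exposure) - max_exposure
    if excess <= 0:
        return 0.0  # inside the limit: no action, no panic
    # Hedge in the opposite direction to the exposure.
    sign = 1.0 if net_exposure > 0 else -1.0
    return -sign * hedge_fraction * excess
```

The rule never “startles”: a tenfold larger move simply produces a proportionally larger, pre-parameterised response.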
Q: To what extent does having retail order flow help or hinder your ambition here?
A: I think this is absolutely critical and gives us a significant edge. For too long there has been a view that retail order flow is insignificant and, as such, that those working it are unnecessarily distracted. This couldn’t be further from the truth. Not only does internalisation of flow at some of the legacy liquidity providers reduce market depth, but retail order flow gives us additional liquidity at price points that the institutional market finds very attractive. It also means we’re interacting with more market participants, which helps with price construction – especially during abnormal market events – both at top of book and further down the ladder.
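The depth argument can be made concrete with a toy order-book ladder: merging retail orders with institutional ones thickens the book at intermediate price points. All names and numbers below are illustrative, not real market data.

```python
from collections import defaultdict

def build_ladder(orders: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Aggregate (price, size) orders into a depth ladder,
    summing size at each price level."""
    depth = defaultdict(float)
    for price, size in orders:
        depth[price] += size
    return sorted(depth.items())

# Hypothetical bid-side flow: retail orders sit both at institutional
# levels (adding size) and between them (adding new levels).
institutional = [(99.90, 500.0), (99.80, 800.0)]
retail = [(99.95, 40.0), (99.90, 25.0), (99.85, 60.0)]
ladder = build_ladder(institutional + retail)
```

Against institutional flow alone, the merged book has two extra price levels and more size at 99.90 – the “further down the ladder” effect described above.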
Q: And finally, what advice would you have for any fund, manager or brokerage looking to build out its quantitative analysis function?
A: I think the key point to remember here is that it takes a significant amount of resource to do this successfully on your own. To start, you’re going to need the base data to work with, and especially if that’s coming from an exchange, it’s going to have a price tag attached. You then need the capacity to interpret that data, build the appropriate models and understand how to deploy them within your risk management or trading strategies. To that end, we assist a number of institutional customers with these services, and our fundamentally different approach means that counterparties who work with us invariably benefit not just from our huge data volumes but also from our specialist knowledge. Combine this with our consultative approach – rather than working on the basis that one solution fits everyone – and the overall proposition is a genuinely powerful one.