Tom Kennedy, Global Head of Analytics, discusses the emergence of the Quants.
Long before we heard the phrase data science, there was a group of individuals in the financial community, known as Quants, who were at the forefront of creating new modeling techniques. The popularization of the term data science has opened up new insights and a thirst for knowledge in machine learning and artificial intelligence. This has led to the establishment of new market offerings and the opening up of new service models such as AI-as-a-Service.
Recommendation engines, spam filters and classification systems are examples of the main consumer data products we interact with on a daily basis. In the financial services industry these tend to be more specialized data products. However, they use similar techniques such as regression modeling, Bayesian classifiers, cluster analysis and neural networks.
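To make one of those techniques concrete, here is a minimal sketch of a Bayesian classifier acting as a toy spam filter. The messages and labels are invented purely for illustration and are not drawn from any real product.

```python
# A toy naive Bayes spam filter: one of the techniques mentioned above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Invented training messages and labels, for illustration only.
messages = [
    "win a free prize now",                # spam
    "limited offer, claim your cash",      # spam
    "meeting moved to 3pm",                # ham
    "please review the attached report",   # ham
]
labels = ["spam", "spam", "ham", "ham"]

# Convert raw text into word-count features, then fit a naive Bayes model.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)
model = MultinomialNB()
model.fit(X, labels)

# Classify a new, unseen message.
print(model.predict(vectorizer.transform(["claim your free cash prize"])))
```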
What if you could measure the state of market psychology in real time? You could see patterns as they emerge, not in hindsight.
MarketPsych Indices is an example of data science in practice in financial markets: it consumes high-volume sources of professional news and social media and converts this variety into manageable information flows that drive sharper decisions.
The indices are delivered as real-time data series that can easily be incorporated into analysis and decision processes – quantitative or qualitative. Beyond the challenges of dealing with the data there is the domain understanding; in the above example, it is applying modern psychology to the global financial industry.
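As a rough illustration of what "easily incorporated" can mean, the sketch below folds a synthetic sentiment series into a simple regime flag. The smoothing windows and the crossover rule are assumptions made for the example, not a description of how the MarketPsych Indices are actually built or used.

```python
# A minimal sketch: turning a real-time sentiment series into a simple signal.
import numpy as np
import pandas as pd

# Synthetic minute-level sentiment scores standing in for a real feed.
rng = np.random.default_rng(0)
idx = pd.date_range("2023-01-02 09:30", periods=390, freq="1min")
sentiment = pd.Series(rng.normal(0, 1, len(idx)).cumsum(), index=idx, name="sentiment")

# Smooth the raw series and flag periods where short-term sentiment
# runs above its longer-term average.
fast = sentiment.rolling("30min").mean()
slow = sentiment.rolling("2h").mean()
bullish = fast > slow

print(pd.DataFrame({"sentiment": sentiment, "fast": fast,
                    "slow": slow, "bullish": bullish}).tail())
```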
For Quants to be efficient and effective they have to concentrate on their core strengths in analyzing signals. The typical profile of a Quant is one with good groundings in computer science, statistics, big data technologies and financial market knowledge, to name a few skills. Financial engineering has become a multi-disciplined profession, and it is becoming a balancing act to find the right talent.
Working smarter not harder
One of the most complex tasks a Quant has is exploratory data analysis. Before even starting to model the data there are the laborious tasks of reading data models and understanding the content. Content specialists play an important role in the efficient utilization of data. Sometimes people forget that content vendors provide support on their data.
Data wrangling and data munging are pretty conspicuous-sounding terms. A simple explanation is that this is all the non-value-added work done to collect, scrape, normalize and cleanse datasets. Having Quants mining for sources of data is not the best use of their time.
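Here is a minimal sketch of that kind of wrangling work on a deliberately messy, made-up dataset: normalizing column names, parsing timestamps, removing duplicates and handling gaps. The fields and values are hypothetical.

```python
# A small, deliberately messy dataset standing in for a vendor file.
import pandas as pd

raw = pd.DataFrame({
    "Timestamp": ["2023-01-02 09:30:00", "2023-01-02 09:30:00",
                  "2023-01-02 09:31:00", "2023-01-02 09:32:00"],
    "Symbol": ["ABC", "ABC", "ABC", "ABC"],
    "Price": [100.1, 100.1, None, 100.4],
    "Currency": ["USD", "USD", None, "USD"],
})

# Normalize column names and parse timestamps into a single timezone.
raw.columns = [c.strip().lower() for c in raw.columns]
raw["timestamp"] = pd.to_datetime(raw["timestamp"], utc=True)

clean = (
    raw.drop_duplicates(subset=["timestamp", "symbol"])  # remove repeated records
       .sort_values("timestamp")
       .set_index("timestamp")
)

# Forward-fill gaps in reference data, then drop rows still missing a price.
clean["currency"] = clean["currency"].ffill()
clean = clean.dropna(subset=["price"])

print(clean)
```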
Seeking alpha
Quants who trade intraday are consumed with timestamps. I never knew the importance of time until I entered the profession. An eye blink can be an indefinite period of time. Having accurately timestamped datasets is of huge importance. Backtesting strategies requires careful handling when dealing with in-sample and out-of-sample data, and Quants have to be very mindful of overfitting their models.
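A minimal sketch of that time-ordered discipline, on synthetic returns: the signal is lagged to avoid lookahead bias, a parameter is fitted in-sample, and the rule is then evaluated only on out-of-sample data. The numbers are invented and the "strategy" is purely illustrative.

```python
# In-sample / out-of-sample split for a toy backtest on synthetic data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
dates = pd.bdate_range("2020-01-01", periods=500)
returns = pd.Series(rng.normal(0, 0.01, len(dates)), index=dates)

# Lag the signal so it only uses information available before each day.
signal = returns.shift(1).rolling(20).mean()

# Split strictly by time: fit parameters in-sample, evaluate out-of-sample.
split = int(len(dates) * 0.7)
in_sample = signal.iloc[:split]
out_sample = signal.iloc[split:]

# "Fit" a threshold in-sample, then test the rule only on unseen data.
threshold = in_sample.quantile(0.6)
position = (out_sample > threshold).astype(int)
oos_pnl = (position * returns.iloc[split:]).cumsum()
print(oos_pnl.tail())
```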
Quantitative strategies trade on data that occurs at different frequencies. Some are interested in the most atomic level (tick data) while others prefer temporal data or bars (e.g. 1-minute OHLC). The underlying DNA of bar data is always good tick data.
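A short sketch of that relationship, building 1-minute OHLC bars from a synthetic tick stream:

```python
# Resampling tick data into 1-minute OHLC bars.
import numpy as np
import pandas as pd

# Synthetic tick prices standing in for a real tick feed.
rng = np.random.default_rng(2)
ticks = pd.Series(
    100 + rng.normal(0, 0.05, 5000).cumsum(),
    index=pd.date_range("2023-01-02 09:30", periods=5000, freq="250ms"),
    name="price",
)

# Aggregate the tick stream into 1-minute open/high/low/close bars.
bars = ticks.resample("1min").ohlc()
print(bars.head())
```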
Quants are always looking for more factors to bring into their models. Mining forecasts, power outputs and shipping data are new sources being used to look at an ever-connected world. The relationships between different content sets are becoming more central.
Bigger stakes
The lines are drawn in the sand between Quants and Data Scientists in other industries when it comes to the stakes. It must be said that both are concerned with prediction accuracy. However, the error margins and lifespans of models differ greatly. Recommending the wrong movie may be acceptable, but picking the wrong time to enter and exit a trade can have disastrous effects on a firm's P&L.
Quants have to minimize their market impact and deal with an increased level of complexity. The main challenge is dealing with the feedback loop their own behavior creates in the data being modeled. Risk management and evolving regulatory requirements are increasing the compliance and risk effort. When implementing their strategies, Quants need to know what the size of their bets should be to avoid the risk of ruin. The margins are a lot tighter.
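The author does not prescribe a sizing method, but as one common illustration of the bet-sizing question, here is a sketch of the Kelly criterion; the win rate and payoff ratio below are invented.

```python
# The Kelly criterion: one common framework for sizing bets to avoid ruin.

def kelly_fraction(win_prob: float, win_loss_ratio: float) -> float:
    """Fraction of capital to stake: f* = p - (1 - p) / b."""
    return win_prob - (1.0 - win_prob) / win_loss_ratio

# A hypothetical strategy that wins 55% of the time, with wins 1.1x losses.
f_star = kelly_fraction(0.55, 1.1)
print(f"Full Kelly stake: {f_star:.1%} of capital")

# In practice many practitioners bet a fraction of Kelly to reduce drawdowns.
print(f"Half Kelly stake: {0.5 * f_star:.1%} of capital")
```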
The Quantitative Analytics market landscape is changing and Quants are demanding more efficient services. Should they care what technology is under the hood if they get the service level they need? My opinion is no, as long as platforms are agile enough to offer new services as things evolve.
About Tom Kennedy
Tom Kennedy is the Global Head of Analytics and oversees the provision of analytical services needed for trading, investment, risk management and compliance. Tom’s professional career has been in the area of quantitative finance, having completed undergraduate studies in Computer Science and postgraduate studies in Finance.