Speaker Interview: Dr Elliot Banks, Chief Product Officer at BMLL Technologies
1. Please give us a little introduction on your current role and what you do
My name is Dr Elliot Banks, and I’m the Chief Product Officer at BMLL, responsible for product development and product delivery, working closely with both clients and development teams to deliver BMLL's data and analytics. Prior to that, I was Chief Data Scientist at BMLL, leading a team of 10 responsible for the data science computing environment.
BMLL is the leading provider of historical Level 3 Data and analytics to banks, brokers, asset managers, hedge funds and global exchange groups. BMLL’s Level 3 data and analytics enable market participants to truly understand market behaviour across several asset classes and generate alpha more predictably. BMLL Level 3 Data provides full transparency of the order book, derived from every single insert, modify, execute or delete order message across every venue. BMLL clients access granular data to examine the behaviour of each individual order, including order fill probability, order resting time and order queue dynamics.
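For readers less familiar with Level 3 data, the sketch below shows, in broad strokes, how an order book can be reconstructed by replaying order-level messages. It is a minimal illustration only: the message fields, class names and toy stream are hypothetical and do not reflect BMLL's actual schema or API.

```python
# Minimal, illustrative sketch of Level 3 order book reconstruction.
# Message fields and names are hypothetical, not BMLL's actual schema.
from dataclasses import dataclass

@dataclass
class Order:
    order_id: int
    side: str      # "bid" or "ask"
    price: float
    size: int

class OrderBook:
    """Rebuilds book state by replaying every order-level message."""
    def __init__(self):
        self.orders = {}  # order_id -> Order

    def apply(self, msg: dict):
        kind = msg["type"]
        if kind == "insert":
            self.orders[msg["order_id"]] = Order(
                msg["order_id"], msg["side"], msg["price"], msg["size"])
        elif kind == "modify":
            o = self.orders[msg["order_id"]]
            o.price, o.size = msg["price"], msg["size"]
        elif kind in ("execute", "delete"):
            o = self.orders[msg["order_id"]]
            o.size -= msg.get("size", o.size)   # partial or full removal
            if o.size <= 0:
                del self.orders[msg["order_id"]]

    def best_bid(self):
        bids = [o.price for o in self.orders.values() if o.side == "bid"]
        return max(bids) if bids else None

    def best_ask(self):
        asks = [o.price for o in self.orders.values() if o.side == "ask"]
        return min(asks) if asks else None

# Replay a toy message stream and inspect the top of book.
book = OrderBook()
for m in [
    {"type": "insert", "order_id": 1, "side": "bid", "price": 99.5, "size": 100},
    {"type": "insert", "order_id": 2, "side": "ask", "price": 100.0, "size": 50},
    {"type": "execute", "order_id": 2, "size": 50},
]:
    book.apply(m)
print(book.best_bid(), book.best_ask())  # 99.5 None
```

Because every insert, modify, execute and delete is replayed, the full state of the book is known at every instant, which is what makes metrics such as fill probability, resting time and queue position computable.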
Last year, I drove the development and launch of BMLL’s data visualisation product, BMLL Vantage, for EU and US equities and ETFs. BMLL Vantage allows users to query BMLL’s Level 3 order book data without having to code, so that everyone has the opportunity to understand liquidity dynamics, carry out venue comparisons or make order routing decisions.
2. What do you consider your biggest professional achievement to date?
I think the real achievement is that we at BMLL took on a problem that had previously been addressed by only a handful of the most sophisticated firms in the world. BMLL took full-depth Level 3 data across multiple venues and made it easy and accessible for end users to work with and turn into alpha. Building a product that democratises the Level 3 dataset means that any firm can access that data and turn it into insights. As a result, Level 3 data is no longer solely the preserve of those few ultra-sophisticated firms; it can be accessed by anyone who wants to leverage the predictive power and insights of this dataset. If I look back over the last six years, that's something I'm incredibly proud of.
3. What do you think are the biggest challenges facing data scientists/AI experts/quantitative practitioners in 2023 and beyond?
I think the challenge for everyone in the world of data science, analytics and quantitative trading has been, and will continue to be, data. Expensive quant models are useless without high-quality, clean, consistent data that is easily accessible and usable. That is a theme we have seen over the past few years and are continuing to see with the likes of ChatGPT, for example: a chatbot is only as good as the data its developers are able to analyse. What we’re seeing now is a race for good, clean, quality data - not just data in a completely unstructured format, but data that is usable and that people can quickly derive insights from. That’s the thing that's going to be driving every quantitative firm across the planet.
4. What is your advice to funds hoping to get new systematic strategies into production quickly and more often?
For any sort of trading strategy, you need to take your hypothesis and input data, test it, learn from that test, and then feed what you learn back into your hypothesis - and keep doing that until you get a result with a Sharpe ratio high enough to do whatever you need it to do, whether that’s best execution or alpha. The best route to accelerating deployment of your strategies is having usable sets of data in an environment that is quick and easy to use. The quicker a quant can go through that iterative design process, the quicker they can query the data and actually deploy into production. For me, it's all about having that data and usability to be able to rapidly test a hypothesis. Once in production, you can monitor performance and recalibrate strategies accordingly. With clean, quick and easy data access, hypotheses can be turned into actionable results, whether that's for best execution or alpha.
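As a toy illustration of that hypothesis-test-refine loop, the sketch below backtests a simple moving-average signal on synthetic prices and reports an annualised Sharpe ratio for a few parameter choices. The data, signal and parameters are placeholders chosen for illustration, not a production strategy or anything BMLL-specific.

```python
# Toy illustration of the hypothesis -> test -> refine loop.
# Synthetic data and parameters are placeholders, not a real strategy.
import numpy as np

rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 2000)))  # synthetic price path
returns = np.diff(prices) / prices[:-1]                      # simple daily returns

def sharpe(strategy_returns, periods_per_year=252):
    """Annualised Sharpe ratio (risk-free rate assumed zero)."""
    if strategy_returns.std() == 0:
        return 0.0
    return np.sqrt(periods_per_year) * strategy_returns.mean() / strategy_returns.std()

def backtest(lookback):
    """Hypothesis: go long when price is above its moving average."""
    ma = np.convolve(prices, np.ones(lookback) / lookback, mode="valid")
    # Signal formed at time t, applied to the t -> t+1 return (no look-ahead).
    signal = (prices[lookback - 1:-1] > ma[:-1]).astype(float)
    return sharpe(signal * returns[lookback - 1:])

# Iterate over the parameter, keep what clears the bar, then refine the hypothesis.
for lookback in (10, 20, 50, 100):
    print(lookback, round(backtest(lookback), 2))
```

The point is not the strategy itself but the loop: the faster the data can be queried and the test rerun, the faster the hypothesis converges on something deployable.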
Speaking at Quant Strats
Dr Elliot Banks, Chief Product Officer at BMLL Technologies, will be speaking at Quant Strats on March 14, participating in the panel ‘Inflation, uncertainty and market risk – future-proofing your quantitative strategy’, taking place at 9.05 am. You can also join him at one of our Birds of a Feather tables, where you can exchange ideas, challenges and solutions on key themes within Quant Investing with your peers, over a glass of bubbly. To find out more about Elliot's panel and the full programme, download the agenda here.
5. How do you think firms can adequately utilise historical market data, given the scale and size of the data engineering challenge?
Across a number of funds there is a desire for ever-increasing granularity of information, yet also a challenge with regard to the enormous scale of today’s market data. One order book in the US might have 20 million updates on a large day. Multiply that by the thousands of order books trading across the different markets for US stocks, then add other asset classes and jurisdictions, and all of that data becomes incredibly difficult and time consuming to manage successfully. This translates into a race not only to understand what is going on in the market, but also to find good-quality datasets and get that data into a usable form.
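As a rough back-of-envelope illustration of that scale, the snippet below multiplies out some assumed figures. Treating the busy-book number as typical makes this an upper bound; the book count and message size are illustrative assumptions, not BMLL statistics.

```python
# Back-of-envelope estimate of daily message volume.
# All figures are illustrative assumptions, not BMLL statistics.
updates_per_book = 20_000_000      # one busy US order book on a large day
books = 5_000                      # order-of-magnitude count of US equity books
bytes_per_message = 50             # rough size of one order-level message

daily_messages = updates_per_book * books
daily_bytes = daily_messages * bytes_per_message
print(f"{daily_messages:.1e} messages/day, ~{daily_bytes / 1e12:.0f} TB/day raw")
```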
Typically, we would advise firms to examine what data needs to be kept in house and what should be moved out. When it comes to market data, for example, does your firm need to hold a copy of a historical dataset that is in the public domain? We also suggest firms look at their technologies to optimise quick and easy access to data, so they can focus on the output of their quants by sourcing good-quality data rather than spending time on the exercise of cleaning it.
BMLL focuses on historical market data and makes it as simple as possible for firms to access that data without having to worry about protocol updates or format changes from different venues. Instead of having to take on the enormous job of cleaning a petabyte-scale data lake, data engineers who use BMLL data can focus on delivering insight for alpha or execution strategies, and turning that into production.
6. What are your/your firm’s top 3 priorities for the coming year?
We were very pleased to have raised USD 26 million in our Series B funding towards the end of last year. The round was led by Nasdaq Ventures, FactSet and IQ Capital’s Growth Fund, supported by ACF Investors and other new and existing investors. The funding will support investment in acquiring new datasets globally, and we will continue to add those datasets into our BMLL Data Lab product, which allows quants to access all of that data and run more extensive and deeper analysis across different regions and venues.
We are also planning to build out our BMLL Vantage product, which was launched at the end of last year. We want to add more analytics features, more datasets and more usability functionality, to make it as easy as possible for data scientists and non-data scientists alike to understand the predictive power of Level 3 data, whether for best execution or for alpha generation purposes.
Last but not least, we are also building on our existing presence in North America. We recently hired industry veteran Rob Laible as Head of Americas and Jenny Chen as Head of Sales (Americas), and will be looking to open a US office in the near future.
Don't miss your chance to hear from Dr Elliot Banks, Chief Product Officer at BMLL Technologies, on March 14. Register here today!