Jacob Buitelaar is co-head of the equity portfolio engineering & trading team at Robeco, based in Rotterdam. He spoke recently with Kurtosys, where he explained the hype versus the reality of applying big data to investment research processes; offered a refreshing take on the concept of diversity; and suggested that the ability to narrowly customise a quant or factor-based strategy should lead to greater inclusion across a wide variety of client portfolios.
We are seeing an acceleration in the rise of quantitative models in recent years – what do you put that down to?
While quant investing has existed for a long time – and at Robeco we have been applying these insights since the 1990s – we are seeing it become more mainstream. I think that is down to a couple of things, but primarily to education from quant fund managers, who are more open and getting better at explaining what they do. It's really quite logical and something that everyone understands and can do. It is certainly no longer about the 'black box'.
What about suitability? Are there certain types of clients that quant models are best suited to?
On the one hand we are seeing more uptake from retail clients, but also, within the institutional space, interest from a much broader group of investors who appreciate the well-understood factors that have a lot of academic research behind them and are building up a decent track record.
For most clients a well-diversified basket of factors is just a really good investment. For more sophisticated investors, part of the appeal of quant is also the ability to customise in a transparent manner; giving you the building blocks you can then tailor in any way you want.
For example, we have a long-standing history in the sustainability space, so clients can set certain restrictions – they may want to focus on companies reducing their CO2 exposure, for instance. Because these models are so systematic, and combined with our experience, it is relatively easy for us to integrate such requirements into one process: setting the level of tracking error against a particular benchmark, or adding a particular factor into their overarching portfolio if they are lacking some value exposure, for instance. You can build a really tailored product that complements their existing portfolio in terms of diversification.
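As a rough illustration of this kind of customisation, the sketch below tilts a benchmark toward a value factor and checks the result against a client's tracking-error budget. The covariance matrix, weights, factor scores and budget are all made-up numbers for illustration; this is not Robeco's actual process.

```python
import numpy as np

# Hypothetical illustration: tilt a benchmark toward a factor and check
# the active risk against a client-set tracking-error budget.

cov = np.array([[0.04, 0.01, 0.01],   # made-up annualised covariance of 3 stocks
                [0.01, 0.09, 0.02],
                [0.01, 0.02, 0.16]])
benchmark = np.array([0.5, 0.3, 0.2])        # benchmark weights
value_score = np.array([0.8, -0.2, 0.4])     # standardised value-factor scores

def tracking_error(active):
    """Ex-ante tracking error: sqrt(w_a' * Sigma * w_a)."""
    return np.sqrt(active @ cov @ active)

def tilt(benchmark, scores, strength):
    """Tilt weights toward high factor scores; demeaned scores keep weights summing to 1."""
    active = strength * (scores - scores.mean())
    return benchmark + active, active

te_budget = 0.02   # a 2% tracking-error limit set by the client (illustrative)
weights, active = tilt(benchmark, value_score, strength=0.05)
print(f"ex-ante TE: {tracking_error(active):.4f}")
```

Because the factor scores are demeaned before tilting, the active weights sum to zero, so the tailored portfolio stays fully invested while taking on only as much active risk as the client's budget allows.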
What are some of the key messages that you would need to communicate when marketing these types of strategies?
One is that as with any investment strategy, it’s not a panacea; even with a long track record of performance, there will always be drawdowns. Even if you’re investing in well-understood factors, they will not always work, so you need to be aware and understand you are committing for the long run. The other point is around transaction costs, which can directly reduce your alpha, so the more you can lower transaction costs, the better. And that’s not just about applying very clever trading algorithms but also comes down to portfolio construction.
More and more fund groups are talking about the increased use of big data and artificial intelligence and how they’re applied to either stock selection or in their research process. How are these emerging technologies being applied at Robeco?
This is where I spend a lot of my time. At the core, investment management is about processing data. We take in data from all kinds of sources and then try to turn that into insights, be those on generating alpha, making stock picks or sector allocations. That applies as much to fundamental managers as quant, just the way we both do things is different. A quant manager will try to apply their approach across the whole universe, while the fundamental manager will zoom into particular companies, but at the end of the day they are both trying to do the same thing.
Of course, you need a lot of talent to turn data into insights, but you need to feed those people with the right data and the right information. I don’t think that’s any different from what it was 20-30 years ago. It’s just that a lot more data is becoming available, which makes the job easier because there’s more information to use but then on the other hand, everyone has access to that information, so to be competitive, you need to do that better.
There are stages to doing it properly, and it starts with collecting relevant data. While all asset managers have a data management department for operational processes, this is about the data needed to make investment decisions, which can sit anywhere: in reports, news or online channels.
First you need to collect all the data, then store it, organise it and make it available, both for the users and for the machines. That in itself is incredibly valuable, providing easy access to lots of relevant information.
Only once you have collected and organised the data can you bring in AI to analyse it and see if you can identify hidden correlations or new ways of combining it – providing real insights to analysts and portfolio managers if there is value to be had there, or, on the quant side, new signals that add to the quality of your factors.
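The collect, organise, analyse sequence described above can be sketched in miniature. The sources, field names and numbers here are all hypothetical; the point is simply that analysis only happens once heterogeneous data has been normalised into one queryable store.

```python
import statistics

# Minimal sketch of the collect -> organise -> analyse pipeline.
# Records, tickers and sentiment values are hypothetical.

# 1. Collect: raw items arrive from different sources in different shapes.
raw = [
    {"source": "report", "ticker": "AAA", "sentiment": 0.5},
    {"source": "news",   "ticker": "AAA", "sentiment": 0.25},
    {"source": "news",   "ticker": "BBB", "sentiment": -0.5},
]

# 2. Organise: normalise into one store, keyed so people and machines can query it.
store = {}
for item in raw:
    store.setdefault(item["ticker"], []).append(item["sentiment"])

# 3. Analyse: only now derive a signal per company, e.g. average sentiment.
signal = {ticker: statistics.mean(vals) for ticker, vals in store.items()}
print(signal)   # {'AAA': 0.375, 'BBB': -0.5}
```

In practice the "store" would be a data lake and the "analyse" step far more sophisticated, but the ordering of the stages is the same.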
Where does cloud computing fit into all this?
In the past it was incredibly hard to build these data lakes – places you can basically store all your data in a number of different formats, while having a computing platform on top of it to help you process it all.
But cloud providers – be it Amazon Web Services, Microsoft Azure or Google – are now competing very hard with each other in terms of infrastructure. Whereas once upon a time an asset manager would have had to make huge investments to build the technology themselves, now you can effectively rent it, which is making the technology more easily available to everyone.
So if the data itself is more widely available, you are reliant on the insights to give someone a competitive edge?
Yes. The technology is not the major differentiator, it is essentially what you do with the data, and the quality of your research process that helps you generate alpha, which is why you need really smart people to be able to process it.
What are some of the challenges presented by these sorts of data-driven strategies?
At company level, there are really big challenges. In the past, you had an IT department that was responsible for maintaining and providing all the infrastructure on which the rest of the company runs. With the emergence of cloud computing and platform- or infrastructure-as-a-service, the barriers to entry become much lower. When a small team can manage that on their own, you can move with a lot more speed, which is especially important for quants when you're doing research.
Also, even though access to technology becomes easier, it’s still important to get it right. You need to think carefully about infrastructure design, choosing the right technologies and staying on top of the latest developments. And, because your data is in the cloud, security is incredibly important.
We hear about fund management groups hiring in engineers from other sectors, such as rocket scientists or F1 engineers. Are they really necessary, or able to provide an edge, or is it just good PR?
I think it can really help to diversify the skillset. Naturally, you’ll find a lot of people with econometrics backgrounds in investment teams, or people with quantitative or economic backgrounds; we tend to have all been trained along the same lines. As the application of data grows, there is a lot of value in seeing how people do things in other areas, so we have been hiring people from a mechanical engineering background, for example. They may not know about portfolio construction but they will know about modelling a robotic arm, so when it comes to problem solving, they can bring a fresh perspective.
It is really about diversity. You can think about diversity in many different ways but here we are talking about diversity of perspective, which adds a lot of value.
So much of our industry is focused on ethnic or gender-based diversity, yet as you point out, the broad concept reaches so much further.
More than ever we are open to actively seeking out people who have different backgrounds. Whether you're a quant investor, a techie or a fundamental investor, as things become more data-driven there is benefit to bringing in people from more technical backgrounds, bringing in skills that might have been under-represented.
How much of data or AI in asset management is hype, and how much can you actually demonstrate additional investment benefits, or alpha?
There’s enormous hype around everything to do with big data and AI – not just in our industry, but across the world. The latest is self-trading machines that supposedly make better investment decisions than humans, for example.
But while we hear about it, we have yet to see anyone really being successful. The issue is that with ‘black box’ models, the only way to test them is to run them in practice, and if 1,000 people test something, a couple will be successful over the long run just by chance.
What I would say is that we are sceptical while trying to stay open-minded. That means that, in line with our investment philosophy, we don’t just look at empirical results, but also at the economic rationale, and we focus on implementing this in a prudent and transparent way.
On the other hand, I think that using more data sets than the traditional ones of accounting data, market data, analyst estimates and so on will inevitably improve your signals, or help you find new factors or signals you can add to your mix – news sentiment, for example.
Many of the new data sets are correlated with, but different from, the things we’ve always used, so we are using them to enhance, rather than replace, the traditional processes. We see this very much as evolution rather than revolution.
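This "enhance rather than replace" idea might look, in a highly simplified sketch, like blending a standardised new signal with a traditional one. The value and sentiment scores and the 30% weight below are made up purely for illustration.

```python
import statistics

# Hedged sketch: blend a newer signal (news sentiment) with a traditional one
# (a value score) instead of replacing it. All inputs are hypothetical.

value     = [1.2, -0.5, 0.3, -1.0]   # traditional value scores, one per stock
sentiment = [0.4,  0.1, 0.6, -0.9]   # news-sentiment scores for the same stocks

def zscore(xs):
    """Standardise a signal so data sets on different scales become comparable."""
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

# Enhance, not replace: a modest weight on the new signal on top of the old one.
w_new = 0.3
combined = [(1 - w_new) * v + w_new * s
            for v, s in zip(zscore(value), zscore(sentiment))]
```

Standardising first means neither data set dominates by scale, and keeping the new-signal weight modest reflects the evolutionary, rather than revolutionary, role described above.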