Throughout history, investors have used technology to gain an informational edge in markets. This was true in the famous story of Nathan Rothschild's use of carrier pigeons to learn the outcome of the Battle of Waterloo ahead of other traders. It was equally true in the more recent latency race among high-frequency traders, who moved from laying fiber between Chicago and New York to funding microwave towers for microsecond reductions in latency. Until recently, however, technology offered relatively little informational edge in the fundamental analysis of single stocks.
Even as quantitative investment strategies grew their assets by hundreds of billions of dollars, fundamentally focused funds tended to rely on human aggregation of data as the core of their analysis processes. Portfolio managers and research analysts pore over corporate filings and earnings calls, speak to expert networks, and meet with management to build a mosaic of information, from which they extract their forecasting edge on a company's equity or bonds. We have recently started to see this process evolve. Instead of relying solely on human-produced insights, some portfolio managers and researchers are now incorporating content produced by machine-learning and artificial-intelligence-based systems. Hardware and software advancements, combined with the emergence of many large, relevant datasets, have made it possible to use machine-learning tools to gain insights from quantitative and, more recently, text-based financial data. As these tools continue to improve and access broadens, we expect fundamental investment processes to evolve to incorporate machine-generated insight.
At Intellectus we are deeply committed to the development and implementation of a variety of machine-learning techniques to benefit our clients. While we believe in factor-based models and quantitative analysis in investing, artificial intelligence is a different approach: it relies on philosophical models as much as purely mathematical ones.
These changes to fundamental investment analysis are being driven by the parallel development of several tools over the past few years. Cloud-based storage and compute have simplified the process of maintaining datasets and scaling computation with user demand. Over the same period, alternative datasets have become more readily available, likely aided by the ease of use that cloud compute networks provide. Software libraries have emerged to simplify the process of analyzing large datasets, some of them facilitating the development of machine-learning and neural-network-based models. Less visible than the first three, the open-sourcing of word-vector libraries for use in Natural Language Processing (NLP) enables the rapid analysis of terabytes of text related to a company. Finally, computer vision techniques are turning images of everyday life into highly valuable datasets. These tools have evolved in a relatively coordinated fashion and are intrinsically linked to an array of problems being solved both in financial services and in other areas of the economy.
Cloud services from Google, Amazon, and Microsoft have emerged and evolved over the past few years. In addition to outsourcing compute and storage, specialized processing has become available, with the ability to source machine-learning-specific processors such as GPUs and Google's TPUs. TPUs (Tensor Processing Units) have been available for training and running deep-learning models on Google Cloud since 2017. The availability of machine-learning-specific processors at a minimal hourly rate vastly changes the types of analysis an investment manager can do.
Cloud computing and storage have paved the way for the emergence of large alternative datasets. Formerly, large datasets had to be stored on-site and updated on a regular basis. Now a large dataset can be housed in the cloud, access can be controlled with API keys, and processing can be sourced either from the cloud or locally. Datasets like Yodlee (credit card transactions), Alexa (web traffic), Orbital Insight (geospatial data), and many others have given fundamental investors the tools to improve earnings prediction and reduce estimation error for a wide array of companies.
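To make that workflow concrete, here is a minimal Python sketch of querying a cloud-hosted alternative dataset with an API key. The endpoint, key, metric name, and JSON shape are all hypothetical stand-ins; real providers such as Yodlee or Orbital Insight each define their own APIs and schemas.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint and API key -- illustrative only, not a real provider.
BASE_URL = "https://api.example-altdata.com/v1/panel"
API_KEY = "YOUR_API_KEY"

def build_request_url(ticker, metric, start, end):
    """Compose an authenticated query URL for a cloud-hosted dataset."""
    params = {"ticker": ticker, "metric": metric,
              "start": start, "end": end, "api_key": API_KEY}
    return f"{BASE_URL}?{urlencode(params)}"

# A stubbed JSON payload standing in for what such an API might return.
sample_response = json.loads("""
{"ticker": "XYZ",
 "metric": "card_spend_index",
 "observations": [
   {"date": "2019-01-31", "value": 102.4},
   {"date": "2019-02-28", "value": 105.1}]}
""")

def latest_observation(payload):
    """Return the most recent (date, value) pair from the response."""
    obs = sorted(payload["observations"], key=lambda o: o["date"])
    return obs[-1]["date"], obs[-1]["value"]

url = build_request_url("XYZ", "card_spend_index", "2019-01-01", "2019-03-01")
print(latest_observation(sample_response))
```

An analyst can feed series like this directly into an earnings model, replacing the old cycle of shipping and refreshing on-site data.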
The development of open-source software libraries like TensorFlow and GloVe (Global Vectors for Word Representation) has decreased development time for machine-learning-based analysis. Researchers can now use analysis-specific software libraries on cloud-based machines while incorporating novel datasets. Libraries like GloVe enable the large-scale analysis of text data related to specific investments; this text data can be correlated with changes in fundamental data, technical data, and stock prices. The combined effect of these innovations is to enable the large-scale analysis of financial text and quantitative data, so that financial analysis can now be augmented by machine-based output. The value proposition is one where it is far more logical to incorporate machine-based output into the investment process than to wait and see. The potential edge available from machine learning in finance is significant, and businesses that choose not to incorporate it expose themselves to a significant risk of losing their "alpha" edge.
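As a small illustration of what word-vector libraries like GloVe provide, the sketch below scores word relatedness with cosine similarity: related terms score closer to 1 than unrelated ones. The tiny four-dimensional vectors are invented stand-ins; real pretrained GloVe embeddings have 50-300 dimensions and are loaded from a text file of `word v1 v2 ... vN` rows.

```python
import math

# Toy "word vectors" standing in for pretrained GloVe embeddings.
vectors = {
    "revenue":  [0.9, 0.8, 0.1, 0.0],
    "earnings": [0.8, 0.9, 0.2, 0.1],
    "weather":  [0.1, 0.0, 0.9, 0.8],
}

def cosine_similarity(u, v):
    """Angle-based similarity in [-1, 1]; closer to 1 means more related."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

sim_related = cosine_similarity(vectors["revenue"], vectors["earnings"])
sim_unrelated = cosine_similarity(vectors["revenue"], vectors["weather"])
print(round(sim_related, 3), round(sim_unrelated, 3))
```

Scaled up to millions of documents, similarity scores like these let a researcher group filings and call transcripts by topic and test those groupings against fundamental and price data.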
Just as the pervasiveness of the internet forced all companies to adapt and, in effect, become internet companies, all investment managers and advisors will integrate AI deeply into their processes. We expect broadening adoption because the choice is relatively simple: use machine-learning-based output to benefit your investors (or, at worst, add little value), or choose not to and risk losing your edge.
Head of Portfolio Structure, Risk and Alternative Data