Quantitative approaches to equity investing continue to introduce new sources of alpha. Recent advances in computing power, together with techniques such as machine learning and natural language processing, are generating insights in areas formerly reserved for human research. This has enabled increased analysis of non-traditional data sets that may provide valuable investment insights and a competitive edge among active equity investors.
Quantitative investing as we know it today originated in the 1980s with the formation of several quantitative investment firms, many of which are still thriving today. This fledgling industry benefited from a unique set of conditions.
Fast forward to 2024, and portfolio optimizers now use sophisticated non-linear optimization algorithms, allowing them to handle complex objective functions and constraints more effectively. The algorithms have also become more efficient, using parallel processing to achieve unprecedented scalability. This allows quant managers to optimize much larger portfolios with thousands of securities in their universes.
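To make the optimization step concrete, here is a minimal sketch of mean-variance portfolio construction with long-only, fully-invested constraints, solved by projected gradient ascent. All alphas and covariances are made-up illustrative numbers, and the simple iterative solver stands in for the commercial non-linear optimizers the text describes.

```python
# Illustrative mean-variance optimization: maximize alpha'w - lam * w'Cov w
# subject to weights summing to 1 and no short positions.
# All inputs are hypothetical; real optimizers handle thousands of assets.
risk_aversion = 5.0
alphas = [0.06, 0.03, 0.01]            # expected excess returns (made up)
cov = [[0.040, 0.005, 0.005],
       [0.005, 0.040, 0.005],
       [0.005, 0.005, 0.040]]          # covariance matrix (made up)

def optimize(alphas, cov, lam, steps=5000, lr=0.05):
    n = len(alphas)
    w = [1.0 / n] * n                  # start equal-weighted
    for _ in range(steps):
        # gradient of the objective alpha'w - lam * w'Cov w
        grad = [alphas[i] - 2.0 * lam * sum(cov[i][j] * w[j] for j in range(n))
                for i in range(n)]
        mean_g = sum(grad) / n         # project onto the sum-to-one constraint
        w = [max(0.0, w[i] + lr * (grad[i] - mean_g)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]     # renormalize to stay fully invested
    return w

weights = optimize(alphas, cov, risk_aversion)
```

The higher-alpha assets end up with larger weights, tempered by the risk-aversion penalty; production systems add transaction-cost terms, exposure limits and many more constraints.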
We are in a new era of technological advancement and would argue that most innovation in public equity portfolio management is taking place on the quantitative side. The convergence of computing power, novel data sets and new techniques is allowing portfolio managers to investigate and capture investment signals that were previously not available to them. Many of the new techniques are broadly categorized under "machine learning" (ML), a field of artificial intelligence that enables systems to identify patterns and make predictions from data, and also learn and improve from experience without being explicitly programmed. Below are some brief investment-related definitions for some of these technological advancements.
Novel data sets: Non-traditional sources of data from which investors can generate investment insights.
Natural language processing (NLP): The computational analysis of text, used to extract information from sources such as regulatory filings and earnings-call transcripts.
Large language models (LLM): AI-powered language models that can be used to query and summarize information from large bodies of text.
Generative AI: Models that can generate new, original content rather than simply analyzing existing data or making predictions.
Cloud computing/GPU computing: Give quantitative investors access to massive computational power, letting them achieve in hours what once took days.
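As a toy illustration of the NLP idea above, the sketch below scores the tone of an earnings-call snippet by counting words from small positive and negative lexicons. The word lists and transcript are invented for the example; real NLP signals use far larger dictionaries or trained language models.

```python
# Toy NLP sentiment signal: net positive-minus-negative word count,
# normalized by document length. Lexicons and text are illustrative only.
import re

POSITIVE = {"growth", "strong", "record", "improved", "exceeded"}
NEGATIVE = {"decline", "weak", "impairment", "missed", "headwinds"}

def sentiment_score(text):
    tokens = re.findall(r"[a-z']+", text.lower())
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / max(1, len(tokens))   # net tone per token

transcript = ("We delivered record revenue and strong margin growth, "
              "although currency headwinds persisted.")
score = sentiment_score(transcript)            # positive net tone
```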
These machine learning technologies all contribute towards the production of investment signals, also referred to as alpha signals. Investment signals produce stock-specific alpha scores that quants use as an input into their investment models to predict future price movements. They are utilized as a component of, or an addition to, traditional academically supported investment factors, such as value, quality and growth.
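A common way to turn a raw factor value into a stock-specific alpha score of the kind described above is cross-sectional standardization: each stock's value is z-scored against the universe. The tickers and numbers below are hypothetical.

```python
# Converting raw factor values into cross-sectional alpha scores via
# z-scoring against the universe mean. All figures are illustrative.
from statistics import mean, pstdev

raw = {"AAA": 0.10, "BBB": 0.03, "CCC": 0.05}   # e.g. raw earnings yields
mu = mean(raw.values())
sigma = pstdev(raw.values())
alpha_scores = {t: (v - mu) / sigma for t, v in raw.items()}
# scores are comparable across stocks and sum to zero across the universe
```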
Quantitative factors and investment signals are quantifiable characteristics or metrics used to assess the attractiveness of investing in a security.
One of the earliest known factors is the value factor, introduced by Benjamin Graham in his book "Security Analysis" (1934). Graham introduced into the collective consciousness of investors the idea of using a number associated with a company (e.g., price-to-earnings ratio) to make investment decisions.
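Graham's idea translates directly into a simple screen: rank stocks by earnings yield (the inverse of price-to-earnings), preferring cheaper names. The tickers and figures below are invented for illustration.

```python
# Graham-style value screen: rank stocks by earnings yield (E/P),
# the inverse of the price-to-earnings ratio. Data is hypothetical.
stocks = {
    "AAA": {"price": 50.0, "eps": 5.0},   # P/E 10
    "BBB": {"price": 90.0, "eps": 3.0},   # P/E 30
    "CCC": {"price": 40.0, "eps": 2.0},   # P/E 20
}

def earnings_yield(s):
    return s["eps"] / s["price"]          # higher means cheaper

ranked = sorted(stocks, key=lambda t: earnings_yield(stocks[t]), reverse=True)
# cheapest (highest earnings yield) first
```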
In 1992, the Fama-French three-factor model was introduced by Eugene Fama and Kenneth French, combining size, beta and value in a model used to explain stock returns. The following year, Narasimhan Jegadeesh and Sheridan Titman laid the foundations for momentum investing in a paper titled "Returns to Buying Winners and Selling Losers: Implications for Stock Market Efficiency".
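A standard formulation in the spirit of the Jegadeesh-Titman result is the "12-1" momentum signal: the trailing twelve-month return, skipping the most recent month to sidestep short-term reversal. The price series below is made up for illustration.

```python
# Classic 12-1 momentum signal: trailing 12-month return skipping the
# most recent month. The monthly price series is hypothetical.
monthly_prices = [100, 102, 105, 103, 108, 110, 112, 115, 113, 118, 120, 124, 122]
# 13 month-end prices: index 0 is 12 months ago, index -1 is today

def momentum_12_1(prices):
    # return from t-12 to t-1; the last month is excluded to avoid
    # contamination from short-term reversal effects
    return prices[-2] / prices[0] - 1.0

signal = momentum_12_1(monthly_prices)
```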
In the years following, academics and practitioners discovered a plethora of factors, populating what we now call the factor zoo and falling into broad classifications such as value, momentum, growth, quality and technical. Without question, quants today still rely heavily on academically supported factors as inputs to their models. But they are increasingly utilizing machine learning and novel-data-set-driven investment signals in their models. The following table compares some common established factors with a few examples of the newer investment signals used today.
The broad range of innovative investment signals being discovered by quants is driving an increasing share of the alpha within quantitative models and further differentiates quants from one another. These signals can also define a quant's competitive edge. When researching new factors and investment signals, we believe good candidates for inclusion must have the following attributes:
The last point explains the continuous and never-ending search for novel sources of data and new investment/alpha signals. With such a proliferation of factors and investment signals, quantitative managers must carefully consider the implications of adding them to their investment models. Techniques to do this have also evolved over time. A crude approach is to equally weight the predictions from multiple factors and average them. Another simple but more effective approach is to use a linear regression model, which gives more weight to factors that have proven to be more predictive in the past. Today, practitioners use a variety of factor combination techniques, considering non-linear effects as well as interactions between factors.
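The two simple combination approaches described above can be sketched as follows. The factor scores are hypothetical, and the regression weights are stated directly rather than fitted, standing in for coefficients a manager would estimate from historical data.

```python
# Two simple ways to combine factor scores into one signal:
# (1) a naive equal-weighted average, and (2) a weighted sum using
# regression-style weights fitted to past returns. All numbers are
# hypothetical; real models are estimated on large historical panels.

# per-stock factor scores: [value, momentum, quality]
scores = {
    "AAA": [ 1.2,  0.3, -0.5],
    "BBB": [-0.4,  1.1,  0.8],
    "CCC": [-0.8, -1.4, -0.3],
}

def equal_weight(s):
    # crude approach: average the factor scores with equal weights
    return sum(s) / len(s)

# weights a historical regression of returns on factors might produce
# (illustrative values, not fitted here)
reg_weights = [0.5, 0.3, 0.2]

def regression_combine(s, w):
    # weighted sum: more predictive factors get larger weights
    return sum(si * wi for si, wi in zip(s, w))

combined = {t: regression_combine(s, reg_weights) for t, s in scores.items()}
```

Modern approaches replace the fixed linear weights with non-linear models that can capture interactions between factors, as the text notes.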
Although perhaps counterintuitive, all of these rapidly evolving tools, data sets, and investment signals require more human oversight, not less. It is critical that quantitative investors apply their experience and expertise to the entire process to help ensure that their data and model outputs make strong fundamental sense and lead to sound investment decisions.
As we continue to push the boundaries of these technologies, the pursuit of alpha will go on, perpetuating the relentless search for new sources of insight and opportunity.