The History of Quantitative Analysis
Posted: February 2020
Stock prices reflect expectations about the future, and the practice of forming those expectations by analyzing information has gone on for hundreds of years. The most common information has been about the companies themselves and their financial statements. Although financial statements for public companies have been required since the 1930s, the financially savvy used them to ascertain the value of a company long before that time. Combined with other observations about industries and consumer tastes, this financial analysis forms the basis of fundamental analysis.
Charts and records of the prices of traded commodities and stocks go back thousands of years. Merchants along the silk routes kept records of traded goods to examine price trends, hoping to profit from them. Today, market ‘technicians’ study a variety of market data, seeking patterns that give clues to future price behavior.
Quantitative analysis is relatively new. For practical purposes, it is only about twenty-five years old, and it is still not fully accepted, understood or widely used. First, what is quantitative analysis? It is the scientific study of how certain variables statistically correlate with stock price behavior. The key word here is statistical, for it implies sample sizes large enough to let an analyst virtually “prove” certain behaviors in the stock market. The first work along these lines was done in the seventies, and perhaps the sixties, by academics on mainframe computers. The only historical data they had were prices and volume. Some early work in price momentum modeling was done at this time. It was slow, expensive and largely confined to automating technical analysis.
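To make the idea concrete, here is a minimal sketch of such a correlation study. Everything in it is invented for illustration: a synthetic universe of stocks, a momentum score, and a forward return that is deliberately given a weak link to the score. The point is only that with a large sample, even a small true relationship becomes statistically measurable.

```python
import random

random.seed(42)

# Synthetic cross-section: each "stock" gets a momentum score and a
# next-period return weakly tied to that score, plus heavy noise.
n = 500
momentum = [random.gauss(0, 1) for _ in range(n)]
fwd_ret = [0.02 * m + random.gauss(0, 0.05) for m in momentum]

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length series."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# With 500 names, a weak signal buried in noise still surfaces
# as a stable positive correlation.
corr = pearson(momentum, fwd_ret)
print(f"factor/return correlation: {corr:.3f}")
```

On a single noisy stock this relationship would be invisible; across hundreds of stocks it is the large sample size that does the “proving” the article describes.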
By the early eighties, services were born that collected broader data of more value to the analytical community. Spreadsheets were invented on PCs, enabling individuals to model companies’ financial performance. The spreadsheet, however, did not lend itself to mass production across thousands of companies. Quantitative analysis needed the convergence of three major developments before it could blossom and become commonplace.
The first development was the collection of the actual data. Firms such as Zacks and IBES started collecting earnings estimates at regular intervals so one could see them change over time. Compustat was busy collecting financial data on thousands of companies and making it publicly available. Jeff Parker created First Call, a brilliant idea that electronically collected earnings estimates and other research from major brokerage firms in almost real time, then downloaded it into his clients’ computers. Databases became more complex and more accurate. Most important, there was now historical data on more than just stock prices.
The second development was that this data needed to be easy to manipulate, so that one could model, backtest and create new variables from existing data without being a crack programmer. Writing code to do regression analysis was no longer necessary; statistical analytics became part of the software. Zacks pioneered this area and created sophisticated software, especially for backtesting. FactSet also has powerful software, and many others have since cropped up with analytical packages. This accelerated the creative process: models could be quickly and easily tested, making the analysis more valuable.
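At its core, what that backtesting software automates can be sketched in a few lines. The sketch below is not any vendor’s method; the universe size, the factor, and the return model are all assumptions for illustration. Each period it ranks stocks on a factor, “buys” the top fraction, and records the spread over the equal-weight universe return.

```python
import random

random.seed(7)

def backtest_factor(n_stocks=200, n_periods=60, top_frac=0.2):
    """Each period: rank stocks on a factor score, hold the top
    fraction, and record its return spread over the full universe."""
    spreads = []
    for _ in range(n_periods):
        # Invented factor scores and returns with a small built-in link.
        scores = [random.gauss(0, 1) for _ in range(n_stocks)]
        rets = [0.01 * s + random.gauss(0, 0.06) for s in scores]
        ranked = sorted(range(n_stocks), key=lambda i: scores[i], reverse=True)
        top = ranked[: int(n_stocks * top_frac)]
        top_ret = sum(rets[i] for i in top) / len(top)
        universe_ret = sum(rets) / n_stocks
        spreads.append(top_ret - universe_ret)
    # Average spread per period is the backtest's headline number.
    return sum(spreads) / len(spreads)

avg_spread = backtest_factor()
print(f"average per-period spread of top bucket: {avg_spread:.4f}")
```

The value of packaged software was precisely that an analyst could run this loop over real historical databases, swap in new factors, and see results in minutes rather than writing such code from scratch.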
The third development was the PC and the rapid improvement of its computing power. The desktop enabled people to work independently without tying up the firm’s mainframe. I used to run quantitative software on a 486 chip, a very slow process even with much smaller databases. As the power of the PC grew, it became quite adequate for the needs of the quantitative analyst.
By the late eighties, seminars cropped up where people would discuss their techniques, their models and “what works.” These died out within a few years as people realized that “what works” is a very valuable secret. Successful quant analysts have gone underground. Most work for money management firms or on the buy side of brokerage firms. I have watched them disappear, never to be heard from again, as if they had been kidnapped.
If their approach is so different from that of fundamental and technical analysts, what do these quantitative analysts actually do with all these statistics? What is the value added? The answer is twofold, and the first part lies in an age-old investment problem: when an investor is faced with a variety of alternatives, how does one choose? How can one compare a large, quality growth stock with rapid growth and a high P/E to a low-priced value stock? How do you choose between a financial stock and a commodity company such as a mining stock? How do you handicap a strong versus a weak balance sheet?
The quantitative analyst can compare different valuation methods and attributes and, through statistical backtesting, optimize the trade-offs among them. The process can be automated so that the investor can examine thousands of companies simultaneously, in real time, comparing relative attractiveness before the prices change. This capability is very valuable to portfolio managers because it becomes the framework of the research process. It provides discipline and focuses the fundamental and technical research effort on the stocks most likely to outperform.
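One common way to make a growth stock, a value stock and a miner comparable on a single scale is to standardize each attribute across the universe and blend the z-scores into a composite rank. The sketch below uses a tiny invented universe and illustrative weights; a real model would backtest the weights rather than assert them.

```python
import statistics

# Invented universe: (name, earnings yield, growth rate, momentum).
universe = [
    ("GrowthCo", 0.02, 0.30, 0.15),
    ("ValueCo",  0.09, 0.02, -0.05),
    ("MinerCo",  0.07, 0.05, 0.10),
    ("BankCo",   0.08, 0.04, 0.02),
]

def zscores(values):
    """Standardize one attribute so different units become comparable."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

# Standardize each attribute column, then blend with assumed weights.
cols = list(zip(*[row[1:] for row in universe]))
z = [zscores(col) for col in cols]
weights = [0.4, 0.3, 0.3]  # illustrative trade-off, not an optimized one
composite = [
    sum(w * z[a][i] for a, w in enumerate(weights))
    for i in range(len(universe))
]

# Rank the whole universe on one number, highest composite first.
ranked = sorted(zip(universe, composite), key=lambda t: t[1], reverse=True)
for (name, *_), score in ranked:
    print(f"{name:10s} {score:+.2f}")
```

The same loop scales from four names to thousands, which is exactly the screening discipline the paragraph describes: every stock, whatever its style, is scored on the same footing before the analyst looks closer.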
Because this was new in the late 1980s and not yet pervasive, the market inefficiencies in the quantitative world were huge compared to those in the more mature disciplines. Combining these heterogeneous elements into a creative screening model is not easy. As in cooking, the data ingredients are at everyone’s disposal, yet even with the proper software there can be a vast difference in the “cooking” produced by different analysts. There simply were not many quantitative analysts, and it was the first and best of them who created the big excess returns.
Since then, quantitative analysis has become much more pervasive, eroding the excess returns of the 1990s. Today, artificial intelligence and more sophisticated modeling have taken over; math doctorates and large-scale efforts are almost a prerequisite for competing in this approach.