Physics needs math, and so does finance. It is no wonder, then, that curious physicists began building mathematical models of financial markets in the 19th century, only to find the task harder than imagined. Advances during the 20th century ultimately generated great wealth for a group of savvy traders known as the “quants”. Their sophisticated trading models worked for a while, until their widespread use by Wall Street firms contributed to the collapse of the global financial system in 2008. New models continue to evolve today. This book pays tribute to the scientists who tackled the problem of modeling financial markets.
The evolution of market models
- Bachelier made the first price-distribution model of stock markets. His normal distribution worked only at the Paris Bourse, where prices varied little.
- Osborne postulated that returns, not prices, are normally distributed. His ‘random walk hypothesis’ expanded the understanding of price variation.
- Mandelbrot claimed that distributions of financial markets are more variable than previously thought.
- Thorp, Black, and Scholes turned price-distribution models into algorithms for daily trading. Their options-pricing model was adapted from Osborne’s work. Black later described its shortcomings in his paper “The Holes in Black-Scholes”.
- The Prediction Company of Farmer, Packard, and McGill used black-box models to improve on Black-Scholes, capitalizing on short-term inefficiencies in the market.
- Didier Sornette predicted market catastrophes.
Today’s market models are still imperfect!
Early Probability Theory, 16th-17th centuries
1526: Cardano wrote an unpublished book on the theory of probability based on the odds of dice games. For example, what is the chance that two dice sum to 10?
- mathematical odds: 3 outcomes out of 36 possibilities, or 1 in 12.
- betting odds: 33 to 3 against, or 11 to 1. Bet $1 and either lose it or win $11 plus the refunded $1.
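Cardano’s count can be checked by brute-force enumeration; a minimal sketch in Python:

```python
from itertools import product

# Enumerate all 36 equally likely outcomes of rolling two dice
# and keep those that sum to 10.
outcomes = list(product(range(1, 7), repeat=2))
favorable = [o for o in outcomes if sum(o) == 10]  # (4,6), (5,5), (6,4)

print(len(favorable), "of", len(outcomes))  # 3 of 36, i.e. 1 in 12
```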
1654: Pascal and Fermat established the modern theory of probability based on various gambling games. They realized that a probability is a chance, not a certainty. In the 20th century it was shown that observed frequencies converge to the underlying probability over an unlimited number of trials [Law of Large Numbers].
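The Law of Large Numbers can be demonstrated by simulation; in this sketch the function name and trial counts are illustrative:

```python
import random

random.seed(42)

# Empirical check of the Law of Large Numbers: the observed frequency of
# rolling a sum of 10 with two dice converges to the exact probability
# 3/36 = 1/12 as the number of trials grows.
def frequency_of_ten(trials):
    hits = sum(1 for _ in range(trials)
               if random.randint(1, 6) + random.randint(1, 6) == 10)
    return hits / trials

for n in (100, 10_000, 1_000_000):
    print(n, frequency_of_ten(n))
```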
Random behavior of market prices, 19th-20th centuries
Bachelier invented mathematical finance in the late 19th century. His graduate thesis applied probability theory to market speculation. In his ‘efficient market’ theory, Bachelier assumed that future prices take a ‘random walk’ within limits traced out by a bell-shaped curve; in other words, stock prices have a normal distribution with a stable average. Bachelier’s ‘random walk’ model spawned two books in the 20th century.
- 1947: Samuelson published “Foundations of Economic Analysis”.
- 1964: Paul Cootner published Bachelier’s thesis in “The Random Character of Stock Market Prices”.
1959: An astronomer named Maury Osborne wrote “Brownian Motion in the Stock Market”. Osborne dismissed the idea that stock prices themselves are normally distributed; instead, the rate of return is. His plot of stock prices was not bell-shaped but lump-shaped, with a long tail to one side. Osborne was the first to show the importance of the log-normal distribution of prices in markets.
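Osborne’s observation can be reproduced in miniature: if log-returns are normally distributed, the resulting price is log-normal. In this sketch, `mu`, `sigma`, and the horizon are illustrative values, not calibrated to any real market:

```python
import math
import random

random.seed(0)

# If daily log-returns are drawn from a normal distribution, the
# resulting price is log-normally distributed: never negative and
# skewed, with a long tail to the upside.
mu, sigma, days = 0.0005, 0.01, 250   # illustrative parameters

def simulate_price(start=100.0):
    log_price = math.log(start)
    for _ in range(days):
        log_price += random.gauss(mu, sigma)  # normally distributed return
    return math.exp(log_price)                # log-normal price

prices = sorted(simulate_price() for _ in range(10_000))
mean = sum(prices) / len(prices)
median = prices[len(prices) // 2]
print(round(mean, 2), round(median, 2))  # mean exceeds median: right-skewed
```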
1960s: Benoit Mandelbrot discovered fractal geometry and applied its insights to understanding markets. His method described extreme market events.
1965: The issue was whether to analyze stock prices with Osborne’s or Mandelbrot’s method. Today’s consensus is that rates of return are fat-tailed with an unstable average.
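The practical difference between thin and fat tails shows up in a small simulation. This sketch uses a standard Cauchy distribution as a stand-in for a fat-tailed return distribution (a choice for illustration, not the book’s model):

```python
import math
import random

random.seed(1)

# Compare draws from a thin-tailed normal distribution with draws from a
# fat-tailed Cauchy distribution (sampled by inverting the Cauchy CDF).
# The Cauchy distribution has no stable average at all.
N = 100_000
normal = [random.gauss(0.0, 1.0) for _ in range(N)]
cauchy = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(N)]

print(max(abs(x) for x in normal))  # stays within a few standard deviations
print(max(abs(x) for x in cauchy))  # routinely produces enormous outliers
```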
1973: Burton Malkiel adopted Osborne’s work in a book called “A Random Walk Down Wall Street”.
Bachelier, Osborne, and Mandelbrot neither traded nor earned profits; they were academics.
1961: Edward O. Thorp beat the game of roulette with a successful strategy. He later showed that math models could earn profits from financial markets by operating a hedge fund. Thorp believed the stock market is the world’s biggest casino: buying stock is a bet that the price will rise, selling stock is a bet that it will fall, and the true price of a stock is where the odds of winning and losing are equal. He devised the ‘delta hedging’ strategy of picking the right mix of warrants and stocks to earn a consistent 20% annual return. The idea was to simultaneously short-sell warrants and buy the underlying stocks; the stocks would soften the impact of a bad bet and augment the impact of a good one.
1967: Thorp co-authored the book, Beat the Market. Jay Regan, a stock broker, partnered with Thorp to create the Princeton-Newport Partners hedge fund.
1969: Fischer Black derived a relationship destined to become the Black-Scholes-Merton model for the pricing of options. Black made quantitative finance an essential part of investment banking.
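Thorp’s delta-hedging idea and the option-pricing formula can be sketched together. This is a minimal illustration using the standard Black-Scholes formula for a European call; the parameters `S`, `K`, `T`, `r`, `sigma` below are made up, not taken from the book:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price and delta of a European call.
    S: stock price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: annual volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    price = S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
    return price, norm_cdf(d1)   # delta = N(d1)

# Illustrative (not historical) parameters.
S, K, T, r, sigma = 100.0, 100.0, 1.0, 0.05, 0.20
price, delta = black_scholes_call(S, K, T, r, sigma)

# Thorp-style delta hedge: short one call, hold `delta` shares.
# For a small move in S, the hedged position barely changes in value.
dS = 1.0
new_price, _ = black_scholes_call(S + dS, K, T, r, sigma)
unhedged_pnl = -(new_price - price)       # short call alone
hedged_pnl = unhedged_pnl + delta * dS    # short call plus delta shares
print(round(price, 2), round(delta, 3))
print(round(unhedged_pnl, 3), round(hedged_pnl, 3))
```

The hedged loss is an order of magnitude smaller than the unhedged one, which is the sense in which the stock position “softens the impact of a bad bet”.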
1991: Physicists J. Doyne Farmer and Norman Packard studied nonlinear forecasting. Given a chaotic process such as a financial market, their goal was to predict the next movement of prices.
1991: Farmer, Packard, and McGill formed The Prediction Company with the goal of profiting from Wall Street. They developed black-box models of algorithmic trading that often worked for unknown reasons but also suffered unpredictable failures. How market patterns get corrected is still a mystery.
1997: Didier Sornette, a geophysicist, studied the patterns of complex systems to predict critical events in the physical and social sciences. On September 17, 1997, he filed a patent notice predicting a market crash the following month, then bought far-out-of-the-money put options and earned a 400% profit from his prediction.
2008: The economic collapse of 2008 presented an opportunity to change how economists think about the world.
2009: Smolin, Weinstein, and others convened a conference of intellectuals to develop new models of economics, but the group failed to agree on either the problems or the solutions.
All of the physicists’ models had successes and failures, but their work represents steps in the evolution of understanding markets. Financial modeling is an evolutionary process in which sound assumptions can be invalidated by a change in market conditions. The realistic goal is a model that gives a good answer at the moment, because markets keep evolving in response to economic growth, regulation, and innovation. Models ultimately fail!