In 1998, a computer science Ph.D. student named Larry Page filed a patent for an Internet search method based on an obscure piece of mathematics. The method, known today as PageRank, made it possible to find the most relevant web pages far more quickly and accurately than ever before. The patent, initially owned by Stanford, was sold in 2005 for shares now worth more than $1 billion. Page’s company, Google, is now worth more than $1 trillion.
It was not Page, nor Google co-founder Sergey Brin, who created the math described in the patent. The equation they used is at least 100 years old and is based on the properties of matrices, mathematical structures akin to a spreadsheet of numbers. Chinese mathematicians used similar methods more than two millennia ago. Page and Brin’s insight was that by calculating what is known as the stationary distribution of a matrix describing the connections of the world wide web, they could find the most popular sites faster than anyone else.
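A rough sketch of that calculation fits in a few lines. The four-page toy web and the 0.85 damping factor below are illustrative choices, not Google’s actual data; the idea is simply to pass each page’s score along its outgoing links until the scores stop changing:

```python
# Toy illustration of the stationary-distribution idea behind PageRank.
# The four-page "web" below is invented for this sketch.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}  # page -> pages it links to
n, d = 4, 0.85  # number of pages, standard damping factor

rank = [1.0 / n] * n
for _ in range(100):  # power iteration converges quickly on small graphs
    new = [(1 - d) / n] * n
    for page, outlinks in links.items():
        for target in outlinks:
            new[target] += d * rank[page] / len(outlinks)
    rank = new

# Pages with more (and better-connected) incoming links rank higher.
print(max(range(n), key=lambda i: rank[i]))  # index of the top-ranked page
```

Here page 3 has no incoming links, so it ends up with the lowest score, while the pages that everything else points to rise to the top.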
Applying the correct equation can suddenly solve an important practical problem and completely change the world we live in.
The PageRank story is neither the first nor the most recent example of a little-known piece of math transforming technology. In 2015, three engineers used the idea of gradient descent, which dates back to the French mathematician Augustin-Louis Cauchy in the mid-19th century, to increase the time viewers spent watching YouTube by 2,000%. Their method transformed the service from a place we went for a few fun clips into a huge consumer of our viewing time.
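Gradient descent itself can be sketched in a handful of lines. Here it minimizes a toy one-variable function; real recommendation systems apply the same downhill idea to models with millions of parameters. The step size and starting point are arbitrary choices for this sketch:

```python
# Cauchy's gradient descent on the toy function f(x) = (x - 3)^2.
def grad(x):
    return 2 * (x - 3)  # derivative of (x - 3)^2

x, step = 0.0, 0.1
for _ in range(200):
    x -= step * grad(x)  # move downhill along the gradient

print(round(x, 4))  # prints 3.0, the minimum of f
```

Each step moves a little way in the direction that decreases the function fastest, which is why the method homes in on the minimum.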
From the 1990s onwards, the financial industry has been built on variations of the diffusion equation, attributed to a variety of mathematicians, including Einstein. Professional bettors make use of logistic regression, developed by Oxford statistician Sir David Cox in the 1950s, to ensure they win at the expense of less mathematically savvy bettors.
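Logistic regression converts a number, such as the rating difference between two teams, into a probability. A minimal one-variable version, fitted by gradient descent, looks like this; the "data" below is invented for illustration, not from any real betting model:

```python
import math

# Minimal one-variable logistic regression fitted by gradient descent.
# Invented data: a rating difference and whether the favourite won.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 1, 0, 1, 1]  # 1 = favourite won

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

a, b = 0.0, 0.0  # slope and intercept of the log-odds
for _ in range(5000):
    # gradient of the log-loss with respect to a and b
    ga = sum((sigmoid(a * x + b) - y) * x for x, y in zip(xs, ys))
    gb = sum(sigmoid(a * x + b) - y for x, y in zip(xs, ys))
    a -= 0.05 * ga
    b -= 0.05 * gb

# Estimated win probability at a rating difference of 1.5
print(round(sigmoid(a * 1.5 + b), 2))
```

The fitted curve squeezes any rating difference into a number between 0 and 1, which a bettor can compare directly against the bookmaker’s odds.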
There are good reasons to hope that more multibillion-dollar equations remain to be found: generations-old mathematical results with the potential for new applications. The question is where to look for the next one.
Some candidates can be found in mathematical work from the latter part of the 20th century. One comes in the form of fractals, patterns that are self-similar, repeating themselves at many different scales, like the branches of a tree or the shape of a head of broccoli. Mathematicians developed a comprehensive theory of fractals in the 1980s, and there was some enthusiasm at the time for applications that could store data more efficiently. Interest then died away, until recently, when a small community of computer scientists began to show how mathematical fractals can produce the most amazing, weird and wonderful patterns.
Another field of mathematics still looking for a lucrative application is chaos theory, the best-known example of which is the butterfly effect: if a butterfly flaps its wings in the Amazon, we need to know about it to predict a storm in the North Atlantic. More generally, the theory tells us that to accurately predict storms (or political events), we would need to know about every little disturbance in the air across the planet. An impossible task. But chaos theory also points towards repeatable patterns. The Lorenz attractor is a model of the weather that, despite being chaotic, produces somewhat regular and recognizable patterns. Given the uncertainty of the times we live in, it may be time to revive these ideas.
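The Lorenz system itself is only three equations. This sketch integrates it with a simple Euler scheme and the classic parameter values; a serious simulation would use a more accurate integrator:

```python
# The Lorenz system with its classic parameters (sigma = 10, rho = 28,
# beta = 8/3), integrated with a basic Euler scheme. Only a sketch:
# a serious simulation would use a higher-order integrator.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
x, y, z = 1.0, 1.0, 1.0
dt = 0.01

trajectory = []
for _ in range(5000):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    trajectory.append((x, y, z))

# The path never settles down or repeats exactly, yet it stays on the
# butterfly-shaped attractor: chaotic in detail, regular in outline.
print(trajectory[-1])
```

Plotting the trajectory reveals the famous butterfly shape: unpredictable in its details, but recognizably patterned as a whole.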
Some of my own research has focused on models of self-propelled particles, which describe motions similar to those of flocks of birds and schools of fish. I now apply these models to better coordinate tactical formations in football and to explore how players can move in ways that create more space for themselves and their teammates.
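A minimal model of this kind, in the spirit of the well-known Vicsek model, has each particle move at constant speed while copying the average heading of its neighbours, plus a little noise. The box size, interaction radius and noise level below are arbitrary illustrative choices:

```python
import math
import random

# Minimal Vicsek-style flocking sketch. All parameters are arbitrary
# illustrative choices for this toy example.
random.seed(0)
N, L, R, SPEED, NOISE = 40, 10.0, 2.0, 0.1, 0.3
xs = [random.uniform(0, L) for _ in range(N)]
ys = [random.uniform(0, L) for _ in range(N)]
th = [random.uniform(-math.pi, math.pi) for _ in range(N)]

for _ in range(300):
    new_th = []
    for i in range(N):
        # mean heading of all particles within distance R (periodic box)
        sx = sy = 0.0
        for j in range(N):
            dx = (xs[j] - xs[i] + L / 2) % L - L / 2
            dy = (ys[j] - ys[i] + L / 2) % L - L / 2
            if dx * dx + dy * dy < R * R:
                sx += math.cos(th[j])
                sy += math.sin(th[j])
        new_th.append(math.atan2(sy, sx) + random.uniform(-NOISE, NOISE))
    th = new_th
    xs = [(x + SPEED * math.cos(t)) % L for x, t in zip(xs, th)]
    ys = [(y + SPEED * math.sin(t)) % L for y, t in zip(ys, th)]

# Order parameter: 1 when all particles move the same way, near 0 when
# headings are random.
order = math.hypot(sum(map(math.cos, th)), sum(map(math.sin, th))) / N
print(round(order, 2))
```

With low noise, local copying is enough to make the whole flock align, a collective pattern that no individual particle plans.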
Another related model is the current-reinforced random walk, which captures how ants build trails and how slime molds construct their transport networks. This model could take us from today’s computers, which use central processing units (CPUs) to perform calculations and separate memory chips to store information, to new forms of computing in which computation and memory are part of the same process. Like ant trails and slime mold networks, these new computers would benefit from decentralization: difficult computational problems, particularly in AI and computer vision, could be broken into smaller subproblems and solved more quickly.
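The reinforcement principle behind such walks can be sketched with a toy two-path model. The setup and the squared weights below are illustrative assumptions, much simpler than actual current-reinforced network models:

```python
import random

# Toy reinforced random walk: a stream of "ants" chooses between two
# paths, and each trip strengthens the chosen path. Squaring the
# pheromone makes reinforcement superlinear, which tends to lock in
# a single dominant trail. Setup and parameters are invented.
random.seed(1)
pheromone = [1.0, 1.0]  # strength of path A and path B

for _ in range(1000):
    w0 = pheromone[0] ** 2
    w1 = pheromone[1] ** 2
    choice = 0 if random.random() < w0 / (w0 + w1) else 1
    pheromone[choice] += 1.0  # deposit: used paths become more attractive

# With superlinear reinforcement, this share tends towards 0 or 1.
share = pheromone[0] / sum(pheromone)
print(round(share, 2))
```

The memory here lives in the trail itself rather than in any central store, which is the sense in which computation and memory become part of the same process.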
Whenever there is an innovative application of an equation, a wide range of imitations follows. The current rise of artificial intelligence is primarily driven by just two equations, gradient descent and logistic regression, put together to create what is known as a neural network. But history shows that the next great leap forward does not come from repeated use of the same mathematical trick. Instead, it comes from an entirely new idea, plucked from the more obscure pages of the math book.
The challenge of finding the next billion-dollar equation is not simply a matter of knowing every page of that book. Page identified the right problem to solve at the right time, and persuaded the more theoretically inclined Brin to help him find the math that would solve it. You don’t have to be a math whiz to put the subject to good use. You just need a sense of what equations exist and what they can and can’t do.
Mathematics still holds many hidden intellectual and financial riches. It is up to all of us to try to find them. The search for the next billion-dollar equation is on.
David Sumpter is a professor of applied mathematics at Uppsala University, Sweden, and the author of The Ten Equations that Rule the World: And How You Can Use Them Too