
6 Surprising Mathematical Back Stories

by Robert Black

Where do mathematicians get their ideas? How do they choose the problems they work on? Do they spend all their time in front of a chalkboard or a whiteboard or a computer screen until they think something up?

In fact, mathematicians get their ideas from all kinds of places. Some simply enjoy exploring different patterns and finding new relationships between numbers. Others take the work of other mathematicians and look for new ways to apply it or for variations that haven’t been considered. Still others are inspired by specific events or by someone or something in their lives that draws their interest. New mathematical breakthroughs can have some surprising origins.

Here are six fascinating mathematical back stories. (You can read about all of them in greater detail in our Mathematical Lives series.)

1. Probability theory was invented when a French aristocrat asked a friend for gambling advice. Specifically, the aristocrat was Antoine Gombaud, who wrote about proper behaviors and lifestyles under the pen name “Chevalier de Méré,” and the friend was Blaise Pascal, who by age thirty had already proved the existence of a vacuum and invented one of the first mechanical calculators. De Méré asked Pascal to solve a problem that had been debated for centuries: If two people are playing a game in which the winner is the first to reach a certain number of points, but the game is interrupted before either of them has won, what’s the fairest way to divide up the prize money? To solve it, Pascal (with some help from Pierre de Fermat) invented an entirely new field of mathematics—one that affects all our lives today, even people who don’t gamble.
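
This question became known as the problem of points, and Pascal and Fermat’s insight was to split the stakes according to each player’s chance of winning had the game been played out. Here is a minimal Python sketch of that logic; the fair 50/50 rounds and the interrupted 2–1 score are illustrative assumptions, not the details of de Méré’s actual game:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def share(a, b):
    """Chance that the first player wins the match when they still need
    `a` points and the opponent still needs `b`, each round a fair toss."""
    if a == 0:
        return 1.0  # the first player has already won
    if b == 0:
        return 0.0  # the opponent has already won
    # Average the two possible futures of the next round.
    return 0.5 * share(a - 1, b) + 0.5 * share(a, b - 1)

# Example: first to 3 points wins, game interrupted at 2 points to 1.
# The leader needs 1 more point; the trailer needs 2.
print(share(1, 2))  # 0.75 -> the leader should get three-quarters of the pot
```

The recursion simply averages the two possible outcomes of the next round, which is the expected-value reasoning at the heart of probability theory.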

2. Florence Nightingale became a pioneer of statistics when she set out to prove the necessity of improving hygiene in hospitals. Nightingale was England’s biggest hero of the Crimean War, but when she came home, all she could think about were the dead soldiers she had left behind. Disease had killed far more of them than the fighting had, and Nightingale was convinced that poor sanitary conditions in the hospitals had been the cause. But how could she prove it, and how could she convince the government of the need to improve sanitation systems in the army and for the general public? She found her answer in statistics. But at that time, statistical reports were typically found in large, dreary volumes called Blue Books, written only for government eyes. Nightingale wanted to make her case to a wider audience, and she did so by presenting her numbers in the form of diagrams, some of which she designed herself. Today, when you see a chart or a graph explaining some kind of statistical data, you have pioneers like Florence Nightingale to thank for it. Nursing wasn’t the only field in which she was a pioneer.
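
Nightingale’s most famous design is what we now call a polar area diagram (nicknamed the “coxcomb”), in which the area of each wedge represents a monthly count. Here is a rough matplotlib sketch of the idea, using invented figures rather than her actual Crimean data:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical monthly death counts -- illustrative only, not
# Nightingale's actual Crimean War figures.
months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep",
          "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
deaths = [110, 95, 140, 230, 310, 470, 520, 610, 680, 590, 410, 220]

angles = np.linspace(0, 2 * np.pi, len(months), endpoint=False)
# In a polar area diagram the *area* of each wedge encodes the count,
# so the radius is the square root of the value.
radii = np.sqrt(deaths)

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.bar(angles, radii, width=2 * np.pi / len(months), align="edge",
       edgecolor="black")
ax.set_xticks(angles)
ax.set_xticklabels(months)
ax.set_yticklabels([])  # the areas, not the radii, carry the meaning
plt.show()
```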

3. The first (and only) African American mathematician in the National Academy of Sciences analyzed duels to help the U.S. win the Cold War. In the years following World War II, the U.S. military established the RAND Corporation as a think tank for imagining the new weapons and strategies needed for the Cold War. One regular summer visitor was David Blackwell, Math Department Chair at Howard University, the country’s premier historically Black institution. At RAND, Blackwell became known for studying the mathematics of duels, analyzing them in ways that could be used to model larger forms of combat. RAND was also where he first heard about a statistical method called Bayesian inference. Years later, as a professor at the University of California, Berkeley, Blackwell wrote the first textbook to teach statistics using the Bayesian method. For his work in several different mathematical fields, many of which have become important in computer science and machine learning, Blackwell became the first African American member of the National Academy of Sciences, and to this day he is still its only Black mathematician.
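
For readers curious about the method itself: Bayesian inference updates a prior belief with observed evidence via Bayes’ theorem. A toy Python illustration, with numbers invented for the occasion:

```python
import numpy as np

# Toy Bayesian update: estimate a coin's bias from observed flips.
# The flat prior and the 7-heads, 3-tails data are invented for illustration.
grid = np.linspace(0.0, 1.0, 101)       # candidate values for P(heads)
prior = np.ones_like(grid) / len(grid)  # start with no preference

heads, tails = 7, 3
likelihood = grid**heads * (1 - grid)**tails

posterior = prior * likelihood          # Bayes' theorem, up to a constant...
posterior /= posterior.sum()            # ...then normalize

print("posterior mean for P(heads):", (grid * posterior).sum())  # ~0.67
```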

4. The first design for a computer was steam-powered, and the first program was designed by a woman. In the early 1800s, if you wanted to find the values for logarithms, trigonometric functions, square roots, and other functions, you needed to look them up in a book of mathematical tables. Creating these tables was a laborious process that required a number of skilled mathematicians. But in 1822, Charles Babbage began working on a machine to calculate values automatically, which he called the Difference Engine. About a decade later, he began designing a more powerful machine, called the Analytical Engine, which would have been a steam-powered mechanical computer.

By 1843, the Italian engineer Luigi Menabrea had published a detailed description of Babbage’s still-unbuilt design. A London journal wanted to translate the article into English, but Babbage was sulking after an argument with the Prime Minister over funding. So the editor turned to Babbage’s friend Ada Lovelace—and got more than he ever expected. In addition to the translation, Lovelace added a series of her own explanations and comments, including an outline for the first computer program and the first exploration of what computers could and couldn’t do.
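
For the curious, the Difference Engine’s core trick was the method of finite differences: for any polynomial, taking differences between successive table entries enough times yields a constant, so an entire table can be cranked out using nothing but addition, which is exactly what gears are good at. A short Python sketch of the idea (the sample polynomial is an arbitrary choice):

```python
def tabulate(initial_diffs, steps):
    """Tabulate a polynomial using only additions. `initial_diffs` holds
    the starting value followed by its successive finite differences."""
    diffs = list(initial_diffs)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # Each difference absorbs the one below it -- pure addition.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Example: f(x) = x**2 + x + 41 at x = 0, 1, 2, ...
# f(0) = 41, the first difference is 2, and the second is a constant 2.
print(tabulate([41, 2, 2], 6))  # [41, 43, 47, 53, 61, 71]
```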

5. Computer graphics and animation were revolutionized by the discovery of fractal geometry. To many of his colleagues at IBM’s Watson Research Center, it looked like Benoit Mandelbrot’s career had no direction or focus. For a time he was researching statistical economic models, even spending a year teaching economics at Harvard. Next he moved on to communications, developing a new method to simulate telephone signal errors that programmers could test against. And after that he moved on to geography, finding a relationship between the measured length of a coastline or national border and the size of the scale used to measure it. But Mandelbrot had seen something his colleagues had missed: all of the areas he studied had the same type of mathematical relationship at their root—a scaling law with a fractional exponent, the kind of relationship he would come to describe in terms of fractional dimensions. Far from being aimless, Mandelbrot was developing a new field of mathematics, which he called “fractal geometry.” Today, fractal geometry has revolutionized computer graphics and animation, antenna technology, and much more.
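
The signature image of the field is the set that now bears Mandelbrot’s name, drawn by repeatedly squaring a complex number, adding a constant, and asking whether the result stays bounded. A bare-bones escape-time sketch in Python (the window, resolution, and iteration cap are arbitrary choices):

```python
def in_mandelbrot(c, max_iter=50):
    """Escape-time test: iterate z -> z*z + c from z = 0 and report
    whether the orbit stays bounded for max_iter steps."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| passes 2, the orbit escapes forever
            return False
    return True

# Crude ASCII rendering; the window and step sizes are arbitrary.
for row in range(21):
    y = 1.2 - row * 0.12
    print("".join("#" if in_mandelbrot(complex(-2.0 + col * 0.05, y)) else " "
                  for col in range(64)))
```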

6. The discovery of “The Butterfly Effect,” one of the twentieth century’s most important concepts, happened by accident. In 1960, MIT professor Edward Lorenz was trying to develop a computerized weather simulation so that he could test a statistical prediction method against it. While preparing for a conference, he wanted to rerun one of his simulations. Rather than wait for the entire calculation to run, he re-entered the results from a point in the middle of the previous run and used that as his new starting point. But when he came back from a coffee break an hour later, he found the computer producing completely different results from the time before. The new results had a surprising source. When Lorenz had re-entered numbers from the previous run, he was reading from a printout that had rounded them to three decimal places. The real numbers, held in the computer’s memory, had six decimal places. So what Lorenz had entered as 0.506, for example, had originally been 0.506127, and that tiny difference was enough to change the entire outcome. It was the first recorded example of “The Butterfly Effect”—and the beginning of the new field of chaos theory.
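
You can recreate the spirit of Lorenz’s accident in a few lines. The sketch below applies a crude Euler integration to the three-variable system Lorenz later distilled from his weather model, using its standard parameters, and starts two runs that differ only by the digits his printout rounded away; the step size and run length are arbitrary choices:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One crude Euler step of the Lorenz system, standard parameters."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

# Two runs whose starting x-values differ by 0.000127 -- an echo of the
# rounded printout (0.506) versus the computer's memory (0.506127).
a = (0.506127, 1.0, 1.0)
b = (0.506, 1.0, 1.0)
for step in range(3001):
    if step % 500 == 0:
        print(f"t={step * 0.01:5.1f}  x_a={a[0]:9.4f}  x_b={b[0]:9.4f}")
    a = lorenz_step(a)
    b = lorenz_step(b)
```

The first rows of output are nearly identical; the later ones bear no resemblance to each other, just as Lorenz saw.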


Robert Black grew up in Indianapolis, where his parents were both high school math teachers. He attended Park Tudor School in Indianapolis (where his parents taught) from kindergarten through high school. He graduated from Vanderbilt University with a bachelor’s degree in mechanical engineering and mathematics. He has been writing for children since the mid-1980s, when he worked on the Nickelodeon TV series You Can’t Do That on Television. He currently works as a quality systems manager in California. He is the author of a series of biographies of mathematicians called Mathematical Lives.
