


10 Scientific Estimates That Missed the Mark by a Mile
Science is built on hypothesis, experimentation, and refinement, but history is full of spectacularly wrong estimates made by brilliant minds. Some were optimistic projections that underestimated the complexity of discovery, while others were overconfident declarations that turned out to be wildly incorrect.
Whether due to bad data, technological limitations, or simply a lack of knowledge at the time, these 10 scientific estimates missed the mark by a mile—and in some cases, by entire light-years.
Related: 10 Little-Known Shifts in Computer Science
10 Lord Kelvin’s Terrible Estimate for the Age of the Earth
In the late 19th century, physicist Lord Kelvin was considered one of the most brilliant minds of his time, having made major contributions to thermodynamics and engineering. When scientists debated the age of the Earth, Kelvin confidently stated that it was between 20 and 40 million years old, based on how long it would take a molten planet to cool down to its current state.
Using complex heat conduction equations, he calculated that the Earth had started as a fiery mass and slowly radiated its heat into space. Many geologists disagreed, suspecting the planet was much older, but Kelvin’s stature in the scientific community made his estimate the dominant theory. The problem was that Kelvin was missing a key piece of information: the discovery of radioactive decay.
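For readers curious what that calculation actually looked like, here is a simplified version in the same spirit (not Kelvin's exact working, and using illustrative round numbers: an initial temperature of about 2,000 K, a thermal diffusivity of roughly 1.2 × 10⁻⁶ m²/s, and a measured surface temperature gradient of about 0.036 K per meter). Treating the Earth as a solid body cooling by conduction, the surface temperature gradient falls off with the square root of time, and inverting that relationship gives an age:

\[
t \;\approx\; \frac{T_0^{\,2}}{\pi\,\kappa\,\bigl(dT/dz\bigr)^{2}}
\;\approx\; \frac{(2000\ \mathrm{K})^{2}}{\pi \times \bigl(1.2\times10^{-6}\ \mathrm{m^{2}/s}\bigr) \times \bigl(0.036\ \mathrm{K/m}\bigr)^{2}}
\;\approx\; 8\times10^{14}\ \mathrm{s} \;\approx\; 26\ \text{million years}
\]

With radioactive heating left out, the numbers really do land in the tens of millions of years, right where Kelvin put them; the arithmetic was sound, but a crucial heat source was missing.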
In 1896, while Kelvin was still defending his figures, Henri Becquerel discovered radioactivity. By the early 20th century, scientists realized that radioactive elements deep inside the Earth were constantly generating new heat, slowing the cooling process dramatically. This meant Kelvin's model was fundamentally flawed.
Radiometric dating, first applied to ancient rocks in 1907, eventually established that the Earth is about 4.5 billion years old—over 100 times older than Kelvin's most generous estimate.[1]
9 IBM Thought the World Would Only Need Five Computers
In 1943, Thomas Watson, then-chairman of IBM, reportedly made one of the most infamously bad predictions in tech history: that the world would only need maybe five computers. At the time, computers were massive machines used exclusively for military calculations and scientific research, occupying entire rooms with vacuum tubes and requiring specialized operators. The idea that companies, let alone individuals, would own a computer seemed completely absurd.
What Watson failed to predict was the explosive miniaturization of computing technology. Just four years later, the invention of the transistor in 1947 allowed computers to shrink in both size and cost. By the 1970s, companies like Apple were building personal computers that could sit on a desk, with firms like Microsoft supplying the software to run them.
By the late 1990s, computers were essential household items, and today there are over two billion personal computers worldwide, not to mention smartphones, tablets, and embedded computers in everyday objects. Whether or not Watson actually said the infamous quote, the underestimation of computing demand is one of the most spectacular miscalculations in technology history.[2]
8 The Miscalculation That Almost Made Einstein Abandon Relativity
Albert Einstein’s general theory of relativity, published in 1915, revolutionized physics by explaining how gravity affects the fabric of space and time. However, when Einstein first worked through his equations, he realized something troubling: they predicted an expanding or contracting universe rather than a static one.
At the time, nearly all scientists believed the universe was eternal and unchanging. So Einstein, doubting himself, added a mathematical “fix”—a term called the cosmological constant (Λ)—to force the universe into a stable, unmoving state.
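In modern textbook notation (this is the standard form, not a line-by-line quote of Einstein's 1917 paper), the field equations with the added term read:

\[
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\]

Tuned to just the right value, the Λ term acts like a gentle outward push that exactly balances gravity's pull and holds the model universe still; remove it, and the equations insist that the universe must either expand or contract.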
Then, in 1929, Edwin Hubble shattered the idea of a static universe by proving that galaxies were moving away from each other, meaning the universe was expanding. Einstein realized his error and allegedly called the cosmological constant his “biggest blunder.” He removed it from his equations, but ironically, decades later, physicists resurrected the cosmological constant to explain dark energy, the mysterious force driving the acceleration of the universe’s expansion.
Einstein’s original equations had been more correct than he realized. Still, because of the scientific consensus of the time, he had second-guessed himself and altered his work unnecessarily.[3]
7 The Ozone Layer Was Supposed to Take Centuries to Heal
In the 1980s, scientists discovered a gaping hole in the ozone layer over Antarctica, caused by human-made chemicals called chlorofluorocarbons (CFCs). The ozone layer, which blocks harmful UV radiation, was being eroded faster than anyone had predicted. Early environmental models warned that if CFC use continued, the hole would worsen dramatically, driving up skin cancer rates and triggering ecological disasters.
When the 1987 Montreal Protocol was signed to phase out CFCs, scientists still estimated that even with global cooperation, the ozone layer wouldn’t heal for centuries—if it ever did at all. By the early 2000s, however, satellite data showed something surprising: the ozone layer was recovering much faster than expected. The Antarctic ozone hole has been shrinking steadily due to the rapid decline in CFC emissions. By 2024, scientists estimated that it could return to pre-1980 levels by the 2060s.
The initial models had severely overestimated the long-term damage, underestimating the resilience of atmospheric chemistry and the speed at which global intervention could work. The miscalculation, while initially alarming, turned out to be one of the few instances where a scientific error resulted in unexpectedly good news.[4]
6 Early Climate Change Models Massively Underestimated Global Warming
Climate scientists began building computer models in the 1970s and early 1980s to predict how increasing CO₂ emissions would affect Earth’s climate. Many of these early models suggested that global warming would happen gradually over centuries, with plenty of time to adapt. The consensus was that natural feedback mechanisms, like cloud formation and ocean heat absorption, would moderate temperature increases and prevent runaway warming.
By the 2000s, it was clear that these early estimates were dangerously wrong. Instead of unfolding slowly, climate change accelerated far faster than expected, with record-breaking heat waves, faster ice melt, and extreme weather events happening decades earlier than predicted. In 2023 alone, global temperatures shattered previous records, with parts of the world experiencing heat indexes above 150°F (65°C)—levels considered centuries away in early climate models.
Some tipping points, like the collapse of the Greenland ice sheet, may now be irreversible. The underestimation of human-caused climate change’s speed and severity has led to delayed action, making it far harder to prevent catastrophic consequences in the near future.[5]
5 The Great Horse Manure Crisis That Never Happened
In the late 19th century, the world’s biggest cities relied on horse-drawn transportation, and with that came a major problem—mountains of manure piling up in the streets. Urban planners and public health experts estimated that by 1930, cities like New York and London would be buried under at least 9 feet (2.7 m) of horse manure, making large-scale urban life unsustainable.
Articles at the time warned of apocalyptic consequences, claiming that disease, filth, and unbearable stench would render cities unlivable. At the first International Urban Planning Conference in 1898, officials struggled to find any solution, believing the problem was too vast to fix. Then, in a twist no one expected, the internal combustion engine completely erased the problem. The invention of cars and public transit systems replaced horses so rapidly that by 1912, horse-drawn vehicles were already in sharp decline.
The once-imminent crisis vanished almost overnight, making all the doomsday predictions obsolete. Instead of drowning in horse manure, cities soon had to worry about traffic jams and smog. This was one of history’s greatest examples of a scientific miscalculation that failed to account for technological disruption, proving that sometimes human ingenuity moves faster than anticipated disasters.[6]
4 The Internet Was Supposed to Be a Niche Tool
In 1995, astronomer Clifford Stoll wrote a now-infamous article for Newsweek, confidently declaring that the internet was overhyped and would never be widely used. He dismissed predictions about online shopping, e-books, and digital communities, stating that people would always prefer newspapers, in-person shopping, and traditional libraries. He even went as far as to say, “No online database will replace your daily newspaper.”
Many experts at the time agreed with him, believing the internet would remain a specialized tool for government agencies and researchers rather than becoming a mainstream technology for the public. Stoll’s prediction turned out to be one of the most spectacularly wrong takes in tech history.
By the early 2000s, Amazon had reshaped retail, Google had replaced physical libraries for many people, and social media was rapidly taking over daily communication. By 2024, more than five billion people were using the internet regularly, while traditional newspapers and physical bookstores struggled to survive.
Stoll later admitted his mistake, calling it one of his biggest blunders, but his article remains a reminder that even the smartest people can completely misjudge the future.[7]
3 NASA’s Early Estimate of the Moon’s Surface Was Way Off
Before the Apollo 11 moon landing in 1969, scientists had wildly different ideas about what the lunar surface would be like. Some astronomers believed the moon was covered in a deep layer of fine, powdery dust, possibly making it impossible for astronauts or spacecraft to land safely. This fear was based on early telescope observations, which suggested that lunar craters were filled with soft, drifting material. Some even theorized that the moon’s surface could be a bottomless dust trap, swallowing anything that touched it.
NASA took this risk seriously, designing the lunar module’s landing pads and the astronauts’ boots to spread their weight over as wide an area as possible. But when Neil Armstrong and Buzz Aldrin stepped onto the surface, they found it firm and stable, covered in only a thin layer of fine dust. The incorrect estimate stemmed from a misunderstanding of how billions of years of micrometeorite impacts had compacted the moon’s soil.
Instead of sinking, Apollo astronauts had no trouble walking, and NASA never worried about the problem again.[8]
2 The Universe Was Supposed to Be Much Smaller
Before Edwin Hubble’s discoveries in the 1920s, astronomers believed the Milky Way was the entire universe. Many thought that the mysterious “spiral nebulae” seen in telescopes were just small gas clouds within our own galaxy rather than distant galaxies of their own. In fact, the famous astronomer Harlow Shapley publicly argued that the universe was only a few hundred thousand light-years across, based entirely on the assumption that the Milky Way was all there was.
Everything changed when Hubble, working at the Mount Wilson Observatory, found Cepheid variable stars in the Andromeda Nebula and used them to show that it lay far beyond the Milky Way, making it an entirely separate galaxy. His findings showed that the universe was vastly larger than previously believed, with billions of galaxies stretching across unfathomable distances.
This completely shattered previous estimates, forcing scientists to rethink everything they knew about space. Today, we estimate that the observable universe alone is about 93 billion light-years across—a staggering increase from Shapley’s estimate of a few hundred thousand light-years.[9]
1 The Human Genome Was Expected to Have Over 100,000 Genes
Before the Human Genome Project, geneticists estimated that humans had at least 100,000 genes coding for various proteins and bodily functions. This estimate was based on the assumption that a more complex organism must require significantly more genes than simpler life forms. Since bacteria had a few thousand genes and fruit flies had around 14,000, scientists believed humans—being vastly more complex—must carry a six-figure gene count to account for our intelligence and biological sophistication.
When the Human Genome Project was completed in 2003, researchers were stunned to find that humans only have about 20,000–25,000 genes—far fewer than anticipated. Not only do we have fewer genes than many expected, but we also have fewer genes than some simpler organisms, like certain plants and amphibians.
This shocking miscalculation forced scientists to rethink what truly makes humans unique, realizing that gene regulation, expression, and non-coding DNA play a much bigger role in biological complexity than the number of genes alone.[10]