Top 10 Scientific Theories That Were Certain (And Then Weren’t)

by A.C. Lura
fact checked by Jamie Frater

We often think of science as absolute: something either is or it isn’t, and scientific study gives us the proof to back up that stark black and white. The reality, though, is that the universe as measured through science is far more complex and unusual. Sometimes the facts simply refuse to be consistent. Sometimes the results of one study directly or indirectly contradict the findings of another. This is a collection of such scientific contradictions.


This is probably also a good time to remind people of the fallacy of argumentum ad populum: the notion that because a majority believes something, it must be true. “Scientific consensus” is often presented as a kind of proof for something, when it is nothing of the sort. In fact, when a person uses the “consensus” argument, you should scrutinize everything they say as potentially questionable, because a person using one fallacy to “prove” a point is very possibly clinging to many other fallacies as well.

10 Beer: Health Food and Poison


There’s nothing we love more than to hear that our vices are actually virtues. What makes for a more sensational headline than science proving beer is good for us and that we should drink more of it? Thankfully, this isn’t just limited to headlines. There is very real science suggesting various health benefits linked to the world’s favorite hoppy drink.

One such example was published in the International Journal of Endocrinology, where evidence of a link between silicon and bone strength was recorded. The theory presented was that silicon dioxide (SiO2) helps with the body’s ability to calcify. Whatever the exact mechanism, rats with a healthy supply of silicon had improved calcium incorporation in their bones compared to rats that were deficient in it. Silicon is found in foods such as grains, cereals, green beans, and, of course, beer. In short, beer makes bones stronger.[1]

Silicon aside, beer has other useful chemical properties. A paper published in Mutation Research/Fundamental and Molecular Mechanisms of Mutagenesis outlines the effects of xanthohumol, another compound present in beer. What makes this worthwhile is that xanthohumol has demonstrably been shown to protect the liver and colon from cancer-causing[2] mutagens found in cooked food.[3] In other words, beer helps fight cancer.

Other studies of beer found evidence that moderate amounts reduce inflammation,[4] help prevent kidney stones,[5] and that the same silicon from beer also helps fend off Alzheimer’s disease.[6] One would think beer is the perfect health food, except...

In 2018 an exhaustive study was published on the effects of alcohol on the health of individuals and entire populations. It included 500 collaborators from 40 different nations and 694 different sources of data covering huge swaths of Earth’s total population. The study concluded that, despite the limited health benefits discussed above, 3 million people died from alcohol-related health problems in 2016 alone. In fact, for males between the ages of 15 and 49, alcohol was responsible for 12 percent of all deaths. Taking all people together as one group, alcohol was the seventh-largest cause of death in the world.[7]

The data was summarized by its senior author, Dr. Emmanuela Gakidou, who said, “The health risks associated with alcohol are massive. Our findings are consistent with other recent research, which found clear and convincing correlations between drinking and premature death, cancer, and cardiovascular problems.”

What amount of alcohol would be safe? She concluded, “Zero alcohol consumption minimizes the overall risk of health loss.” In other words, there is no amount of alcohol that will not increase your chance of premature death.[8]

9 Coffee Will Both Give You Glaucoma and Protect You from It


A research study published in the Journal of Agricultural and Food Chemistry showed that one of the main components of raw coffee, chlorogenic acid, shields eyes against the retinal degeneration caused by glaucoma, aging, or diabetes. This protection can slow eyesight deterioration and even prevent blindness. To discover this, researchers exposed mice’s eyes to nitric oxide, which causes retinal degeneration; mice treated with chlorogenic acid suffered no ill effects, unlike the mice given no prior treatment.[9]

The chair of the American Osteopathic Association, Dr. Robert Bittel, said of this study, “As with any study that cites commonly used food items as therapeutic in some way, caution has to be taken so that the public understands the negative as well as the positive potential implications of drinking coffee.”[10]

While coffee is shown to protect against the effects of glaucoma, it also has the dubious side effect of increasing the likelihood of developing the disease in the first place among certain groups of people. Thankfully, for most of us the increase is not statistically significant, but one study published in Graefe’s Archive for Clinical and Experimental Ophthalmology showed that for those who had already developed glaucoma, drinking coffee made the condition worse.[11] Yet another set of research showed that women with a family history of glaucoma (but who had not yet developed it themselves) had an increased risk of getting the disease if they were coffee drinkers.[12]

For some people then, coffee is both the poison and the cure.


8 Stretching Before Exercise Either Hurts Performance or Does Nothing


For years, stretching before exercise was a given. Everyone acknowledged the benefits, and it was even taught as part of physical education in schools; however, there was little to no research to back up those supposed benefits. When research finally was conducted on the matter, the common knowledge was upended. For example, one study had two groups of trained athletes run one mile on three different occasions. One group performed a series of six different lower-body static stretches beforehand, while the other group sat without stretching. The group that did not stretch finished their mile significantly faster (about half a minute sooner on average than the stretching group). The study concluded, “Study findings indicate that static stretching decreases performance in short endurance bouts…Coaches and athletes may be at risk for decreased performance after a static stretching bout. Therefore, static stretching should be avoided before a short endurance bout.”[13]

Meanwhile, another study, published in Medicine & Science in Sports & Exercise, sought to learn the effects of stretching on more than just running. This study had 20 participants perform a well-rounded series of stretches and warm-ups covering seven lower-body and two upper-body regions, alongside a control group. Afterward, these athletes were put through a battery of testing exercises that measured flexibility, run times, vertical jumps, and even the ability to pivot directions. The study concluded that the stretching had no measurable impact on the athletic ability of any of the participants.

Interestingly enough, though, it did have a mental effect. Those who stretched believed it would improve their performance significantly compared to not stretching, but besides boosting their confidence, the stretches did no such thing. According to this study, then, you can still stretch to feel good, but don’t expect it to give you any edge.[14]

7 Picking Your Nose Is Harmful, Eating Your Boogers Is Healthy


Though widely considered a disgusting habit, more of us pick our noses than we might like to admit. A quick survey of 200 Indian teenagers showed that literally all of them commonly participated in rhinotillexomania (the medical word for nose picking).[15] But this is concerning for more reasons than just social protocol. A study published in Infection Control & Hospital Epidemiology tested and questioned 238 healthy patients and 86 hospital employees about their nose-picking habits. The tests showed that frequent nose pickers had an increased presence of the dangerous bacterium Staphylococcus aureus in their nasal passages.

Though about 30% of the population carry the staph bacterium with them, usually with no ill effects, if a wound[16] allows the bacteria into the body, it can cause potentially fatal infections.[17] This study shows nose picking is bad for you because it increases the chance of contracting one of these dangerous infections.

But what if we don’t stop at picking our noses? Imagine if we put what we find to good use. One study, titled “Salivary Mucins Protect Surfaces from Colonization by Cariogenic Bacteria,” showed the positive impact of mucins throughout the body. This mucus helps, among other things, to protect the surface of our teeth from the myriad attacking bacteria that strive to dismantle them. Where in our body is there a healthy supply of salivary mucins? In our dried nasal mucus, a.k.a. boogers.[18] Not only does this mucus protect our teeth if eaten, but there’s also evidence to suggest it may help prevent respiratory infections, stomach ulcers, and HIV.[19]

An Austrian lung specialist named Friedrich Bischinger commented on these findings, saying, “In terms of the immune system, the nose is a filter in which a great deal of bacteria are collected, and when this mixture arrives in the intestines, it works just like a medicine.”

Whether or not these benefits outweigh the increased risk of staph infections is up to you, but the author of the mucus-eating study also suggests that we could make an artificial version of salivary mucus to get similar benefits.[20] Eventually, we could have our boogers and eat them too.


6 Chocolate Is a Miracle Food That Ruins Your Health


Chocolate is a worldwide favorite, with some 72 million metric tons of it consumed every year.[21] It’s no surprise, then, how well studied this food is and how often we hear reports about its medical benefits. When browsing scientific papers, there seems to be no end to the health benefits of chocolate.

Some studies that have examined this sweet treat concluded that it can help fend off cardiometabolic disorders and cardiovascular disease,[22] improve cognitive function in older adults,[23] lower blood pressure,[24] and even protect your skin from UV-induced erythema.[25]

One study even showed chocolate slowing colon cancer in rats![26] Chocolate has dozens of minor health benefits to enjoy.

However, alongside chocolate’s many tiny health benefits, there are substantial consequences to eating it. Most notable is its high sugar and fat content, which can lead to obesity. One study concluded that for postmenopausal women, each additional 1oz of chocolate eaten per week was associated with an extra 1kg of weight gained over a period of three years; the more chocolate consumed, the higher the three-year weight gain.[27] This is concerning because obesity can result in diabetes, hypertension, heart disease, respiratory disorders, cancer, cerebrovascular disease, stroke, and many other conditions.[28]

Meanwhile, rats may avoid colon cancer by eating chocolate, but in humans, eating chocolate seems to be linked with higher chances of prostate cancer.[29] Like so many foods, chocolate is both good for us and bad for us in different ways.

Alice H. Lichtenstein, professor of nutrition science and policy at Tufts University in Boston, summed it up nicely when she said, “If you enjoy chocolate the important thing to do is choose the type you enjoy the most and eat it in moderation because you like it, not because you think it is good for you.”[30]


5 Self-Control Can and Can’t Be Depleted


Ego depletion is a concept in psychology that has been widely accepted and tested. The theory suggests that self-control is a resource that can be stockpiled as well as depleted.[31] The study that first proposed it had students participate in multiple tasks requiring self-control. First, they were shown two foods: radishes and chocolate cookies. The paper notes that the “Chocolate chip cookies were baked in the room in a small oven, and, as a result, the laboratory was filled with the delicious aroma of fresh chocolate and baking.” Some participants were instructed to eat only the radishes, some were told to eat only the cookies, and another group was shown no food at all. The group told to eat only radishes had to exercise self-control so as not to eat the cookies. Afterward, the subjects were given an impossible puzzle to solve (though they weren’t told it was impossible) and a bell to signal the researchers if and when they wanted to give up. It takes self-control to continue working on a task that produces no positive results.

Ultimately, the study showed that the group that first had to exercise self-control, by eating only radishes and not the tasty-looking cookies, also gave up sooner on the impossible puzzle. The conclusion reached was that their self-control had been slightly depleted by the first test, leaving them less self-control to use in the second.[32] The same concept has been replicated with different factors: some labs found self-control could be reduced by forcing people to make purchasing decisions or to discuss racial politics with someone of a different race. Some labs even tested for ego depletion in dogs and found it.[33]

On the other hand, a more recent study chose to definitively test ego depletion and sought a task requiring self-control that wasn’t affected by things like personal taste (after all, what if someone hates chocolate chip cookies?) or culture. It involved 24 labs from Australia, Belgium, Canada, France, Germany, Indonesia, the Netherlands, New Zealand, Sweden, Switzerland, and the United States. Rather than cookies and radishes, it used computerized tasks that required self-control: participants played digital games demanding quick responses in which the answers weren’t immediately obvious, so they had to control their impulses and instead find the correct answer. This study found no significant depletion of performance from task to task associated with self-control.[34]


4 Red Meat Is Unhealthy. Maybe. We’re Not Sure


A rack of ribs during a summer barbecue or a hotdog during a spring baseball game: red meat is a staple of many a diet, and to some, its taste elevates it above other foods. There’s something truly satisfying about a perfectly cooked and seasoned steak, but our reverence for red meat has historically been met with caution from the scientific community. Studies have shown that processed red meats like hotdogs[35] increase the risk of glioma, a tumor that occurs in the brain and spinal cord.[36] Still other findings showed an increased risk of colorectal cancer from eating red meat.[37] Yet another negative of the food is the accumulation of trimethylamine N-oxide,[38] which is a cause of heart disease.[39] All of this led most health groups to recommend we limit our intake of red meats, especially the processed variety.

However, a recent and controversial meta-analysis (a study that examines many different studies on a topic and comes to a conclusion based on all of their results put together)[40] published in Annals of Internal Medicine argued that there was not enough scientific evidence to support the recommendations to eat less red meat. Based on this meta-analysis, the authors concluded that “the certainty of evidence for the potential adverse health outcomes associated with meat consumption was low to very low” and that “there was a very small and often trivial absolute risk reduction based on a realistic decrease of 3 servings of red or processed meat per week.”[41] Their conclusion was not that red meat is healthy, but that the proof of harm was not yet significant enough to justify limiting red meat intake for health reasons.
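
To make the idea of “putting results together” concrete, here is a minimal sketch of the fixed-effect inverse-variance method commonly used in meta-analyses: each study’s result is weighted by how precise it is, and the weighted average becomes the pooled estimate. The risk ratios and standard errors below are entirely made up for illustration; they are not figures from the Annals paper.

```python
import math

# Hypothetical inputs: (log risk ratio, standard error) for three studies.
studies = [
    (math.log(1.10), 0.08),  # made-up study A: RR 1.10, fairly precise
    (math.log(0.97), 0.12),  # made-up study B: RR 0.97, less precise
    (math.log(1.05), 0.05),  # made-up study C: RR 1.05, most precise
]

# Fixed-effect inverse-variance pooling: each study is weighted by
# 1 / SE^2, so more precise studies count for more.
weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * lrr for (lrr, _), w in zip(studies, weights)) / sum(weights)
pooled_se = 1 / math.sqrt(sum(weights))

# Back-transform from the log scale and report a 95% confidence interval.
low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(low):.2f} to {math.exp(high):.2f})")
```

Under this scheme, the pooled answer is only as good as the studies fed into it, which is one reason meta-analyses like this one can be controversial.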

3 Video Games Improve or Impair Children’s Social Skills


For gamers, it is an age-old frustration to hear that “video games rot your brain.” This sentiment has followed the industry since the first wave of video games hit the arcades, but while it was often recited, there was no evidence to back it up. With time, the scientific community was able to turn its attention to video games and study their effects on children firsthand. One paper published in Social Psychiatry and Psychiatric Epidemiology studied children ages 6 to 11. The study measured each child’s video game play time per day and compared it to data gathered from questionnaires given to their parents, teachers, and the children themselves, along with each child’s academic performance. After certain factors were accounted for, they found that higher video game play time among the children was associated with 1.75 times the odds of high intellectual functioning and 1.88 times the odds of high overall school competence, as well as fewer relationship problems with their peers.[42]
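
A quick note on reading those numbers: “1.75 times the odds” is not the same as “1.75 times as likely.” As a rough illustration, here is how an odds ratio translates into an absolute probability, using a completely made-up 20% baseline rate (not a figure from the study):

```python
# Convert a baseline probability plus an odds ratio into a new probability.
# The 20% baseline below is hypothetical, chosen only to illustrate the math.

def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    odds = baseline_prob / (1 - baseline_prob)  # probability -> odds
    new_odds = odds * odds_ratio                # scale the odds
    return new_odds / (1 + new_odds)            # odds -> probability

baseline = 0.20
print(f"{apply_odds_ratio(baseline, 1.75):.1%}")  # 30.4%, not 0.20 * 1.75 = 35%
```

So the association is real, but somewhat more modest than a naive reading of the odds ratio might suggest.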

Katherine M. Keyes, PhD, assistant professor of epidemiology at the Mailman School of Public Health, commented on the results of this study, saying, “Video game playing is often a collaborative leisure time activity for school-aged children. These results indicate that children who frequently play video games may be socially cohesive with peers and integrated into the school community. We caution against over interpretation, however, as setting limits on screen usage remains an important component of parental responsibility as an overall strategy for student success.”[43]

And she was wise to suggest parents stay involved in limiting children’s screen time, because another study also examined the effect of video games on children, this time over the course of six years, starting when they were six years old.[44] It followed 873 Norwegian schoolchildren; every two years, their parents reported how much time was spent video gaming, and teachers evaluated the children’s social competence using such factors as how well they followed directions, controlled their behavior, and showed confidence in social situations.

The results showed that poor social performance was linked to an increase in future video gaming, but that video gaming itself did not lower future social skills, except in one notable group: ten-year-old girls in the study with high levels of video game play time were found to be less socially competent at age 12 than the girls without high play time.[45] While video games are shown to be helpful for most, it isn’t a universal truth. For some they increase social skills, but for others they deplete them.


2 Early Rising Is a Blessing and a Bane


“The early bird gets the worm” is a common idiom often heard from the mouths of early risers, some of whom wake up and get productive before the sun. And what they do, they do for good reason, as one survey has shown. Published in the Journal of Applied Social Psychology, the study questioned 367 university students about their sleeping habits and proactivity. It asked them to agree or disagree with statements such as “I spend time identifying long-range goals for myself” and “I feel in charge of making things happen.”

Ultimately, the survey found that “Morning people were more proactive than evening types, and people with small differences in rise time between weekdays and free days were also more proactive persons.”[46] The author of the survey said, “When it comes to business success, morning people hold the important cards. My earlier research showed that they tend to get better grades in school, which get them into better colleges, which then lead to better job opportunities. Morning people also anticipate problems and try to minimize them, my survey showed. They’re proactive. A number of studies have linked this trait, proactivity, with better job performance, greater career success, and higher wages.”[47]

On the other side of the coin, a study published in The Journal of Clinical Endocrinology & Metabolism examined 447 men and women between the ages of 30 and 54 who worked at least 25 hours a week outside the home, and it found that their early rising did not line up with their natural circadian rhythms.[48]

Patricia M. Wong, MS, from the University of Pittsburgh said about this study, “Social jetlag refers to the mismatch between an individual’s biological circadian rhythm and their socially imposed sleep schedules. Other researchers have found that social jetlag relates to obesity and some indicators of cardiovascular function. However, this is the first study to extend upon that work and show that even among healthy, working adults who experience a less extreme range of mismatches in their sleep schedule, social jetlag can contribute to metabolic problems. These metabolic changes can contribute to the development of obesity, diabetes and cardiovascular disease.”[49]

1 Eating Eggs Does and Doesn’t Contribute to Cardiovascular Disease


Eggs are a dietary staple for much of the world. In fact, 73% of adults are considered whole-egg consumers.[50] Naturally, then, the effect eggs have on our health is a concern for many, and it has been a topic of study for the scientific community. The trouble is the information hasn’t been entirely conclusive.

Historically, one of the major health complaints against eggs has been the 185 milligrams of cholesterol in the yolk. Certain forms of cholesterol are shown to increase the risk of heart disease,[51] and science seems to support this concern. A 2019 study that monitored participants over a 17.5-year period found that each additional half egg consumed by an adult per day increased the likelihood of cardiovascular disease by 6% and even increased the overall mortality rate by 8%.[52][53]

Confoundingly, though, the science is not always consistent. On the same subject, during the same year, a separate study examined the effect of eggs on the likelihood of cardiovascular disease and found no statistically significant link.[54] An author of the study, Maria Luz Fernandez, professor of nutritional sciences at the University of Connecticut, described eggs as being high in cholesterol but low in saturated fat. The point, she said, is that “While the cholesterol in eggs is much higher than in meat and other animal products, saturated fat increases blood cholesterol. This has been demonstrated by lots of studies for many years.”

In other words, the cholesterol in eggs may not be the killer we thought.

“There are systems in place so that, for most people, dietary cholesterol isn’t a problem,” said Elizabeth Johnson, research associate professor of nutritional sciences at Tufts University.[55]
