Americans are becoming increasingly concerned about how their dietary choices impact their health. Sixty-four percent are paying more attention to nutritional recommendations than they were five years ago. They turn to doctors, food labels and government literature for help—but above all, they learn about nutrition from the media.
Why the confusion? Evidence suggests that widespread industry funding of nutrition studies leads to dubious findings: an estimated 90 percent provide outcomes that align with their funders' interests. When General Mills pays for a study that concludes eating sugary breakfast cereal reduces belly fat, readers have to stop and scratch their heads.
Meanwhile, the American Society for Nutrition, a prominent nonprofit research group, has financial ties to junk food giants including Kraft Foods, Hershey's, and PepsiCo.
Hurdles for Researchers
However, even when research is free of bias, nutritional epidemiology studies are challenging to execute. Compounds in food interact with one another and are difficult to isolate, so determining the exact effects of, say, saturated fat is much more complicated than it sounds.
Moreover, most nutrition studies analyze diets retrospectively, a method that can lead to error. When researchers ask participants to self-report their eating habits, people often lie about their behavior or simply misremember.
But this type of research is relatively weak by scientific standards. A more trusted method, the randomized controlled trial, requires researchers to divide participants into groups and assign a different diet to each.
For example, one group may avoid alcohol while another is instructed to drink one glass of wine per day. Over a period of months or years, researchers would measure health outcomes such as liver and heart function across both groups.
Controlled dietary studies are among the most difficult to execute. Researchers can't monitor subjects 24/7, and reporting is still an issue: in the alcohol example, adults who overindulge may be too embarrassed to admit they broke the rules.
The biggest barrier to obtaining conclusive evidence is the ethical dilemma inherent to controlled nutritional trials. If a study suggests that sugar consumption leads to diabetes, do we attempt to replicate it knowing that participants may be sick by the end?
Choosing What To Believe
Despite the challenges of nutritional research, people are generally trusting when it comes to new research that validates their current behavior. Dietary decisions, in particular, become second nature, given that people reinforce them multiple times each day.
Consider a vegan who repeatedly avoids meat, eggs and dairy, each time reminding herself of the environmental, social or nutritional reasons she chose to go vegan in the first place. This constant reinforcement makes it easier for her to accept new research that validates the decisions she's already made.
In contrast, many Americans are skeptical about other types of science—especially politically charged topics such as climate change, vaccines and GMOs. Compared with food, most people have limited personal experience with these topics, making them easier to write off.
So how do people justify accepting some studies so wholeheartedly while dismissing others?
For the science-literate, the choice of what to believe seems to be a rational one. This group, which includes researchers and laypeople who understand and respect the scientific process, tends to be skeptical of most research until it's replicated. They may also put more stock in trusted institutions such as Harvard and journals such as Nature, though even these studies can be retracted.
Other people rely on emotion and the influence of peers. Imagine that your sister, having recently learned that her son is autistic, is fearful, angry and searching for someone to blame. She launches an attack on vaccines after doing some reading online, furious that California parents are now required to vaccinate their children before enrolling them in public school. You're not a scientist—and your sister is really upset—so you figure she's probably onto something.
The internet makes this line of reasoning particularly dangerous. Whereas scientific journals and major news organizations previously served as gatekeepers for research, filtering out poorly executed studies, new findings are now shared online at an alarming rate. Often, context is limited and the information is published without commentary from experts in the field.
The result is that readers can track down obscure studies that seem to validate their preconceived notions of the world—that vaccines are dangerous, climate change is a hoax and Lucky Charms are healthy.
Challenging our own beliefs, nutritional or otherwise, causes cognitive dissonance, an uncomfortable state of mind we experience when grappling with two opposing ideas. Someone who eats sugary cereal for breakfast every day, upon learning that sugar may be making them fat, faces an internal conflict. Accept the study and give up Lucky Charms, or dismiss the study and keep enjoying a "magically delicious" breakfast.
The truth is, most people aren't willing to undergo discomfort voluntarily, even something as minor as switching to a healthy green smoothie for breakfast, when the potential benefit is small or far off in time. They might accept new information if the research promises weight loss by next week, but waiting months or years is daunting, so rejecting the evidence is easier.
Empirical thinkers, frustrated that so many cherry-pick scientific studies to reinforce their preconceived notions, may feel compelled to intervene. But attempts to sway non-believers with additional facts and figures seem only to alienate them further. Psychologists call this the "backfire effect": people not only believe the first thing they hear but vigorously defend it in the face of corrective information.
When the scientific consensus becomes undeniable, public opinion tends to follow. After all, only a few people still think the earth is flat.