I Fooled Millions Into Thinking Chocolate Helps Weight Loss. Here's How.

By John Bohannon

“Slim by Chocolate!” the headlines blared. A team of German researchers had found that people on a low-carb diet lost weight 10 percent faster if they ate a chocolate bar every day. It made the front page of Bild, Europe’s largest daily newspaper, just beneath their update about the Germanwings crash. From there, it ricocheted around the internet and beyond, making news in more than 20 countries and half a dozen languages. It was discussed on television news shows. It appeared in glossy print, most recently in the June issue of Shape magazine (“Why You Must Eat Chocolate Daily,” page 128). Not only does chocolate accelerate weight loss, the study found, but it leads to healthier cholesterol levels and overall increased well-being. The Bild story quotes the study’s lead author, Johannes Bohannon, Ph.D., research director of the Institute of Diet and Health: “The best part is you can buy chocolate everywhere.”
I am Johannes Bohannon, Ph.D. Well, actually my name is John, and I’m a journalist. I do have a Ph.D., but it’s in the molecular biology of bacteria, not humans. The Institute of Diet and Health? That’s nothing more than a website.
Other than those fibs, the study was 100 percent authentic. My colleagues and I recruited actual human subjects in Germany. We ran an actual clinical trial, with subjects randomly assigned to different diet regimens. And the statistically significant benefits of chocolate that we reported are based on the actual data. It was, in fact, a fairly typical study for the field of diet research. Which is to say: It was terrible science. The results are meaningless, and the health claims that the media blasted out to millions of people around the world are utterly unfounded.
Here’s how we did it.
The Setup
I got a call in December last year from a German television reporter named Peter Onneken. He and his collaborator Diana Löbl were working on a documentary film about the junk-science diet industry. They wanted me to help demonstrate just how easy it is to turn bad science into the big headlines behind diet fads. And Onneken wanted to do it gonzo style: Reveal the corruption of the diet research-media complex by taking part.

The call wasn’t a complete surprise. The year before, I had run a sting operation for Science on fee-charging open-access journals, a fast-growing and lucrative new sector of the academic publishing business. To find out how many of those publishers were keeping their promise of doing rigorous peer review, I submitted ridiculously flawed papers and counted how many rejected them. (Answer: fewer than half.)
Onneken and Löbl had everything lined up: a few thousand euros to recruit research subjects, a German doctor to run the study, and a statistician friend to massage the data. Onneken had heard about my journal sting and figured that I would know how to pull it all together and get it published. The only problem was time: The film was scheduled to air on German and French television in the late spring (it premieres next week), so we really only had a couple of months to pull this off.
Could we get something published? Probably. But beyond that? I thought it was sure to fizzle. We science journalists like to think of ourselves as more clever than the average hack. After all, we have to understand arcane scientific research well enough to explain it. And reporters without science chops would discover the study was laughably flimsy as soon as they tapped outside sources for their stories—really anyone with a science degree, let alone an actual nutrition scientist. Not to mention that a Google search yielded no trace of Johannes Bohannon or his alleged institute. Reporters on the health science beat were going to smell this a mile away. But I didn’t want to sound pessimistic. “Let’s see how far we can take this,” I said.

The Con
Onneken and Löbl wasted no time. They used Facebook to recruit subjects around Frankfurt, offering 150 euros to anyone willing to go on a diet for three weeks. They made it clear that this was part of a documentary film about dieting, but they didn’t give more detail. On a cold January morning, five men and 11 women showed up, aged 19 to 67.
Gunter Frank, a general practitioner in on the prank, ran the clinical trial. Onneken had pulled him in after reading a popular book Frank wrote railing against dietary pseudoscience. Testing bitter chocolate as a dietary supplement was his idea. When I asked him why, Frank said it was a favorite of the “whole food” fanatics. “Bitter chocolate tastes bad, therefore it must be good for you,” he said. “It’s like a religion.”

After a round of questionnaires and blood tests to ensure that no one had eating disorders, diabetes, or other illnesses that might endanger them, Frank randomly assigned the subjects to one of three diet groups. One group followed a low-carbohydrate diet. Another followed the same low-carb diet plus a daily 1.5-ounce bar of dark chocolate. And the rest, a control group, were instructed to make no changes to their current diet. They weighed themselves each morning for 21 days, and the study finished with a final round of questionnaires and blood tests.
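As an aside, the random assignment was real. The sketch below shows one simple way such an assignment could be done, a shuffle followed by a round-robin deal; the three arms and the headcount come from the article, but the procedure and the subject labels are my own illustration, not Frank’s actual method.

```python
import random

# The 16 volunteers who showed up (labels are hypothetical).
subjects = [f"subject_{i:02d}" for i in range(1, 17)]
arms = ["low-carb", "low-carb + chocolate", "control"]

rng = random.Random(42)  # fixed seed so the example is reproducible
rng.shuffle(subjects)

# Deal the shuffled subjects round-robin into the three arms.
assignment = {arm: subjects[i::3] for i, arm in enumerate(arms)}
for arm, group in assignment.items():
    print(f"{arm}: {group}")
```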
Onneken then turned to his friend Alex Droste-Haars, a financial analyst, to crunch the numbers. One beer-fueled weekend later and... jackpot! Both of the treatment groups lost about 5 pounds over the course of the study, while the control group’s average body weight fluctuated up and down around zero. But the people on the low-carb diet plus chocolate? They lost weight 10 percent faster. Not only was that difference statistically significant, but the chocolate group had better cholesterol readings and higher scores on the well-being survey.
The Hook
I know what you’re thinking. The study did show accelerated weight loss in the chocolate group—shouldn’t we trust it? Isn’t that how science works?

Here’s a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a “statistically significant” result. Our study included 18 different measurements—weight, cholesterol, sodium, blood protein levels, sleep quality, well-being, etc.—from 15 people. (One subject was dropped.) That study design is a recipe for false positives.
Think of the measurements as lottery tickets. Each one has a small chance of paying off in the form of a “significant” result that we can spin a story around and sell to the media. The more tickets you buy, the more likely you are to win. We didn’t know exactly what would pan out—the headline could have been that chocolate improves sleep or lowers blood pressure—but we knew our chances of getting at least one “statistically significant” result were pretty good.
Whenever you hear that phrase, it means that some result has a small p value. The letter p seems to have totemic power, but it’s just a way to gauge the signal-to-noise ratio in the data. The conventional cutoff for being “significant” is 0.05, which means that if there were no real effect, a fluctuation this big would show up by chance only 5 percent of the time. The more lottery tickets you buy, the better your chances of hitting a false positive. So how many tickets do you need to buy?
P(winning) = 1 - (1 - p)^n
With our 18 measurements, we had a 60 percent chance of getting some “significant” result with p < 0.05. (The measurements weren’t independent, so the exact odds were somewhat different.) The game was stacked in our favor.
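To make that arithmetic concrete, here is a minimal Python sketch. The inputs, 18 measurements at a 0.05 cutoff, come straight from the study; the simulation assumes the tests are independent, which, as noted above, ours weren’t quite.

```python
import numpy as np

p_cutoff = 0.05   # the conventional significance threshold
n_tests = 18      # the number of things we measured

# Closed form: the chance that at least one of n independent null
# tests comes up "significant" by luck alone.
p_win = 1 - (1 - p_cutoff) ** n_tests
print(f"closed form: {p_win:.3f}")   # ~0.603

# Sanity check by simulation: under the null hypothesis, p-values are
# uniform on [0, 1], so draw 18 of them many times over and count how
# often the smallest one clears the cutoff.
rng = np.random.default_rng(0)
p_values = rng.uniform(size=(100_000, n_tests))
print(f"simulated:   {(p_values.min(axis=1) < p_cutoff).mean():.3f}")
```

Both numbers land near 0.60: buy 18 tickets, and you should expect to win.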
It’s called p-hacking—fiddling with your experimental design and data to push p under 0.05—and it’s a big problem. Most scientists are honest and do it unconsciously. They get negative results, convince themselves they goofed, and repeat the experiment until it “works.” Or they drop “outlier” data points.
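A toy simulation shows how much damage this does. The sketch below is not our study’s procedure, just an illustration of one classic p-hack: peeking at the data after every new subject and stopping the moment the test turns significant, on data that contain no real effect at all.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_experiments = 10_000
hits = 0

for _ in range(n_experiments):
    # Pure noise: by construction, there is nothing real to find.
    data = list(rng.normal(size=10))
    significant = False
    # Re-run the test after every additional subject, stopping
    # at the first p < 0.05. This is the hack.
    while len(data) <= 30:
        if stats.ttest_1samp(data, popmean=0.0).pvalue < 0.05:
            significant = True
            break
        data.append(rng.normal())
    hits += significant

# An honest fixed-size experiment would be "significant" about
# 5 percent of the time; with peeking, the rate lands far above that.
print(f"false-positive rate with peeking: {hits / n_experiments:.2f}")
```

Each individual test is perfectly legitimate; it’s the freedom to keep testing that quietly rigs the game.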

But even if we had been careful to avoid p-hacking, our study was doomed by the tiny number of subjects, which amplifies the effects of uncontrolled factors. Just to take one example: A woman’s weight can fluctuate as much as 5 pounds over the course of her menstrual cycle, far greater than the weight difference between our chocolate and low-carb groups. Which is why you need to use a large number of people, and balance age and gender across treatment groups. (We didn’t bother.)
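You can watch this play out in a few lines of code. The sketch below compares two groups of five people who are, by construction, on identical diets, so any gap between them is pure noise. The group size is roughly ours; the 2.5-pound fluctuation per person is an assumption, in the spirit of the 5-pound swing mentioned above.

```python
import numpy as np

rng = np.random.default_rng(2)
n_per_group = 5     # roughly our group sizes after the dropout
noise_sd = 2.5      # assumed lbs of individual fluctuation (illustrative)
n_trials = 100_000

# Two groups on the *same* diet: the true difference is exactly zero.
a = rng.normal(0.0, noise_sd, size=(n_trials, n_per_group)).mean(axis=1)
b = rng.normal(0.0, noise_sd, size=(n_trials, n_per_group)).mean(axis=1)
gap = np.abs(a - b)

print(f"typical spurious gap between identical groups: {gap.mean():.2f} lbs")
print(f"chance of a spurious gap of 1 lb or more: {(gap >= 1.0).mean():.0%}")
```

Under these assumptions, the average spurious gap is over a pound, while our reported chocolate edge, 10 percent faster loss on roughly 5 pounds, is smaller than that.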
You might as well read tea leaves as try to interpret our results. Chocolate may be a weight-loss accelerator, or it could be the opposite. You can’t even trust the weight loss that our non-chocolate low-carb group experienced versus the control group. Who knows what the handful of people in the control group were eating? We didn’t even ask them.
Luckily, scientists are getting wise to these problems. Some journals are trying to phase out p-value significance testing altogether to nudge scientists into better habits. And almost no one takes studies with fewer than 30 subjects seriously anymore. Editors of reputable journals reject them out of hand before sending them to peer reviewers. But there are plenty of journals that care more about money than reputation.
The Inside Man
It was time to share our scientific breakthrough with the world. We needed to get our study published pronto, but since it was such bad science, we needed to skip peer review altogether. Conveniently, there are lists of fake journal publishers. (This is my list, and here’s another.) Since time was tight, I simultaneously submitted our paper—“Chocolate with high cocoa content as a weight-loss accelerator”—to 20 journals. Then we crossed our fingers and waited.

Our paper was accepted for publication by multiple journals within 24 hours. Needless to say, we faced no peer review at all. The eager suitor we ultimately chose was the International Archives of Medicine. It used to be run by the giant publisher BioMedCentral, but recently changed hands. The new publisher’s CEO, Carlos Vasquez, emailed Johannes to let him know that we had produced an “outstanding manuscript,” and that for just 600 euros it “could be accepted directly in our premier journal.”
Although the Archives’ editor claims that “all articles submitted to the journal are reviewed in a rigorous way,” our paper was published less than two weeks after Onneken’s credit card was charged. Not a single word was changed.
The Marks
With the paper out, it was time to make some noise. I called a friend of a friend who works in scientific PR. She walked me through some of the dirty tricks for grabbing headlines. It was eerie to hear the other side of something I experience every day.
The key is to exploit journalists’ incredible laziness. If you lay out the information just right, you can shape the story that emerges in the media almost as if you were writing those stories yourself. In fact, that’s literally what you’re doing, since many reporters just copied and pasted our text.
Take a look at the press release I cooked up. It has everything. In reporter lingo: a sexy lede, a clear nut graf, some punchy quotes, and a kicker. And there’s no need to even read the scientific paper because the key details are already boiled down. I took special care to keep it accurate. Rather than tricking journalists, the goal was to lure them with a completely typical press release about a research paper. (Of course, what’s missing is the number of subjects and the minuscule weight differences between the groups.)

But a good press release isn’t enough. Reporters are also hungry for “art,” something pretty to show their readers. So Onneken and Löbl shot some promotional video clips and commissioned freelance artists to write an acoustic ballad and even a rap about chocolate and weight loss. (It turns out you can hire people on the internet to do nearly anything.)
Onneken wrote a German press release and reached out directly to German media outlets. The promise of an “exclusive” story is very tempting, even if it’s fake. Then he blasted the German press release out on a wire service based in Austria, and the English one went out on NewsWire. There was no quality control. That was left to the reporters.
I felt a queasy mixture of pride and disgust as our lure zinged out into the world.
The Score
We landed big fish before we even knew they were biting. Bild rushed their story out—“Those who eat chocolate stay slim!”—without contacting me at all. Soon we were in the Daily Star, the Irish Examiner, Cosmopolitan’s German website, the Times of India, both the German and Indian sites of the Huffington Post, and even on television news in Texas and an Australian morning talk show.
When reporters contacted me at all, they asked perfunctory questions. “Why do you think chocolate accelerates weight loss? Do you have any advice for our readers?” Almost no one asked how many subjects we tested, and no one reported that number. Not a single reporter seems to have contacted an outside researcher. None is quoted.

These publications, though many command large audiences, are not exactly paragons of journalistic virtue. So it’s not surprising that they would simply grab a bit of digital chum for the headline, harvest the pageviews, and move on. But even the supposedly rigorous outlets that picked up the study failed to spot the holes.
Shape magazine’s reporting on our study—turn to page 128 in the June issue—employed the services of a fact-checker, but it was just as lackadaisical. All the checker did was run a couple of sentences by me for accuracy and check the spelling of my name. The coverage went so far as to specify the appropriate cocoa content for weight-loss-inducing chocolate (81 percent) and even mentioned two specific brands (“available in grocery stores and at amazon.com”).
Some dodged the bullet. A reporter from Men’s Health interviewed me by email, asking the same sort of non-probing questions. She said that the story was slated for their September issue, so we’ll never know.
But most disappointing? No one dipped into our buffet of chocolate music videos. Instead, they used vaguely pornographic images of women eating chocolate. Perhaps this music will take on a life of its own now that the truth is out.
The Knock
So why should you care? People who are desperate for reliable information face a bewildering array of diet guidance—salt is bad, salt is good, protein is good, protein is bad, fat is bad, fat is good—that changes like the weather. But science will figure it out, right? Now that we’re calling obesity an epidemic, funding will flow to the best scientists, and all of this noise will die down, leaving us with clear answers about the causes and treatments.

Or maybe not. Even the well-funded, serious research into weight-loss science is confusing and inconclusive, laments Peter Attia, a surgeon who cofounded a nonprofit called the Nutrition Science Initiative. For example, the Women’s Health Initiative—one of the largest of its kind—yielded few clear insights about diet and health. “The results were just confusing,” says Attia. “They spent $1 billion and couldn’t even prove that a low-fat diet is better or worse.” Attia’s nonprofit is trying to raise $190 million to answer these fundamental questions. But it’s hard to focus attention on the science of obesity, he says. “There’s just so much noise.”
You can thank people like me for that. We journalists have to feed the daily news beast, and diet science is our horn of plenty. Readers just can’t get enough stories about the benefits of red wine or the dangers of fructose. Not only is it universally relevant—it pertains to decisions we all make at least three times a day—but it’s science! We don’t even have to leave home to do any reporting. We just dip our cups into the daily stream of scientific press releases flowing through our inboxes. Tack on a snappy stock photo and you’re done.

The only problem with the diet science beat is that it’s science. You have to know how to read a scientific paper—and actually bother to do it. For far too long, the people who cover this beat have treated it like gossip, echoing whatever they find in press releases. Hopefully our little experiment will make reporters and readers alike more skeptical.
If a study doesn’t even list how many people took part in it, or makes a bold diet claim that’s “statistically significant” but doesn’t say how big the effect size is, you should wonder why. But for the most part, we don’t. Which is a pity, because journalists are becoming the de facto peer review system. And when we fail, the world is awash in junk science.
There was one glint of hope in this tragicomedy. While the reporters just regurgitated our “findings,” many readers were thoughtful and skeptical. In the online comments, they posed questions that the reporters should have asked.
“Why are calories not counted on any of the individuals?” asked a reader on a bodybuilding forum. “The domain [for the Institute of Diet and Health website] was registered at the beginning of March, and dozens of blogs and news magazines (see Google) spread this study without knowing what or who stands behind it,” said a reader beneath the story in Focus, one of Germany’s leading online magazines.
Or as one prescient reader of the 4 April story in the Daily Express put it, “Every day is April Fool’s in nutrition.”
* Correction: The study referenced by Peter Attia was called the Women’s Health Initiative, not the Women’s Health Study, and it was one of the largest of its kind, not the largest. Also, when it was published, this article erroneously featured a screenshot showing the Daily Mail’s coverage of a chocolate study, but not the one discussed in this story. The day after publication, we replaced it with a screenshot of the Daily Mail’s actual coverage of the study.
* Update: The paper has been removed from the International Archives of Medicine website, but you can read it here.