Last time, I wrote a post about how difficult it is to do good research on nutrition and health (Cereal killer: Is eating breakfast the new smoking?). A couple of weeks later, as “pizzagate” unfolded, we painfully learned more about these intricacies, slice by slice. At the center of attention is Brian Wansink, who “is Professor and Director of the famed Cornell University Food and Brand Lab, where he is a leading expert in changing eating behavior”. If you missed the start of the controversy and want to catch up on the full narrative, I have linked a good summary by Andrew Gelman. Briefly, Wansink wrote a post on his blog offering the career advice to never say no to your supervisor’s proposals, because publishing numerous papers is how you get tenure. Even if the dataset at hand does not yield the expected result, you can torture it for a while until it finally surrenders and delivers one or more significant results. All it takes then is a little more deep diving into the data and a pinch of wild storytelling, and there you go: you have successfully inflated your list of publications. Treated in this do-or-die way, every study turns into the science equivalent of the bottomless soup bowl that Wansink became famous for.
Well, it didn’t come as a big surprise to many when these publications turned out to be full of numerical and statistical errors (or “inconsistencies”) in the end, but the total count of 150 errors was still stunning. Even more errors in other papers have been unearthed lately, and it might go on like this for a while. Being in the spotlight is not always a pleasure, but I guess if you can’t stand the heat, you should get out of the lab kitchen. Clearly, Wansink’s take on exploratory field research has all the unhealthy ingredients for a beautiful garden of forking paths. It would be too simplistic, though, to conclude that this is just one misled nutrition researcher who “aggressively disseminate[s his] findings” (in his own words).
Social science is not chemistry. Psychology is not physics. But is it more than science fiction?
As part of his defense, Wansink argued in an interview with Retraction Watch that “Social science isn’t definitive like chemistry.” Sure, but maybe we should try to get as close to definitive as we can? This statement reminded me of another interview published by Slate last week. If you haven’t stumbled upon this post, I strongly advise you to read it to better understand the many ugly faces of the replication crisis. It dissects some noisy but well-publicized research on dance moves conducted by Nick Neave, a psychologist at Northumbria University. What struck me here is a similar kind of social-science apology:
I called up Neave two years ago to find out how and why this happened. “It seems to have a life of its own,” he told me then. “We’re not reporting a cure for cancer. It’s not that important, but it’s fun!”
Again, I asked if it wouldn’t be a wise idea to wait, at least until he’d confirmed his findings with a fresh, hypothesis-driven experiment.
“As I said, we’re psychologists,” he said. “We haven’t wrongly reported a cure for cancer or anything like that. You know, it’s not that important, is it really?”
I was forced to agree.
“We’ve reported some interesting stuff,” he continued. “We don’t fully understand everything that we’ve found. … We’ve never, ever stood up and said, ‘We have proven this.’ You can’t say that in psychology. It’s not physics. So we have these situations where you report something, and you hope that it’s true, but it might not be true. Then you change your mind; you tweak things around. That’s the nature of the game.”
Bottom line: research in nutrition or psychology is supposed to deliver some “interesting” and “fun” stories to the tabloid press? I think everyone who feels the same way should let their funding agencies know immediately. We should save taxpayers’ money for different kinds of research, I believe. Maybe the press will pay for future studies like this. What really bugs me about these statements by Wansink and Neave is their apologetic conception of the best social science can ever be: merely “exploratory”. In practice, that label is mistaken for a license to blatantly exploit the noise inherent in the data, chasing “surprising” novel results that more often than not are the mere product of overfitting a statistical model under poor control of total error rates. Everyone who has been turned down by a fancy journal because a poorly conducted study had already claimed the novelty, only for it to fail replication, knows how damaging this can be for a field at a larger scale.
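To make the point about “total error rates” concrete, here is a minimal sketch (Python; the function name is my own, not from any of the papers discussed): with a single test at α = 0.05 you accept a 5% false-positive rate, but if you keep probing one null dataset with many independent tests, the chance of at least one spurious “significant” finding balloons.

```python
def familywise_error_rate(alpha: float, n_tests: int) -> float:
    """Probability of at least one false positive across
    n_tests independent tests on pure noise, each run at level alpha."""
    return 1 - (1 - alpha) ** n_tests

# One test keeps the advertised 5% rate; twenty forking paths do not.
for n in (1, 5, 20, 100):
    print(n, round(familywise_error_rate(0.05, n), 3))
# → 1 0.05, 5 0.226, 20 0.642, 100 0.994
```

So a researcher who slices a null dataset twenty different ways has better-than-even odds of finding something to publish, which is exactly why exploratory findings need confirmation on fresh data.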
Is this the new beginning?
Mindless publishing based on data mining of a “failed study” is in no way smarter than mindless eating from a bottomless bowl. Policy should not have to be informed by bad science sexed up for public outreach. I am happy to see that Brian Wansink has publicly committed himself to new lab procedures that should improve the reproducibility of his research in the long run. Nevertheless, I feel that we all have to strive to make social science a little bit more like chemistry and physics when it comes to experimental rigor. Precisely because good research in nutrition and psychology is so hard, we’ll have to work even harder to do reasonably well. If this is not for you, please go do chemistry instead of cutting corners.