How a simple nudge can improve health and nutrition reporting.
“Researchers test vaccine they hope could stem Alzheimer’s.”
“Do fatty foods decrease serotonin levels?”
“Researchers claim McDonald’s fries can cure baldness.”
What do these three headlines, and many others like them, have in common? They were stories about studies conducted on mice.
Now, there’s nothing particularly unusual about research conducted on mice rather than on humans. It’s safer and less expensive, and it’s a core part of the scientific process. And while it will be great news for animal welfare when we find better alternatives to invasive experiments, many of these experiments aren’t even hard on the mice.
But overhyping preliminary results is a real problem in science reporting. In drug development, it’s a very long road from a promising mouse study to clinical trials in humans. In health and nutrition reporting, there’s still a lot of uncertainty about how much mouse findings are applicable to humans.
Yet articles often lead with an exciting finding — a new chemical that treats cancer, a new diet that extends life spans — and mention only after many paragraphs that the study was conducted on mice, not on people.
In April, behavioral scientist James Heathers decided to do something about it — something ridiculously simple. He made a Twitter account called “Just Says In Mice.” As promised, the Twitter account just retweets science articles, adding “IN MICE.”
The Twitter account was an instant sensation; @justsaysinmice now has nearly 60,000 Twitter followers, many of whom bring bad science to Heathers’s attention.
The tweets are pretty funny, but there’s a serious mission at the core.
“I am perfectly prepared to judge your outlet, out loud and in public,” Heathers states in his mouse mission statement, “if you say ‘patients’ when you mean ‘genetically modified mice,’ likewise ‘women’ or ‘men,’ likewise when you say ‘cancer’ (as if that was a single condition anyway) when you mean ‘cancer in a specific mouse model,’ likewise when you say ‘obesity’ when you mean ‘fat mice.’ ”
He argues that misleading the public like this has real consequences. “When it comes to reporting on science, I feel like this attitude is slowly, imperceptibly, contributing to an erosion of trust, because we’re continually betraying what we’ve done and what’s possible to do.”
And his little mouse account is well-positioned to fix this. Journalists are disproportionately on Twitter — it’s a major tool for networking, sharing stories, and contacting sources. Endless think pieces have been written about whether journalists are too beholden to Twitter, and debated at length on, well, Twitter.
So @justsaysinmice isn’t just making fun of bad reporting — it is also highly visible to almost all the science reporters who provide fodder for it. And there are some signs that they’re changing their behavior.
Some of them outright say so. Freelance science writer Vicky Forster tweeted Monday:
— Dr Vicky Forster (@vickyyyf) June 11, 2019
Or, as science writer Kara Gavin said last month:
— Kara Gavin (@karag) May 20, 2019
When I covered efforts to make vaccines that don’t need to be refrigerated last week, I had @justsaysinmice on the mind: The mice were only needed to test that the vaccines — which have been tested extensively in humans — hadn’t lost any potency, so the experiment’s relevance to humans was easier to establish. Nonetheless, I spent a paragraph explaining this when I otherwise might not have.
Heathers writes of his mouse account that he has been flooded with examples of mouse study reporting, both good and bad:
I continually get mouse-related communication now. No complaints, obviously I signed up for that.
However, it’s giving me the distinct impression that @justsaysinmice is… well, it’s working. Mice seem to be appearing in headlines, ledes, and tweets at a higher rate.
— James Heathers (@jamesheathers) June 1, 2019
He could easily be wrong about that, as he acknowledges: being at the center of the national mouse-reporting conversation will certainly give you a skewed impression of the state of science reporting. But it’s quite plausible to me that he’s right. The behavior change he’s asking for is simple: mention clearly, and up front, that the research was conducted in mice.
And since there are now so many people scouring the internet for bad mouse reporting, the consequences if you forget are very predictable: a two-word public scolding from a popular science account. That seems like a set of conditions that might produce real improvements.
So next time you see an article that’s clear that the research it’s reporting on was conducted on mice, you might want to thank this odd, well-intentioned little social experiment. And next time you see an article that isn’t upfront about its mice, let Heathers know. It’s for science.
Sign up for the Future Perfect newsletter. Twice a week, you’ll get a roundup of ideas and solutions for tackling our biggest challenges: improving public health, decreasing human and animal suffering, easing catastrophic risks, and — to put it simply — getting better at doing good.