Can we all become statistical myth-busters?

Making stats meaningful to media and the public; experts show how at the Cambridge Science Festival

Dr. Rebecca Goldin shares some statistical bloopers in the media at her public talk "Numbers in the News" at the Cambridge Science Festival. (Photo © Trevor Butterworth, Sense about Science USA)

Editor’s note: This month, we have been exploring the theme of “diversity of mind” – through the lens of the renowned animal scientist and autism advocate Temple Grandin, for example, who sheds light on the different kinds of minds science needs, and of Elsevier’s accessibility specialists, who work to ensure that our content is accessible to people of all abilities – and disabilities. Science also needs to be welcomed and understood by a diverse range of people in society – not just scientists but the people whose lives and health are affected by the work of researchers and scholars. Researchers have a role to play in helping the public – and the journalists who write about their work – understand the statistics in their research papers. That was the subject of two presentations at the Cambridge Science Festival this week: “From Bench to Broadcast” and “Numbers in the News.” Here's more from Julia Wilson of Sense about Science, an organization we partner with that challenges the misrepresentation of science and evidence in public life.


Julia Wilson, Director of Operations and Development at Sense about Science

In my years at Sense about Science, I have come to think, almost without realizing it, that there are two types of myth-busting minds.

There are those with an eye for dodgy scientific claims; they home in on promises that don’t add up and can spot the common tricks used to pull the wool over our eyes. These people ask sceptical questions: “Is that plausible? That sounds too good to be true!”

Then there are those who dive straight into the statistics, asking questions about error bars, confidence intervals and significance. (Yes, they often have PhDs.) They are the ones who take a news story about the latest miracle cure or health hazard and say, “Hang on a minute! The confidence interval is this – so the true answer could be as much as this or as little as that!” Suddenly the headline-grabbing story is not quite so headline-worthy.

Here's a good example of the second type. When Sir David Spiegelhalter, Professor of the Public Understanding of Risk in the Statistical Laboratory at the University of Cambridge, saw the headline that pollution was killing 40,000 people a year in the UK, he got straight into the figures to find out if it was true. In an article for Medium.com, he wrote:

But where does the 40,000 figure come from, what does it mean, and is there really a ‘crisis’? I discovered that digging down to the basis for this figure required some statistical detective work.

He took a forensic look at the figures – where did they come from, what estimates were made, and what were the uncertainties? Confidence intervals and a “plausibility distribution” meant that an estimate of 29,000 deaths used in the calculation could actually lie anywhere between 5,000 and 60,000. It’s a very detailed analysis – a lot of which, to be honest, I struggle to grasp – but what is clear is that this 40,000 figure rests on a lot of uncertainty and assumption.
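To get a feel for how a single central estimate can hide such a wide range, here is a minimal Python sketch. The relative-risk figures and death counts below are invented purely for illustration – they are not Spiegelhalter’s numbers or those of the underlying air-pollution studies – but they show how a modest uncertainty in a risk estimate balloons into a very wide range of attributable deaths.

```python
# Purely illustrative: how a single headline number can sit on top of a
# wide range. All figures below are invented for the example; they are
# not taken from Spiegelhalter's analysis or the underlying studies.

def attributable_deaths(total_deaths, relative_risk):
    """Deaths attributable to an exposure affecting the whole population,
    using the attributable fraction (RR - 1) / RR."""
    return total_deaths * (relative_risk - 1) / relative_risk

total_deaths = 500_000                            # hypothetical annual deaths
rr_central, rr_low, rr_high = 1.06, 1.01, 1.13    # hypothetical risk estimate and interval

print(f"Central estimate: {attributable_deaths(total_deaths, rr_central):,.0f}")
print(f"Plausible range:  {attributable_deaths(total_deaths, rr_low):,.0f}"
      f" to {attributable_deaths(total_deaths, rr_high):,.0f}")
```

With these made-up inputs, a risk interval that looks modest (1% to 13% extra risk) produces a range of roughly 5,000 to 58,000 deaths around a central estimate of about 28,000 – the same kind of spread that sits behind the 40,000 headline.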

Can anyone grasp stats? And what can experts do to help?

I’m certainly in the first camp of myth-busting minds. People like me think more about the plausibility of claims; we don’t jump straight to the stats. I think it is fair to say that the people who can dig down into the figures behind a story are usually the experts, with particularly mathematical minds. But is it possible for us all to become better at looking at numbers? Can someone like me take a more statistical approach to myth-busting? And can the experts do a better job of helping us – and journalists and policymakers – understand statistics and know what to look out for?

To answer these questions, Sense about Science just held a series of sessions for early-career researchers and the public, in collaboration with Elsevier and Elsevier's Cell Press, at the Cambridge Science Festival in Massachusetts. Ann Gabriel, Elsevier’s VP for Academic & Research Relations, pointed out that this event aligns with her team’s efforts to engage with leading institutions and academic leadership to promote discussion and developments in open science, research integrity, and research data management and applications. “We were delighted to partner with Sense about Science in supporting this event in the US following their legacy of important leadership and reporting in the UK, including the value of peer review,” she said.

Dr. Rebecca Goldin presents "Numbers in the News" to a public audience at the Cambridge Science Festival (Photo © Trevor Butterworth)

With more than a decade of experience helping journalists make sense of statistics, Dr. Rebecca Goldin, Director of STATS and Professor of Mathematical Sciences at George Mason University in the Washington, DC area, knows a lot about statistical pitfalls and the challenges of communicating statistics to non-experts like me. I asked her whether she thought someone with a mind like mine could get to grips with statistics:

It is entirely possible for us all to get better at understanding statistics. Sometimes a few good questions can help us wrap our heads around a study's conclusions. A piece of advice I would give to people who struggle with understanding percentages and risks is to ask, ‘What does this mean for 1,000 people?’ If you translate an increase or decrease in risk to actual numbers of people affected, it is much easier to grasp.
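As a rough sketch of that “per 1,000 people” translation (the baseline risks and the 25% increase below are invented purely to illustrate the arithmetic), the same relative increase can mean very different numbers of people depending on the baseline:

```python
# Illustrative only: turning a relative risk increase into
# "extra people affected out of 1,000". The baselines and the 25%
# figure are made-up numbers for the example.

def extra_cases_per_1000(baseline_risk, relative_increase):
    """Additional affected people per 1,000, given a baseline risk
    (probability per person) and a relative increase (0.25 = +25%)."""
    return 1000 * baseline_risk * relative_increase

# The same "25% increased risk" headline, applied to two baselines:
print(extra_cases_per_1000(0.04, 0.25))   # baseline 4%:   10.0 extra people per 1,000
print(extra_cases_per_1000(0.001, 0.25))  # baseline 0.1%: 0.25 extra people per 1,000
```

The same headline figure can mean ten extra people in every thousand or a fraction of one person – exactly the contrast Goldin’s question is designed to expose.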

At the workshop, she gave an animated account of the common misconceptions that many journalists have about statistics, and encouraged the researchers in the audience to communicate the figures in their papers with clarity.

I asked her what she wished scientists would stop doing when it comes to communicating statistics:

The absolute top thing I would say to scientists when it comes to communicating statistics would be to stop implying causality. It varies, of course, from field to field, but far too many research papers describe causal links when really only a correlation has been found. The other thing I would say is to stop just giving a P-value as evidence from your research. We need to know about the effect size, what is actually clinically meaningful and why.
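To illustrate her second point with simulated data (not any real study – the measurements and group sizes below are invented), a large enough sample can make a negligible difference look “statistically significant,” while the effect size tells a very different story:

```python
# Simulated data only: a tiny true difference between two large groups.
# The P-value looks impressive, but the effect size (Cohen's d) shows
# the difference is clinically negligible.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200_000
control = rng.normal(loc=120.0, scale=15.0, size=n)   # hypothetical measurement
treated = rng.normal(loc=119.7, scale=15.0, size=n)   # true difference of only 0.3 units

t_stat, p_value = stats.ttest_ind(treated, control)

pooled_sd = np.sqrt((control.var(ddof=1) + treated.var(ddof=1)) / 2)
cohens_d = (treated.mean() - control.mean()) / pooled_sd

print(f"P-value:   {p_value:.2g}")   # tiny, so "statistically significant"
print(f"Cohen's d: {cohens_d:.3f}")  # around -0.02: a negligible effect
```

A reader who sees only the P-value would conclude the treatment “works”; the effect size and a judgment about what is clinically meaningful are what actually matter.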

"From Bench to Broadcast"

Trevor Butterworth, Dr. Emilie Marcus and Dr. Christopher Labos share examples of statistical descriptions they’ve seen in papers, press releases and public reports of research.

A panel debate called “From Bench to Broadcast” followed, with Dr. Emilie Marcus, CEO of Cell Press and Editor-in-Chief of Cell; Trevor Butterworth, Executive Director of Sense about Science USA; and Dr. Christopher Labos, a cardiologist at McGill University in Montreal. They took the researchers through some of the best (and worst) examples of statistical descriptions they’ve seen in papers, press releases and public reports of research, and shared their own difficulties with understanding and communicating statistical concepts.

Dr. Marcus talked about the journal editor’s role in ensuring that press releases are statistically sound, but also the importance of transparency and clarity at every point in the communication chain:

Responsibility for communicating statistics in research relies on rigor at every stage, starting with the researcher through to the editor, press officer and journalist.

Tracey Brown, Director of Sense about Science, takes questions from the audience. (Photo © Trevor Butterworth)

Scientific publishers and journals clearly play a vital role in ensuring researchers are communicating the statistics in their research clearly and accurately. It’s great to see journals like Cell communicating this message to the next generation of researchers. And it is fantastic that statisticians like Dr. Goldin are going on the road and sharing statistical concepts with journalists and the wider public.

So although we might not all be able to become statistical experts, with clear communication from the research community it should be possible for us to appreciate the value of statistics and what they can uncover. There are some simple concepts to grasp – such as sample size, risk and significance – that can give us an idea of what the numbers really mean and how reliable they are. And that’s incredibly useful. Without this way of thinking, there is a risk that we will be uninformed and make poor decisions about our health, the environment, our education and how we lead our lives.

Am I convinced that we can get all journalists to read study reports like statisticians – to spot flaws in methods or understand distributions and error bars? No. Nor can we all become full-time statisticians.

But with backing from researchers and journal editors, we can start asking more questions about the numbers in reports and getting a better grasp of what they mean. And that is hugely valuable to society.

Elsevier and Sense about Science

Since 2006, Elsevier has partnered with Sense about Science, an independent charity that challenges the misrepresentation of science and evidence in public life. This unique partnership program of events and publications works to promote an understanding of peer review among journalists, policymakers and the public, and to engage and inspire early-career researchers to stand up for science in public debates around the world. Read more.

Diversity of mind at Elsevier

It takes all kinds of minds to help the world understand science. That’s why we support diversity of mind at Elsevier, employing people with different thinking styles to develop our tools and technologies – and supporting initiatives that enable people of diverse abilities, cultures and nationalities to access our research, understand it and act on it. By championing diversity and the unconventional, we empower people to go beyond the obvious, inspiring new opportunities for science and society. For more stories about people and projects empowered by knowledge, we invite you to visit Empowering Knowledge.
