Lies, damned lies and statistics: why reporters must handle data with care
Are statistics used responsibly by reporters? Stephen Cushion and Justin Lewis from Cardiff University discuss the role faulty information can play in politics and public affairs, using the example of the 2016 EU referendum.
During the 2016 EU referendum campaign, both sides used statistics pretty freely to back their arguments. Understandably, UK broadcasters felt compelled to balance competing perspectives, giving audiences the opportunity to hear the relative merits of leaving or remaining in the EU. In doing so, however, the truth of these statistical claims was not always properly tested.
This might help explain some of the public’s misconceptions about EU membership. So, for example, although independent sources repeatedly challenged the Leave campaign’s claim that the UK government spent £350m per week on EU membership, an Ipsos MORI survey found that almost half of respondents believed it was true just days before the vote.
More generally, statistics play an increasingly prominent role in debates about politics and public affairs. Think tanks, government agencies, independent researchers, academics and other information-rich sources routinely produce data that can inform public debate and policy making.
Most people do not read raw data sets or the methodology behind them – they rely primarily on how news media interpret statistical information. This puts considerable pressure on the journalists who do the reporting: in today’s 24-hour news culture they have limited time both to understand how statistics are produced and to communicate their broader meaning.
The challenge of reporting statistics impartially was recently recognised by the BBC Trust – the body that regulates BBC journalism. It commissioned a report, Making Sense of Statistics, which was published in August 2016. Our comprehensive content analysis of a range of BBC UK broadcast and online news media formed part of the review, and we have since developed this work in an article published in Journalism Practice.
Statistics about statistics in the news
Of the 6,916 news items examined in our research, more than 20% featured a statistic. Most of these statistical references were fairly vague, offering little context or explanation: overall, only a third provided some context or made use of comparative data.
Online news, we found, presented statistics with the most clarity and, perhaps surprisingly given the possibilities of a visual medium, TV was no more likely to use statistics or provide more context than radio.
Statistics featured mostly in stories about business, the economy, politics and health. So, for example, three-quarters of all economics items featured at least one statistic, compared to almost half of news about business. But in some areas – where statistics might play a useful role in communicating trends or levels of risk – they were rarely used.
For example, although there is plenty of data about crime or terrorism that could provide useful context to audiences, stories on these topics tended not to include any statistical context. Just 6.1% of crime news items, for instance, made any reference to statistics – an area where public understanding of real-world trends and levels of risk is notoriously poor.
The most common sources of statistics in news stories were politicians (20.6%), followed by businesses and government departments or agencies (both 12.3%). Other information sources, such as NGOs (7.3%), academics (6.5%), regulatory bodies (4.4%) and think tanks (3.8%), were far less prominent. Nearly three-quarters of party political statistical references – 72.6% – came from Conservative politicians, with Labour – the official opposition – accounting for only 18.4%.
Figure: sources of references to statistics. Cardiff University, author provided
The heavy reliance on government sources for statistical information raises important questions about impartiality and holding power to account.
Statistics can challenge post-truth politics
We might expect cabinet ministers and government departments to be a dominant source of statistical information – after all, civil servants regularly supply them with data. But we identified a lack of clarity in the communication of government statistics, since most appeared in a relatively vague or imprecise form. We also found that the government’s statistical claims were often reported without being subject to independent scrutiny.
While it would clearly be unrealistic to expect journalists to challenge every statistical reference they encountered, only 4.2% of statistical claims were challenged. In other words, the vast majority of statistics reported in the news media were not verified either by journalists or by external sources.
When challenges were made, they often came from opposition politicians, creating tit-for-tat exchanges that added more heat than light. So, for example, when the UK government proposed a major cut to tax credits, it claimed the impact would be mitigated by other factors such as an increase in the minimum wage. Rival parties disagreed, which led some broadcasters simply to pit the government’s statistical claims against opposition voices – leaving audiences none the wiser.
Reports that tried to establish broader truths were more helpful. In these cases journalists used the wealth of independent data or expertise available to scrutinise government claims, allowing audiences to appreciate where the statistical consensus actually lay.
End of experts?
In a TV debate during the 2016 EU referendum campaign, the prominent Leave campaigner Michael Gove declared that “people in this country have had enough of experts”. But in the so-called era of post-truth politics, we arguably need to rely more than ever on independent expertise to move beyond the spin or outright lies of politicians.
Our study was conducted before the main EU referendum campaign began, and the report was published after the vote. This was a pity. Many of the report’s main recommendations – that journalists should be bolder in challenging questionable statistical claims, that audiences find tit-for-tat exchanges about statistics unhelpful and confusing, and that journalists should use expertise to establish broader truths – would have usefully informed broadcast coverage of the referendum campaign.
Of course, this is not always straightforward. Some issues are highly contentious, with little consensus between experts. But there are many areas – from climate change to the broad economic impact of leaving the single market – where there is a considerable weight of evidence on one side. Objectivity and impartiality are very different journalistic ideals. Our analysis suggests a little more of the former and less reliance on the latter might enhance people’s understanding of politics and public affairs.
This article was originally published on The Conversation.