The BBC must improve how it reports statistics

The BBC has a unique position in British society, with a reputation for fairness, impartiality, and usefulness. But to maintain these characteristics, journalists need to be much more careful about how they scrutinise and present statistics to the public, write Amy Hawkins and Phoebe Arnold.

Image credit: Kyle Cheung CC BY-NC-ND

How much does the UK contribute to the EU each week? How tired did you get of hearing that question, and of the inaccurate answer that it’s £350 million? Even if you didn’t watch the debates or read the op-eds, it was hard to miss the pictures of Boris Johnson and other high-profile Vote Leave campaigners standing in front of a big red bus with the £350 million figure emblazoned across the side.

Misleading claims supported by murky statistics were used on both sides of the EU referendum debate. But the £350 million claim became the iconic slogan of the Leave campaign, and helped to show why the BBC needs to be braver in challenging statistical assertions if it is to be a useful public service.

It’s encouraging to see the BBC Trust’s report seek to address the concerns raised in our evidence, and we were pleased to see many of our recommendations cited. The report says the BBC’s editorial guidelines should include a section on the use of statistics. This section could help journalists by obliging them to take more notice of what the UK’s statistics watchdog says about politicians’ use of statistics. In particular, the guidelines should include a clause saying that when the UK Statistics Authority has identified a claim as false, journalists should point this out whenever an interviewee attempts to make that claim on a BBC platform.

Even though some journalists did point out that the UK Statistics Authority had shown the £350m figure to be false, Ipsos MORI found that half of the public thought the claim was correct. If there had been a rule obliging journalists to make clear when a figure had been called out by the official independent regulator, many voters might have gone to the polls with a more accurate view of our relationship with the EU, regardless of the outcome.

Last year the Joseph Rowntree Foundation funded Full Fact to produce a report: “How did the Media and Politicians in the UK Discuss Poverty in 2015?” One of the most striking findings was that inaccurate claims often got a free ride on the airwaves: a presenter would fail to challenge a claim that had been prominently called out, or would leave that responsibility to the opposing interviewee (who might also be wrong).

There’s a compelling example of this in the BBC Trust’s report:

In an interview on the Today programme on the Government’s proposals to cut working tax credits on 26 October 2015, Matthew Hancock, MP (a government spokesperson) made a claim that “. . . by next year eight out of ten people will be better off as a result [of the Government’s proposal]. . .”[…] This claim had been heavily refuted by organisations such as the Institute for Fiscal Studies, who concluded that it was “arithmetically impossible” for losses in tax credit cuts to be mitigated by increases in the NMW. Rather than challenging this statistical claim, however, the presenter on the Today programme simply stated, “Well, we will hear the opposition view on that from Labour and the Liberal Democrats later in the programme.”

For journalists to challenge claims effectively, they first need to be able to scrutinise them. And sometimes they need help with this.

Another welcome recommendation in the report is that the BBC should increase its own statistical capacity, especially for out-of-hours journalists. The report’s authors were “surprised that an organisation the size of the BBC, with a high (and increasing) volume of statistics in its outputs, does not itself have an explicitly expert statistical capability.”

We were not so surprised. During the general election last year, we had journalists from a flagship morning programme calling us in the middle of the night. They needed help analysing the statistics in a report that had been dumped on them as part of the next morning’s news, and they didn’t have anyone else to go to.

We were more than happy to help. But the BBC is orders of magnitude bigger than Full Fact, and not every journalist on every show knows about us (yet). A system that relies on personal contacts and ad hoc advice is not sustainable for an organisation of the BBC’s size and influence, especially in a public debate increasingly dominated by statistics and big data.

The BBC’s complaints procedure is another area that relies too heavily on personal relationships. Its Complaints Framework specifies that “the complaints process should be quick and simple.” We have long found it to be anything but. Back in 2013, we asked for a correction to an inaccurate BBC News article reporting on the social care gap. The Daily Mail corrected the online version of its article within four hours; the BBC took 50 days.

Public debate moves quickly, and this kind of sluggishness massively dilutes the impact a correction can have on the public’s understanding of the issue at hand. That’s why we now get corrections by contacting individual journalists, using relationships built up over months and years. This obviously isn’t an option for most people, nor should it need to be.

Finally, if the BBC is to be impartial, journalists need to be much more careful about covering unpublished research. If a programme’s audience can’t check where the figures in a news item come from, how can they be expected to trust its impartiality?

In March 2014, the BBC announced that Panorama would be investigating the “Great NHS Robbery”: fraud and error across the NHS, which the programme said cost £7 billion per year, a claim the NHS disputed. The underlying research was not published until late in the afternoon on the day of the broadcast. When it appeared, it became clear that it contained no £7 billion figure, but by then there was hardly any time to challenge the analysis.

Many people are, not unfairly, numbed by the endless slinging of conflicting statistics across a panel by people on opposite sides of a debate. Our work at Full Fact is to demonstrate what these numbers mean, what they can show you, and what they can’t. But BBC journalists should be doing this too. The research that went into the BBC Trust’s report found that:

…many news programmes – particularly in the evening bulletins – allow[ed] competing sources to make statistical references, with no journalists or independent experts making a judgement about the veracity or credibility of either claim.

This leaves the BBC, and therefore its audiences, wanting. During the EU referendum, we repeatedly heard people saying that they just wanted ‘the facts’. If they couldn’t get them from the BBC, whose TV programmes are most people’s main source of news, what hope is there?

In six months’ time…

The BBC Trust says it will hold a six-month review to see how well certain programmes are doing at holding people in public office to account. For the first six months, the focus will be on the Today programme, Breakfast, and 5 Live Drive. We’ll be gathering our own evidence to see whether they’re following the BBC Trust’s recommendations successfully.

The BBC has a unique position in British society, with a reputation for fairness, impartiality, and usefulness — it must be held to this standard, especially for statistics, if this reputation is to be well-founded.

This article represents the views of the authors and not the position of the Democratic Audit blog, or of the LSE. It was first published on openDemocracy.

About the Authors

Amy Hawkins is executive assistant to the director of Full Fact.

Phoebe Arnold is Senior Communications Officer at Full Fact.
