Flicking through a day's newspapers often feels like tackling a numerical assault course.
"Young people who use sunbeds increase their risk of skin cancer by 75 per cent."
"Ninety-six per cent of children in European orphanages are not orphans."
"In the UK we throw away 4.4 million apples a year."
These are just three examples from recent papers.
Number-crunching has always had the potential to bamboozle, and today, more than ever, is the age of the fraction, the percentage and the average (but is that a mean or a median?).
It's not just the newspapers, either. Numbers crop up in adverts, health warnings and speeches made by politicians, too. But do the figures add up? And do we trust them?
Not really, according to, yes, another set of stats that dropped into the Independent's inbox last month.
A survey in Britain by the Office for National Statistics found that only 36 per cent of people asked thought that official figures were "generally accurate". Meanwhile, a 2007 poll of trust in government statistics by the European Commission ranked Britain 27th out of 27 countries.
Last week, a statistics watchdog was set up to tackle this apparent crisis in confidence. The UK Statistics Authority has the job of ensuring statistics are correct and free from government spin. Every day, its website will provide links to the raw data backing up government statistics, and the authority will, warns its chairman, Sir Michael Scholar, "name and shame" ministers who spin them beyond recognition.
"It's vital that statistics aren't altered to tell the story somebody wants to tell," Scholar says.
Kevin McConway, a senior statistics lecturer at the Open University, says statistical abuse damages his profession. "Statistics in the UK are actually pretty reliable, but more and more often you see surveys that mean nothing, data that looks important but isn't, or statistics that are just made up. It destroys public trust in all statistics."
To name and shame some of the worst offenders, the Independent has trawled the archives for classic examples of "junk statistics", from poorly worded reports to the deliberate massaging of official figures, and asked McConway to read between the lines.
"Over 40 per cent of families spend eight hours or more a week together"
Commissioning a scholarly survey or study is a popular choice for companies who want to get their names in the papers. And disappointing results needn't get in the way of a bit of PR.
Last month, the family holiday firm Center Parcs sent out a release designed to counter the image of the British family in terminal decline.
The headline read: "The family: It's not toxic, it's thriving". And the best stat Center Parcs could muster to back up its claim? "With over 40 per cent of families spending eight hours or more a week together ... a new study suggests that, actually, families like each other and want to spend time together."
Is that really what it suggests? If just over 40 per cent of families spend eight hours or more together, then nearly 60 per cent (a majority) spend fewer than eight hours a week together. And eight hours a week is equivalent to 68.6 minutes a day. Put it that way and the figures hardly endorse Center Parcs' vision of the "thriving" family.
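The arithmetic behind that reading is short enough to check for yourself (a throwaway Python sketch, using only the figures quoted above):

```python
# Quick sanity check on the Center Parcs figures (illustrative arithmetic).
hours_per_week = 8
minutes_per_day = hours_per_week * 60 / 7
print(round(minutes_per_day, 1))   # 68.6 minutes a day

share_meeting_bar = 0.40           # "over 40 per cent" of families
share_below_bar = 1 - share_meeting_bar
print(f"{share_below_bar:.0%}")    # 60% spend fewer than eight hours a week
```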
McConway's verdict: "There's often this thought that 'oh it's numbers so it must be right', but often it's nonsense, especially when a company cherry-picks results and plays down the rest. It's okay, as long as we are aware of it and get enough information to work out the real statistics."
"A sausage a day increases the risk of bowel cancer by a fifth"
Last month, research circulated by the World Cancer Research Fund suggested that eating 50g of processed meat a day, equivalent to one sausage, increases the likelihood of bowel cancer by a fifth, or 20 per cent. It sounds worrying. After all, a 100 per cent risk would mean you are guaranteed to develop cancer, and that figure of 20 per cent doesn't seem far off. But the reality is neither as simple nor as scary as that.
Research shows that, out of every 100 people, around five will develop bowel cancer within their lifetime. So what impact does eating sausages really have? If you take 100 people who eat 50g of processed meat a day, the number of cases will rise by a fifth, from five in 100 to six in 100. So, for 99 of the 100 porkers, eating all those sausages will make no difference at all. But, of course, that seems far less shocking than the headline figure of a 20 per cent rise.
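Put as a sum, the gap between the relative figure and the absolute figure is stark (illustrative Python, using the five-in-100 baseline quoted above):

```python
people = 100
baseline_cases = 5                    # lifetime bowel-cancer cases per 100 people
relative_rise = 0.20                  # the headline "fifth"

new_cases = baseline_cases * (1 + relative_rise)
print(new_cases)                      # 6.0 cases per 100 sausage-eaters
print(new_cases - baseline_cases)     # 1.0 extra case per 100
print(people - new_cases)             # 94.0 people unaffected either way
```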
McConway's verdict: "You barely go a week without seeing examples in the papers of stats appearing to indicate a significant increase. Twenty per cent sounds big, but it's only an increase on a small percentage - 20 per cent on next to nothing is still next to nothing."
"Speed cameras cause a 35 per cent decrease in deaths and serious injuries"
In 2003, the then Transport Secretary, Alistair Darling, issued a press release that read: "Deaths and serious injuries fell by 35 per cent on roads where speed cameras have been in operation." Darling went on to say: "The report clearly shows speed cameras are working ... This means that more lives can be saved and more injuries avoided."
The suggestion that cameras caused the drop in accidents got Darling in trouble. Figures go up and down all the time. Contentious issues get more coverage when numbers are high. So the Government does something about it. The numbers, having peaked, then go down. Naturally, the Government takes credit for the fall. Challenged to prove the link in the speed-camera case, ministers revised their claim.
McConway's verdict: "This happens all the time. The statistical jargon is 'regression to the mean': over time, figures that peak or trough will, on average, head towards the middle, or mean. There are ways to take this effect into account when producing stats like these, but it does not always happen."
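A toy simulation makes regression to the mean visible (the numbers here are entirely hypothetical, not the Department for Transport's data): sites chosen because they had a bad year tend to look better the next year even when nothing about them has changed.

```python
import random

random.seed(0)

# 1,000 hypothetical road sites, all with the same true accident rate
# (mean 10 a year), observed over two years with random noise.
sites = [[random.gauss(10, 3) for _ in range(2)] for _ in range(1000)]

# Pick the "blackspots": the worst 10 per cent in year one.
cutoff = sorted(s[0] for s in sites)[900]
blackspots = [s for s in sites if s[0] >= cutoff]

year1 = sum(s[0] for s in blackspots) / len(blackspots)
year2 = sum(s[1] for s in blackspots) / len(blackspots)
print(round(year1, 1), round(year2, 1))
# Year-two counts at the "blackspots" fall back toward the overall
# mean of 10 even though nothing changed -- no cameras required.
```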
"The number of American children gunned down has doubled every year since 1950"
Sometimes junk statistics are caused simply by lazy wording. Perhaps the best (worst) example came in a PhD student's dissertation prospectus, published in 1995. It appeared in the first chapter of Damned Lies and Statistics by Joel Best, who called it "the worst social statistic ever".
It read: "Every year since 1950, the number of American children gunned down has doubled."
Really? Let's do the maths. Say only one child was gunned down in 1950. According to our student, that number would have doubled every year: two dead in 1951, four in 1952, eight in 1953 ... that makes 1,024 in 1960, and so on. By 1995, the year of the report, more than (wait for it) 35 trillion children would have been gunned down.
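The compounding is easy to verify (a two-line Python check of the arithmetic above):

```python
# One death in 1950, doubling every year until the 1995 report.
doublings = 1995 - 1950            # 45 years of doubling
deaths_in_1995 = 1 * 2 ** doublings
print(f"{deaths_in_1995:,}")       # 35,184,372,088,832 -- about 35 trillion
```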
It turns out that the student had taken the figure from a government report, which stated: "The number of children killed each year by guns has doubled since 1950." So the figure had doubled over 45 years, not every year. By garbling his words, the student came up with a wildly inaccurate statistic.
McConway's verdict: "This case is terrible, but sometimes even statisticians get it wrong."
"Falling coconuts kill 150 people a year"
In 2002, in an article about the uprooting of coconut trees by lawsuit-wary Australian officials, the Daily Telegraph reported: "Coconuts ... kill about 150 people worldwide each year, making them more dangerous than sharks." The figure appeared again in a press release issued by a travel insurance firm assuring holidaymakers they would be covered, should they be struck by a coconut.
The reports suggested the figure of 150 came from a Canadian professor but his paper on coconut injuries did not posit a death toll. Attempts to trace the origin of the figure have failed.
The case echoes a similar legend - the belief that we should drink eight glasses of water a day. University of Pennsylvania researchers recently searched for the source. Their conclusion: "It is unclear where this recommendation came from."
McConway's verdict: "The truth is simply that we like hard figures, especially when they make a great story."
- INDEPENDENT