I Think You'll Find It's a Bit More Complicated Than That


There is, however, some structure to this school reunion. In How SCIENCE WORKS we cover peer review, how research is unpicked and critiqued after publication, how we deal with contradictory research, the importance of methods and results being freely available, whether it matters who a researcher is, how cherry-picking harms science, and how myths are made when inconvenient results are ignored.

In BIOLOGISING we cover crass reductionism, including the peculiar beliefs that pain is only real when we can see it on a brain scanner, that misery is best thought of as molecular, and that girls like pink ‘because they evolved to look for berries’. In STATISTICS we start with easy maths and accelerate painlessly to some fairly advanced notions. We cover why the odds of three siblings sharing a birthday are not 48,627,125 to 1, why spying on us all to spot the occasional terrorist is highly unlikely to work, how statistical tools for fraud helped catch Greece faking its national economic data, what you can tell from a change in abortion rates for Down’s syndrome, the many ways you can slice data to get the answer you want, the hazards of looking for spatial patterns on maps, and the most core statistical skill of all: how we can detect a true signal from everyday variation in background noise.
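That birthday figure repays a quick check. A minimal sketch in Python, ignoring leap years and assuming every date is equally likely:

```python
# Naive (wrong) calculation: treat all three birthdays as having to
# hit one pre-specified date, multiplying 1/365 together three times.
naive_odds = 365 ** 3
print(naive_odds)  # 48627125 -- the widely quoted figure

# But the first sibling's birthday can fall on any date; only the
# other two need to match it, so the true odds are 365^2 to 1.
true_odds = 365 ** 2
print(true_odds)  # 133225
```

The quoted 48,627,125-to-1 figure comes from conditioning on a date nobody specified in advance; dropping that extra factor of 365 shrinks the odds by several hundredfold.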

Then we go on to the glory of BIG DATA, the battles with government to get hold of it, the risks of sharing medical records with all and sundry, and the magical way that patterns emerge from the formless static of everyday life when you have huge numbers. In SURVEYS we learn the tricks of a sticky trade, and then we shift up a gear to cover EPIDEMIOLOGY, my day job, the science of spotting patterns in disease. Here we see how clever things called funnel plots can help to show whether one area’s healthcare really is any worse than another’s, whether an increase in antidepressant prescriptions really does mean more people are depressed (or even whether more people are taking antidepressants), and the core skill of all epidemiology: how to correct for ‘confounding variables’, or rather: how to make sure that apparent correlations in your data are real. In an overview of bicycle helmet research, we review every epidemiological error in the textbooks, and a grand claim about the benefits of screening for diseases helps show that doing something – even something small – can often be worse than doing nothing at all. We see why different study designs are needed to research common and rare diseases, and how frail memories can distort the findings, why we should never assume that laboratory tests are correlated with real patients’ suffering, and how simple blinded experiments can spot if a £70 wine magnetiser really does change the flavour.
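The funnel-plot idea mentioned above can be sketched in a few lines: plot each unit's rate against its caseload, and only flag units that fall outside control limits, which are wide for small caseloads and narrow for large ones. This is a minimal illustration using a normal approximation to the binomial; the function name, the 3-sigma threshold and the mortality figures are all hypothetical, not drawn from the book.

```python
import math

def funnel_limits(overall_rate, n, z=3.0):
    """Control limits for a proportion at caseload n, using the
    normal approximation to the binomial (3-sigma by default)."""
    se = math.sqrt(overall_rate * (1 - overall_rate) / n)
    return max(0.0, overall_rate - z * se), min(1.0, overall_rate + z * se)

# Hypothetical figures: a 2% national mortality rate, and one unit
# reporting 50 deaths in 1,000 cases (a 5% rate).
lo, hi = funnel_limits(0.02, 1000)
print(50 / 1000 > hi)  # True: the unit sits above the funnel, worth scrutiny
```

The point of the funnel shape is that a small unit with a bad-looking rate may be well inside its (wide) limits, while the same rate at a large unit is a genuine outlier.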

In the section on BAD ACADEMIA, we see how whole fields have been undermined by the simple misuse of statistics. We find one simple statistical error made in half of all neuroscience papers, and, by using forensic methods, we can see that brain-imaging researchers must be up to no good, because collectively they are publishing far more positive findings than the overall numbers of participants in their research could possibly, plausibly, statistically, sustain. We see bad behaviour around journals retracting papers, and appallingly poor standards in animal research, alongside academic journals publishing wildly crass papers on how, for example, people with Down’s syndrome really are a bit like the Chinese.

In GOVERNMENT STATISTICS we see ludicrous overclaiming around public and private sector salaries (where commentators fail to compare like with like), Home Office figures on child abuse pulled almost from thin air, a government figure on the cost of piracy that assumes everyone in the country should be spending £9,700 each on DVDs and music every year, crime prevention numbers to support a national DNA database that simply do not add up, and a headline figure on local council overspending, from the Department for Communities and Local Government, whose derivation is so offensively stupid it almost defies belief. We also see there is no evidence that hosting events like the Olympics has any health benefit for the host nation.

EVIDENCE-BASED POLICY is a slightly different fish: is there really good evidence for the policies that governments choose? Here we see that the evidence supporting the redisorganisation of the NHS is weak and that the figures on poor performance in the NHS used to justify it are over a decade old, and when the minister tries to argue back, he digs a very deep hole. We see how a historic failure to run simple randomised trials on policy issues has left us ignorant on basic questions about what works, and then whizz through a few simple questions, showing how evidence can be checked for each one: is porn in sperm donor clinics a good idea, is organic food really better, is it wise for the Catholic Church to campaign against condoms, and are exams really getting easier? We see a thinktank report on maths, promoted by a TV maths professor, that gets its own maths catastrophically wrong, and a select committee misleading, and being misled. After all this carping, in a report for the Department for Education I set out how the teaching profession could have its own evidence-based practice revolution to mirror what we’ve seen in medicine (and review, along the way, how senior doctors as late as the 1970s fought back, to defend that favourite of the old and powerful: eminence-based medicine).

Recreational DRUGS are a magnet for bad policy, because ideology often conflicts with the evidence, so the temptation to distort the data is powerful. Here we see wildly inflated government figures for crop captures in Afghanistan (with a minister claiming that peasant farmers receive the entire street price from every £10 bag of heroin sold in London), and ask why death was quietly dropped from the government’s measures of drug-policy success, before an essay explaining why the UK prescribed heroin for heroin addicts from the 1920s onwards, why we stopped and why we should start again.

LIBEL is a subject close to my heart, having been through the process too many times. In this section, we see how the people who sue tend not to be very nice, and how their legal aggression can – to my great pleasure – backfire. This section also includes breast-enhancement cream, and the brief return of Gillian McKeith.

I’ve always railed against the idea that QUACKS are manipulators, with innocent victims for customers: one woman’s trip to intensive care presents an opportunity to see where the blame really lies, when quacks have their magical beliefs routinely reinforced by journalists and the government. More than that, we see how serious organisations – from universities to medicines regulators – can fail to uphold their own stated values when under political pressure or seduced by money. Then we have a brief interlude to look at three peculiarly enduring themes in modern culture: MAGIC BOXES of secret electronic components with supernatural powers (to detect bombs, cure cigarette addiction and even find murdered children), AIDS denialism (at the Spectator, of all places), and, in ELECTROSENSITIVITY, people eager to claim that electrical fields make you unwell (while selling you expensive equipment to protect yourself, and seducing journalists from broadsheets to the BBC’s Panorama).

If science is about the quest for truth, then equally important is the science of IRRATIONALITY – how and why our hunches get things wrong – because that’s the reason we need fair experiments and careful statistics in the first place. Here we see how our intuitions about whether a treatment works can be affected by the way the numbers are presented, how our outrage is lower when a criminal has more victims, why blind auditions can help combat sexism in orchestras, how people can turn their back on all of science when some evidence challenges just one of their prejudices, how people win more in a simple game when they’re told they’ve got a lucky ball, how responding to a smear can reinforce it, how smokers are misled by cigarette packaging, how people can convince themselves that patients in comas are communicating, and how negative beliefs can make people experience horrible side effects, even when they’re only taking sugar pills with no medicine in them. In this section I also unwisely disclose my own positive and creative visualisation ritual, and the evidence behind it.

In BAD JOURNALISM we see the many different ways that journalists can distort scientific findings: misrepresenting an MSc student’s dissertation project with a headline that claims scientists are blaming women for their own rape, creating vaccine scares, and saying that exercise makes you fat. We also see the techniques journalists use to mislead, by burying the caveats and failing to link to primary sources, then we review research showing that academic press releases are often to blame, and that crass reporting on suicide can create copy-cat behaviour. The work in this section has made me extremely unpopular with whole chunks of the media, but I truly don’t think there’s anything personal here: the pieces are simply straight explanations, illustrating how evidence has been misrepresented by professional people with huge public influence. In light of that, I’ve included some attacks on me by others, and you can make what you will of their backlash. Lastly, we see how hit TV science series BRAINIAC – which sells itself on doing truly dangerous, really ‘real’ science – simply fakes explosions with cheap stage effects.

In the final furlong, there’s a collection of STUFF: my affectionate introduction in the guidebook of a miniature steam railway that takes you through council estates to the foot of a nuclear power station, and a guide to stalking your girlfriend through her mobile phone (with permission). Lastly there are some EARLY SNARKS. Reading your own work from ten years ago is a bit like being tied down, with your eyelids glued open, and forced to watch ten-foot videos of yourself saying stupid things with bad hair. But in case you miss the child I once was, here I take pops at cosmetics companies selling ‘trionated particles’, do the maths on oxygenated water that would drown you before it did any good, and cry at finding New Scientist being taken in by some obviously fake artificial intelligence software.

So welcome, again, to my epidemiology and statistics toilet book. By the simple act of keeping this book next to the loo you will – I can guarantee it – develop a clear understanding of almost all the key issues in statistics and study design. Your knowledge will outdo that of many working scientists and doctors, trapped in the silo of their specialist subjects. You will be funny at parties and useful at work, and the trionated ink molecules embedded in every page will make you youthful, beautiful and politically astute.

I hope these small packages bring you satisfaction.

2014

HOW SCIENCE WORKS

Why Won’t Professor Susan Greenfield Publish This Theory in a Scientific Journal?

Guardian, 22 October 2011

This week Baroness Susan Greenfield, Professor of Pharmacology at Oxford, apparently announced that computer games are causing dementia in children. This would be very concerning scientific information; but it comes to us from the opening of a new wing at an expensive boarding school, not an academic conference. Then a spokesperson told a gaming site that’s not really what she meant. But they couldn’t say what she does mean.

Two months ago the same professor linked internet use with the rise in autism diagnoses (not for the first time), then pulled back when autism charities and an Oxford professor of psychology raised concerns. Similar claims go back a very long way. They seem changeable, but serious.

It’s with some trepidation that anyone writes about Professor Greenfield’s claims. When I raised concerns, she said I was like the epidemiologists who denied that smoking caused cancer. Other critics find themselves derided in the media as sexist. When Professor Dorothy Bishop raised concerns, Professor Greenfield responded: ‘It’s not really for Dorothy to comment on how I run my career.’

But I have one, humble, question: why, in over five years of appearing in the media raising these grave worries, has Professor Greenfield of Oxford University never simply published the claims in an academic paper?

A scientist with enduring concerns about a serious widespread risk would normally set out their concerns clearly, to other scientists, in a scientific paper, and for one simple reason. Science has authority, not because of white coats or titles, but because of precision and transparency: you explain your theory, set out your evidence, and reference the studies that support your case. Other scientists can then read it, see if you’ve fairly represented the evidence, and decide whether the methods of the papers you’ve cited really do produce results that meaningfully support your hypothesis.

Perhaps there are gaps in our knowledge? Great. The phrase ‘more research is needed’ has famously been banned by the British Medical Journal, because it’s uninformative: a scientific paper is the place to clearly describe the gaps in our knowledge, and specify new experiments that might resolve these uncertainties.

But the value of a scientific publication goes beyond this simple benefit of all relevant information appearing, unambiguously, in one place. It’s also a way to communicate your ideas to your scientific peers, and invite them to express an informed view.

In this regard, I don’t mean peer review, the ‘least-worst’ system settled on for deciding whether a paper is worth publishing, where other academics decide if it’s accurate, novel, and so on. This is often represented as some kind of policing system for truth, but in reality some dreadful nonsense gets published, and mercifully so: shaky material of some small value can be published into the buyer-beware professional literature of academic science; then the academic readers of this literature, who are trained to critically appraise a scientific case, can make their own judgement.

And it is this second stage of review by your peers – after publication – that is so important in science. If there are flaws in your case, responses can be written, as letters to the academic journal, or even whole new papers. If there is merit in your work, then new ideas and research will be triggered. That is the real process of science.

