David Cameron misreads statistics again


David Cameron: the mask slips


No way. Just… no way. There are many things I’ve been meaning to blog about for ages; things that are not David Cameron. But do I ever catch a break?

Turns out Sir Andrew Dilnot, chair of the UK Statistics Authority, has issued a letter rebutting the Prime Minister’s claims yet again. This time, it is over his assertion, in his Conservative Party conference speech, that Britain “is paying down its debts”.


Thank you to Dr. Eoin Clarke for sharing!

On leading/ loaded questions and response bias (or: David Cameron wants to know my views on immigration)


I swear, I started this blog to write about the practice of research, to explore how we can sociologically understand the world, and to promote the latest interesting studies. NOT to pick on David Cameron.

Yet, just a week after writing about Voodoo Polls, while I was peacefully checking my Facebook…

Blimey, so David Cameron wants me to click on his survey and tell him how I feel about immigration.

Now, kids, what was I saying last time?


On “voodoo polls” and why we shouldn’t ever use them


So, after writing about David Cameron’s shoddy use of statistics in the Telegraph, and then about the margin of error in one YouGov poll that might just have changed the fate of the referendum and what happens after it, I found myself thinking a lot about how official statistics and opinion polls are used and reported in the media, and about the pitfalls that lie there.
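For readers who have not met it, the margin of error mentioned above comes from a standard formula for simple random samples. A minimal sketch, with purely illustrative figures (the numbers below are not from the YouGov poll itself):

```python
import math

def margin_of_error(p: float, n: int) -> float:
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return 1.96 * math.sqrt(p * (1 - p) / n)

# A hypothetical poll of 1,000 respondents reporting 51% support:
moe = margin_of_error(0.51, 1000)
print(f"51% +/- {moe:.1%}")  # roughly +/- 3 percentage points
```

With samples of around a thousand, that plus-or-minus three points is wider than many of the leads that get breathlessly reported as news.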

Today, for instance, I want to talk about the “voodoo poll”, so called because it is about as scientific as voodoo (and presumably because, for the serious researcher, seeing one reported as a serious poll in the media feels like a stab in the heart from a distance).


A “voodoo poll”, or open-access poll, is one in which a non-probability sample of participants self-selects into taking part.

In human language: sampling is the use of a subset of the population to represent the whole population. In probability sampling (random sampling), we can calculate the probability of obtaining any particular sample, and therefore we can rigorously infer from the sample to the general population. In non-probability sampling, we cannot, and therefore we need to use such samples with great care.
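To make the difference concrete, here is a minimal simulation (all numbers invented for illustration): a probability sample recovers the true population share quite well, while a self-selected “voodoo” sample, in which people holding a strong view are far more likely to respond, wildly overstates it.

```python
import random

random.seed(0)

# Hypothetical population of 100,000 people: 30% hold opinion X.
population = [True] * 30_000 + [False] * 70_000
random.shuffle(population)

# Probability sample: every member has a known, equal chance of selection.
random_sample = random.sample(population, 1_000)
random_est = sum(random_sample) / len(random_sample)

# Voodoo poll: holders of opinion X are (say) five times more likely
# to bother clicking the poll -- participation is self-selected.
voodoo_sample = [p for p in population
                 if random.random() < (0.05 if p else 0.01)]
voodoo_est = sum(voodoo_sample) / len(voodoo_sample)

print(f"true share: 0.30, random sample: {random_est:.2f}, "
      f"voodoo poll: {voodoo_est:.2f}")
```

With these invented response rates, the voodoo poll suggests a large majority holds opinion X when in fact less than a third does; and crucially, because the selection mechanism is unknown in a real open-access poll, no correction or margin of error can rescue it.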


Headlines in social sciences: visual methods and research PR


Gillian Rose writes in the Sociological Review about the relation between ‘visual research methods’ and contemporary visual culture: “One of the most striking developments across the social sciences in the past decade has been the growth of research methods using visual materials. It is often suggested that this growth is somehow related to the increasing importance of visual images in contemporary social and cultural practice. However, the form of the relationship between ‘visual research methods’ and ‘contemporary visual culture’ has not yet been interrogated.” Read her article here.

In the meantime, on the LSE’s Impact Blog, Alasdair Taylor warns that “more science reporting is being done through press releases, many of which tend to exaggerate original research”. “Churnalism”, the practice of reporting press releases or wire copy verbatim as news stories, is becoming increasingly common, and science journalism is no exception. Consequently, “as the field of science journalism has contracted, the science PR industry has grown to fill the vacuum”.

The problem is that, as Dr. Andrew Williams, a lecturer at Cardiff University’s School of Journalism, Media and Cultural Studies, shows in a recent study, “a sizeable proportion of university press releases (30-40%) exaggerated or hyped the research findings or made them more determinist. They also added causal reasons for correlations, made extrapolations from animal research into humans and added other inferences not present in the original publication.”

The recent case of David Cameron presenting misleading/improperly read statistics in a Daily Telegraph article only highlights that this is a discussion we need to be having: while we have access to more information than ever, our attention spans are, if anything, shorter. Images and statistics are both powerful tools for shaping our perception/understanding of the world; but as we attempt to digest as much information as possible in simplified, quick-and-easy form, the picture we get may be (deliberately or not) distorted.

What can official employment statistics tell us? UK Statistics Authority Chair rebukes David Cameron


So, David Cameron claimed in a Daily Telegraph article that “while most new jobs used to go to foreign workers, in the past year more than three quarters have gone to British workers”, presumably due to the Coalition’s immigration policy.

Except no, not really. All that the figures from the Office for National Statistics (on which Cameron is basing his claims) tell us is that UK nationals made up 76% of the increase in the number of people in work last year. Sir Andrew Dilnot, chair of the UK Statistics Authority, offered a detailed rebuttal of the PM’s use of data, which I believe anyone using official statistics to make a public policy point would benefit from reading.
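The distinction is easy to miss, so here is the arithmetic with invented round numbers (not the actual ONS figures): a share of the net increase in employment says nothing about who filled each new job, because the net change hides all the gross hiring and leaving underneath it.

```python
# Hypothetical illustration (numbers invented): the ONS figure concerns
# the NET change in employment, not the nationality of each NEW hire.
uk_start, uk_end = 26_000_000, 26_380_000          # UK nationals in work
non_uk_start, non_uk_end = 4_000_000, 4_120_000    # non-UK nationals in work

uk_change = uk_end - uk_start              # +380,000 net
non_uk_change = non_uk_end - non_uk_start  # +120,000 net
total_change = uk_change + non_uk_change   # +500,000 net

uk_share_of_increase = uk_change / total_change
print(f"UK nationals' share of the net increase: {uk_share_of_increase:.0%}")

# Millions of individual jobs may have been filled and vacated over the
# same year, in any mix of nationalities; none of that churn is visible here.
```

In other words, “76% of the net increase went to UK nationals” is consistent with almost any share of new hires going to foreign workers, which is precisely why Dilnot objected to the “new jobs” framing.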

So, let’s see:
