04 August 2011
Yesterday, the BBC was among many news outlets that were fooled by a press announcement from an organisation calling itself "Aptiquant", which claimed that people using the Internet Explorer web browser had lower IQs than those using other browsers. The announcement claimed the result was based on administering IQ tests online and then recording which browser each test participant used. It was all utter cobblers.
As the developer behind it confessed, it was just a prank he set up to raise awareness of the problems Internet Explorer causes web developers. In that sense, it's not a particularly well-conceived prank, because the story attacks the users rather than the browser's designers. Its logic is also shakier than it looks: I'd actually expect the average IQ of IE users to be lower than that of other browser users, but not because IE users are dim. IE is installed by default on most computers, while other browsers have to be installed by a reasonably clued-up user. There's not necessarily a link between IQ and computer literacy, but a mass-market browser that's installed by default is bound to draw a sample representative of the average web population, which by definition has an average IQ. I wouldn't be surprised if users of Safari, Chrome and so on had a higher average IQ between them, because they aren't representative of the general population, and are skewed towards those in more intellectually demanding jobs (IT, web design and so on).
Anyway, the BBC published a story announcing it was all a hoax. When putting together its story, the BBC consulted Professor David Spiegelhalter of Cambridge University's Statistical Laboratory, who said: "I believe these figures are implausibly low - and an insult to IE users." The implication of the BBC's follow-up story is that it did its job because it included his alternative view in the write-up.
But is that good enough? The BBC had two sources: an unknown company that issued a press release claiming IE users are dim, and a respected statistical scientist who called the figures "implausibly low". Instead of believing the scientist and spiking the story, the BBC believed the press release from a company it had never heard of. It turns out the company didn't even really exist.
So here are some questions journalists and bloggers should be asking themselves following this hoax:
- Does everyone have an equal right to coverage in my publication, or does the credibility of the source matter?
- Is it more important to publish the truth, or to publish a balanced report?
- What do I take responsibility for when I attach my byline to a story?
- Does my publication have to cover everything, or can I spike stories I don't believe are substantiated?
- Is this really news?