Although the title is slightly facetious, have you ever wondered what happens to all that Freedom of Information data that government bodies supply in response to a seemingly endless sea of requests? I sometimes do. It is all becoming clearer now, thanks to Francis Maude’s recent insistence that it should be in machine-readable format, and to a little piece in the Guardian. Not being a Guardian reader these days, I must thank en.europa-eu-audience for the heads-up on this one.
This tidy little piece, with lots of hints and tips, has given me a greater understanding of why newspaper reports, such as the one on web costs, can appear so poor:
- A less than 100% response rate to the FoI request
- Inconsistency in the data supplied
- Lack of clarity in the FoI request itself
- Misinterpretation of the data supplied
In academia one is challenged if either the data or the analysis is not robust enough; journalists, however, have always been prone to drawing conclusions from data of dubious origin via dubious analysis. The frightening thing now is that the graphical tools are so easy to use!
It is one thing for Eric Pickles, Minister at the DCLG, to demand the data for local ‘armchair auditors’; it is quite another when ‘armchair’ journalists add two and two and get twenty-two.
Great tools, but please make sure the rigour is there before publishing…
In fact I’m not alone in thinking open data presents its own issues; Webmonkey has recently noted this too. However, of the four solutions it proposes, I don’t know where two of them, universal broadband and training, will come from. On that basis, the promoting and formatting of the data are the least of our worries.