March 4, 2012
Two different but interesting pieces in the Computer Weekly of 28 February 2012 on the topic of ‘open’. The first is an opinion piece by Tony Roberts entitled ‘The problem with open data’ and the second by Mark Ballard on open standards. Tony Roberts states that open data as currently practised is likely to increase the digital divide and that what is wanted is actionable open data, along with training on how to use it. Few would argue that the exercise in openness to date has had much impact on the average citizen.
Mark Ballard examines the cleft stick facing the government, which has proposed that software using open standards should be preferred when procuring systems. One would have thought this would be quite an easy path to follow, but the government has even been threatened with expulsion from the International Organization for Standardization (ISO), which thought its version of ‘proprietary’ was in jeopardy. The government has at least one Member with a vested interest in trade protection, and thus not entirely sold on open standards. Whilst the Coalition has launched a consultation to define open standards, there exists a body, OASIS, that I’ve already blogged about, which has probably already done that (for the web and cloud, anyway) but doesn’t get a mention in the article.
If, as reported, 70% of all software licences bought by the UK government are for Oracle, something certainly needs to be done. One can move to LibreOffice rather than OpenOffice to get away from Oracle on the desktop, but what does one do in the database market? Even in the local government market Oracle rules the roost, and as well as charging a lot, it constrains the use of virtualisation platforms other than its own through its extortionate licensing model. Oracle does claim an interest in open standards with MySQL and other open source products, but developers need to be weaned off the costly commercial stuff.
February 22, 2011
In government IT one gets used to the fact that you can’t win. In my view the answer to the Balkanisation of government IT is more efficient sharing of applications, better procurement of applications and better sharing of data (if and when required): in other words, the G-Cloud. The Cabinet Office in the UK has now released the documents envisioning G-Cloud Programme Phase 2.
Unfortunately, some people see this as another way of sharing citizen data across government and a mechanism for circumventing the now dumped ID card project! The latest portrayal of this view is in Computer Weekly.
In contrast to this view I’d argue that if we are to reduce the number of government data centres, reduce the cost of data connections, cut the charges paid to a shrinking band of suppliers (who have local government by the short & curlies) and reduce support costs, we have to look to another way of delivering IT services. G-Cloud can be the only way.
G-Cloud gives government the opportunity to dictate standards, quality and support to a level that the current regime of ‘divide and rule’ by suppliers has never permitted. It starts to give government IT the upper hand for once, so I can see why some suppliers won’t like it, but as to willy-nilly data-sharing – I don’t think so!
December 28, 2010
Philip Virgo’s excellent blog post in Computer Weekly of 21 December 2010 about how to suitably design the new Universal Credit system deserves reading by all who might influence the process of delivering such a service.
Whilst many of the comments should be injected into the PASC call for written evidence, Philip’s penultimate paragraph is suitably ‘systems thinking’: “To really help those trying to help better themselves, we require systems that assume chaos and unpredictability. That will entail giving front-line staff responsibility for holistic support and the ability and authority to over-ride the “system”.”
A suitable ‘New Year’s revolution’ in government thinking…
October 25, 2010
Mark Ballard, blogging on ComputerWeekly.com following the Comprehensive Spending Review, pronounces it the death-knell of transformational and e-government, along with a comparison with the Blair Modernising Government programme and all it failed to deliver. In many ways I tend to agree, and have blogged about the programme’s demise here before.
However, if project management has taught me one thing, it’s the need for a post-implementation review, and I would hope for an overall one to assess the programme. When did this occur? I’m afraid in the world of politically inspired initiatives they never happen: Ministers move on, people move on and the game continues, for as Ballard notes, “after all the Conservative hoopla about an end to Soviet-era IT projects, the Chancellor promised £2bn for the DWP to create a system of Universal Credit“. Has anybody ever properly assessed the difference between “rates”, “community charge (poll tax)” and “council tax”, and whether the billions spent on them made life any better? Or, similarly, the benefits systems that have to be applied to compensate for those who can’t pay?
Much of this type of bureaucracy revolves around what Paul Henman labelled the “New Conditionality”, and whilst the technically challenged politicians may not recognise it, they are exploiting technology to the extreme to deliver their policies, which are so complex that the systems are unlikely ever to work without massive human intervention and great cost!
Whilst “New Labour”, with its “Modernising Government” and e-government programmes, largely carried on from its predecessors in control, this time political will has overridden any rationality. Savings will be made, money will be wasted and thousands, in the wrong place at the wrong time, will lose their jobs.
Don’t forget to vote for the Great E-mancipator
January 4, 2010
Tucked away amongst the Christmas holiday reading was a post on his blog in Computer Weekly by Philip Virgo entitled “The case for e-Government values your time at zero.” Philip, of course, has something of a connection with electronic government, having been Secretary General of EURIM (the Parliamentary/industry Information Society alliance) since 1996, so should be worth listening to. It’s also not the first time he’s been mentioned here, about “Why e-government fails“!
Contrary to what the title suggests, this is less an indictment of e-government for not delivering than a critique of the actual use of technology and the many who are excluded for one reason or another. For example, the second paragraph starts with “Most ICT surveys count “users” of a product or service as those who have used it at least once. They consequently delude themselves and their marketing departments with claims of market size and share.” This is a common failing of many of the rationales for e-government expansion.
In the penultimate paragraph he also reminds us that: “The growth in the number of elderly, with a consequent growth in numbers with impaired eyesight and/or hearing, calls into question the growing reliance on screen and keyboard or call centre for contact between those in need of service in the inner cities, suburbs and rural areas and those delivering it to them.” I read this as a rejoinder to preserve, facilitate or develop quality mediated services whenever electronic government is considered.
Philip states that he intends to blog his submission to the “Ideal Government challenge” shortly, and encourages us to bear his comments in mind if we do so.
September 20, 2009
Computer Weekly of 15 September 2009 includes a piece by Ian Grant on Dr David Osimo’s presentation to the European Network and Information Security Agency summer school, under the title “E-government success depends on external expertise”.
Coincidentally, Dr David Osimo is a managing partner at tech4i2, a consultancy founded by my friend Professor Paul Foley, formerly of De Montfort University, which examines a range of practical issues around electronic government, so I was interested to read it – especially as I was attending a meeting with members of the Local CIO Council at Sunningdale on the subject of the Public Sector Network (PSN) at the time.
Osimo points out that ICT has not fundamentally changed government in Europe, with 50% of services fully interactive and only 9.3% of citizens using them. He sees the answer in Web 2.0 solutions delivered by people outside of government, my favourite of his examples being Patient Opinion.
He then goes on to propose a model for Tao government, with which I have no argument, but rather than being anything “techie”, this is a change to democracy and government as we’ve known them and, without a revolution, I don’t see it becoming much more than a facade that citizens will soon tire of.
September 16, 2009
A recent Computer Weekly (8 – 14 September 2009) contains a piece entitled “Hard Times for local government IT”, written by Dr Simon Moores, a Conservative district councillor and former advisor to Tony Blair! Strangely, it’s largely the content of an earlier posting from his blog. I tend to agree with his conclusions about the state we are likely to find ourselves in, but as one of the advisors behind the e-government race, I think he should consider his own role in bringing us to the current situation.
The rush to 100% targets with little process improvement brought us to a place where, in order to share services, we are trying to rationalise a vast range of systems without any standard architecture. He bemoans his own council’s situation, being on GroupWise and “fat” desktops – I moved my own towards “thin” some six years ago against some resistance and avoided Novell at the outset; however, many neighbours still use Novell and still employ “fat” desktops, which can limit some of the “quick wins”.
Many authorities, and central government too, are forced down the Microsoft path by interfaces and the need to join up; open source won’t make things easier, and if anything it will possibly make them harder.
IT is just the glue of service delivery, and e-government is just a group of channels for delivering information and services. What is needed are standards for applications, to enable them to be shared across boundaries.
Will a change in government bring that?