Good progress

May 6, 2012

It has recently fallen to UK Member of Parliament Michael Dugher to try to determine the state of the G-Cloud and Greening Government IT Strategy. In a list of questions and (sort of) answers published in Hansard that will have amused journalists by their vacuity, the Minister for the Cabinet Office, Francis Maude, effectively responds that all will be revealed in the near future in the annual reports. I do know that the Green IT Strategy was in preparation when I was last in conference with the Green Development Unit in March 2012, but the bigger wait is for the ICT Strategy annual report itself.

The major revelation from the questioning was that, at a cost of £4.93 million, the G-Cloud is expected to save an estimated £340 million, which is amazing! I wonder if this saving includes that from the Public Sector Network (PSN) or is purely from the cloud? And over what period will that saving be made – five years, ten years, twenty years?

However, when Mr Dugher asked Mr Maude about the number of data centres government maintained in May 2010, March 2011, September 2011 and March 2012, all Mr Maude could say was that “In February 2012, Cabinet Office collected baseline information on the number of data centres maintained by Departments in order to progress commitments to consolidate and rationalise data centres to help save energy and costs in line with Government ICT Strategy. This information will be published alongside ICT Strategy annual update report, due shortly. Information on the number of data centres across Government prior to this February 2012 is not available.” Yet back in May 2011 some figures were provided by the Cabinet Office to the Public Administration Committee, in a written answer dated 30 March 2011, stating “A survey commissioned by CIO Council during June 2010 identified 220 Data Centres across Central Government”. I suspect that was an underestimate, since I clearly remember someone, possibly Andrew Stott, quoting a figure nearer 400 to the Local CIO Council a couple of years ago.
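To put those two figures side by side, here is a rough back-of-the-envelope sketch. The £4.93 million cost and £340 million saving come from the written answer; the five-, ten- and twenty-year periods are purely my hypothetical scenarios, since no period was given.

```python
# Rough arithmetic on the figures quoted in the written answer.
cost_m = 4.93      # programme cost, £ million (from the written answer)
saving_m = 340.0   # estimated saving, £ million (from the written answer)

net_m = saving_m - cost_m
roi = saving_m / cost_m
print(f"Net saving: £{net_m:.2f}m, roughly {roi:.0f}x the programme cost")

# The answer gives no period, so the annual figure could be any of these
# (hypothetical periods, not from the source):
for years in (5, 10, 20):
    print(f"Over {years} years: £{saving_m / years:.0f}m per year")
```

Even on the most generous reading the claimed return dwarfs the cost, which is exactly why the missing timeframe matters.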

When Michael Dugher asks the Minister for the Cabinet Office what progress he has made on the implementation of G-Cloud computing, the response is a resounding “The G-Cloud programme is making good progress”. I’m sure you’ll all be pleased to know…

Channel choice

April 17, 2012

A recent paper in Government Information Quarterly 29 (2012) by Christopher C. Reddick & Michael Turner is relevant to the UK debate. The paper is entitled “Channel choice and public service delivery in Canada: Comparing e-government to traditional service delivery” and it looks at some of the excellent work done in recording citizen satisfaction and other metrics in a range of Canadian jurisdictions. I’m a little confused by the definition of e-government, since they state on page 9: “Through a survey of citizens across Canada there was evidence that e-government has really taken hold as the dominant contact channel, with 55% of Canadian residents surveyed used the Web or email to contact government for a service or information, which rivals the phone at 51%”. The inclusion of email confuses matters, since email is little better than quick ‘white mail’. However, the paper then goes on to state that “the data indicates that citizens actually received the most satisfaction by receiving a service or information in a government office”, which is probably the same in the UK.

Interestingly, it then goes on to state “There appears to be a digital divide in access to e-government in Canada and it is centered on age and gender, but its cause may not be attributable to simply differences in access. The digital divide can be mitigated if there is greater citizen satisfaction with e-government”, which I can’t disagree with, although the divide in gender terms is nominally marginal in the UK. A further conclusion is that “governments should realize that citizens use many contact channels, and often several in a single interaction or transaction with government, with some of them being better suited for certain tasks than others. However, governments should realize that citizens receive less satisfaction with the phone [and that] they must find better ways to integrate contact channels as one method to move e-government forward, ensuring that the information received through use of different channels is consistent and service responses are of equivalent quality. Then, where citizens have multiple choices to contact government, they can use the channel that best suits their needs”.

Once all the channels are being measured for satisfaction and re-tuned as a result, there will be, as stated, “a positive view of all contact channels [which] leads to a positive overall view of public services, so governments will need to continue focusing on service channel improvement to improve overall views of public service” – the very model I have been promoting for some years. However, as a warning to some of those pre-occupied with benchmarking services, the report concludes “collecting aggregate survey data is limited because of its inability to discern nuances in the data which can better be teased out with more direct methods of observing citizen behavior”, so be warned!
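Measuring every channel on the same satisfaction scale is the precondition for that re-tuning. A minimal sketch of the comparison, using entirely invented survey scores (the channels and 1–5 ratings below are illustrative, not from the paper):

```python
from statistics import mean

# Hypothetical satisfaction responses (1-5) per contact channel,
# invented purely to illustrate comparing channels on a common measure.
responses = {
    "office": [5, 4, 5, 4],
    "phone":  [3, 2, 4, 3],
    "web":    [4, 4, 3, 5],
    "email":  [3, 3, 2, 4],
}

# Rank channels by mean satisfaction so under-performers stand out.
for channel, scores in sorted(responses.items(), key=lambda kv: -mean(kv[1])):
    print(f"{channel:6s} mean satisfaction: {mean(scores):.2f}")
```

With real survey data in place of the toy numbers, a table like this is what lets a council see which channel to re-tune next – though, as the paper warns, aggregate figures alone won’t explain *why* a channel scores poorly.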


Top management team

April 10, 2012

In the wake of the Socitm Better Connected 2012 review and reports, a further report has been published aimed at the management of UK local authorities. Better Connected 2012: a briefing for the top management team picks up on some of the results of the annual study, along with the opinions of those involved. It’s only 16 pages, so the £50 price tag is a little steep unless you are a subscriber. The author(s) promote what they describe as eight ‘simple, clear points which can act as guiding principles’. Unfortunately, number eight is ‘we want public services that are more transparent’, which isn’t at all clear to me – is it the policies, data or management that needs to be ‘transparent’? The other seven are equally ‘simple’.

The service picked on and discussed around mystery shopping is public libraries – possibly one of the more difficult services to manage in these turbulent times, with high asset values, regular revenue costs and an unpredictable market. If the library service concerned has an old software application, it is highly unlikely to get a shiny new, all-singing one in the current climate – instead it is likely to be compressed and expected to do more with less. Ultimately it may be said that going online with the latest applications and encouraging self-service will cut a few librarian posts, but it’s a fine line in the costings.

I heartily agree with the statement on the eighth page that “council leaders and managers must accept that the main purpose of the website is to deliver services”, but current policy dictates that it isn’t necessarily the council that delivers services now, and the private and third sectors have their own opinions as to what their route is once they’ve taken on services – and it isn’t necessarily transparency or ease of customer contact. Similarly, the twelfth page argues for lots of user testing, which I totally agree with, but third-party application interfaces aren’t easily or affordably tweaked once they are in place.

Unfortunately, for all the good intentions, the authors are too far detached from the reality of delivering services in the current climate, and whilst there is much good advice, the attitude is likely to pi** off more council web managers than it will educate.


The Inbox

February 24, 2012

In terms of electronic government, email is normally the poor, ignored, unmeasured relation. However, when a citizen sends an email – for example, if there’s no form on the website and only a generic, or even very generic, email address – they are still expecting a response, and the fact that they know from experience that email delivery is near-instantaneous means they expect a pretty quick one. This has always been the case, but the reality is largely ignored. Some local authorities ping back a response that the email has been received and that a reply will be forthcoming in a number of days – some even do this for personally addressed mails.

The other area for concern is whether government services count the number of emails received and analyse them in any way. Do we know if particular services get more than expected? Perhaps somebody has unhelpful information on a website, requiring users to ask questions every time? Perhaps a form that could route the enquiry to the right service isn’t available? How do we know any of this whilst hundreds or thousands of emails are manually forwarded, uncounted and unanalysed? And how do we determine the response expected – if we send a standard seven-day auto-answer and the sender has placed a ‘High Importance’ flag on the mail, how will they feel?
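The counting and flagging described above needs very little machinery to get started. A minimal sketch – the message fields and service addresses below are invented for illustration, not drawn from any real authority’s systems:

```python
from collections import Counter

# Illustrative inbox: fields and addresses are invented to sketch
# the kind of counting and analysis discussed above.
inbox = [
    {"to": "planning@example.gov.uk", "subject": "Extension query", "importance": "high"},
    {"to": "planning@example.gov.uk", "subject": "Appeal deadline?", "importance": "normal"},
    {"to": "refuse@example.gov.uk",   "subject": "Missed collection", "importance": "normal"},
]

# Count mails per service address - a first step towards spotting
# services that attract more enquiries than expected.
per_service = Counter(msg["to"] for msg in inbox)
for address, count in per_service.most_common():
    print(f"{address}: {count}")

# Flag high-importance mails so a standard seven-day auto-answer
# isn't the only response the sender ever sees.
urgent = [m for m in inbox if m["importance"] == "high"]
print(f"{len(urgent)} mail(s) flagged high importance")
```

Even this crude tally would tell a web manager which services generate the questions a missing form or a clearer web page could have answered.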

An interesting piece of work, complete with infographics, has been published on Codeworks Connect entitled ‘Interesting Insights into Email Infographic’, which helps set the scene on user behaviour around email. Whilst I’m not a believer in the adage that we can’t manage what we can’t measure, I do believe that in order to deliver channel shift and improve service delivery we need to know what is happening on all channels – leaving poor relations like email or telephony out in the cold isn’t good enough.


Evaluating citizen participation

February 7, 2012

One of the major difficulties accepted in the discussions around citizen participation was how to measure it. This was presented more recently in the post ‘Participating in a Democracy’. Whilst fully referenced, a new paper from the IBM Center for The Business of Government probably owes a great deal to the late Sherry Arnstein’s work on the Ladder of Citizen Participation.

The paper, entitled ‘A Manager’s Guide to Evaluating Citizen Participation’ (56 pages, 2.6Mb) and written by Tina Nabatchi of the Maxwell School of Citizenship and Public Affairs, focuses on a modified version of the Ladder of Participation that was published in 2007 as the ‘IAP2 Spectrum of Public Participation’. The paper clearly identifies that there are no easy routes to evaluation and that the methods outlined require time and effort to fulfil, and although there are mentions of using new media to consult, there are no solutions for measuring with them. In fact, the one clear link to anything electronic from the White House website is to the proclamation that protects personal data.

However, as I’ve stated, the big thing is to get participation right, then e-participation will come naturally (with trust), so this is a good start.