Local Government Digital Service

November 18, 2012

In September 2012 I wrote about the Local Government Data Service, but since then we’ve seen the publication of the central government’s Government Digital Strategy, and yet again questions have been asked about why local government hasn’t got one, or why it doesn’t get a mention. My riposte is that local government was doing this before the GDS, and it was largely set out in the Socitm publication Planting the Flag. Meanwhile Socitm has published a briefing entitled “The new Government Digital Strategy: what should local public services take from it?”

Whilst the Socitm briefing is largely a promotion for its website take-up and channel benchmarking services, all that any local authority really needs to do is actively gather feedback from its service users about the different channels on offer and use it to improve them. If that enables a shift to channels that are genuinely cheaper to deliver, by web or telephone, all well and good. I am, of course, ignoring the ‘digital by default’ diktat within the central strategy. In national terms this means sharing best practice amongst local authorities and a lot of cooperation from suppliers in helping to improve delivery, not just raking in short-term profits. This is where open source and open data come in – if commercial applications can be cross-fertilised with others, and the data can be similarly exposed (securely) across applications, the benefits to both councils and citizens will soon become general.
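To make that feedback-gathering concrete, here is a minimal, illustrative sketch of pulling satisfaction scores and unit costs together per channel. Every figure in it – the channel names, costs and scores – is invented for the example, not drawn from Socitm or any council.

```python
from collections import defaultdict

# Illustrative only: channels, unit costs and satisfaction scores are invented.
# Each record is (channel, satisfaction 1-5) gathered from service users.
feedback = [
    ("web", 4), ("web", 2), ("phone", 5), ("phone", 4),
    ("face-to-face", 5), ("web", 3), ("phone", 3),
]

# Assumed cost per transaction for each channel (hypothetical figures).
unit_cost = {"web": 0.15, "phone": 2.80, "face-to-face": 8.60}

scores = defaultdict(list)
for channel, score in feedback:
    scores[channel].append(score)

for channel, ratings in sorted(scores.items()):
    avg = sum(ratings) / len(ratings)
    print(f"{channel:13s} avg satisfaction {avg:.1f}  unit cost £{unit_cost[channel]:.2f}")
```

The point of lining the two measures up is simply that a cheap channel with poor satisfaction is not a saving worth chasing until the channel itself is improved.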

Whilst the Cabinet Office report admits that “most public services are provided by local organisations such as local councils and the NHS”, instead of ignoring local government and starving it of resources, central government needs to cooperate properly and assist in making these changes real. So whilst I congratulate the GDS on producing its strategy, I will be watching whether it gets the rest of central government to cooperate, and whether it actually cooperates with those areas where “most public services are provided”. I’d also appreciate it if there were fewer questions about why local government isn’t doing the GDS thing, and a greater appreciation of the fact that it was there first, just with much less of a marketing team…


Policing

September 2, 2012

Policing is a public service that doesn’t often get viewed as a system, or as a system of parts, in the same way that health or government are. That was until Simon Guilfoyle, John Seddon and others looked at it. Simon is a serving officer with an interest in systems thinking, and I had the pleasure of seeing a presentation by him earlier this year at a NET2 meeting. Following the meeting he kindly forwarded me a recent paper he’d had published, entitled “On target? – Public Sector Performance Management: Recurrent Themes, Consequences and Questions”, Policing (2012).

As the paper’s title implies, it puts policing performance management into the same context as the rest of the public sector, with all the bad practices that are frequently pointed out there. In line with the theme of this blog there is the notion that public satisfaction rates are a potential indicator, although some refining may be required to gain understanding in context, i.e. cold feedback won’t do on its own. The paper also warns of the likely effect of gaming when employing emotive targets, something Simon went into in some detail during his presentation.

In this context John Seddon is just starting his “The Evidence Tour” and launching the second volume of “Delivering Public Services That Work”. The presentations are free and I recommend those involved in public services give him a listen and ask questions. With the introduction of elected police commissioners later this year, the whole matter of police performance targets is likely to take on added weight as pre- and post-election gaming occurs.


Generation Y e-government

August 22, 2012

In a recent article on IT Use for Australian Business it is revealed that the Department of Broadband, Communications and the Digital Economy (DBCDE) deputy secretary Abul Rizvi had identified a “worrying” drop in the use of online government services between 2009 and 2011, and that the department was investigating the use of video to deal with Generation Y. The drop in usage is identified in a report from AGIMO that is linked to (PDF, 96 pp, 3.24 Mb), but in my view this is not a surprise, since the same experience was reported from Canada some while ago, and is probably just the fall-back from the initial surge of people trying new technologies and finding the experience less than ideal.

Rather than throwing money at new technologies to resolve the issues around service, the solution is to examine the process, online and offline, find what the problems are from the citizens’ point of view, and then sort out that process – whether the cause is legislation (overly complex) or waste.


Can channel shift be forecast?

June 28, 2012

Goss Interactive are now offering a Channel Shift Return On Investment Calculator, apparently developed in conjunction with Plymouth University. Whilst I admire Goss’ marketing efforts in these mean times, I would suggest that any such calculator is little more than a wet finger in the air to determine the wind direction. Of course one can insert the numbers of face-to-face and telephone transactions into a spreadsheet, crank the handle and be presented with what ‘in theory’ would be saved in human resources if the same transactions were done online – assuming, amongst other things, that (a rough sketch of the arithmetic follows the list):

  • All back office applications are interfaced with the web applications in a bi-directional manner (and that the capital and revenue costs of those interfaces are built into the same spreadsheet).
  • The same spreadsheet also allows for the lack of take-up by the digitally unentitled (those without electronic access, or not wishing to use it), who will still telephone or visit to ask questions.
  • The ongoing staff costs of maintaining the website, back office systems and so on are included, under continuing government changes, new legislation or other factors yet to emerge.
  • The peak proportion of transactions that the public are willing to undertake online without human intervention is known (estimated at around 30% from Canadian experience).
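To show just how crude such a calculation is, here is a minimal sketch of the spreadsheet logic in code. Every figure in it – transaction volumes, unit costs, the 30% online take-up ceiling, interface and maintenance costs – is an invented, illustrative assumption, not a Goss or Plymouth University figure.

```python
# Illustrative channel-shift "ROI" sketch; all numbers are invented assumptions.
transactions = {"face-to-face": 40_000, "phone": 120_000}       # annual volumes
unit_cost = {"face-to-face": 8.60, "phone": 2.80, "web": 0.15}  # £ per transaction

online_takeup = 0.30         # assumed ceiling on willingness to self-serve online
interface_capital = 150_000  # one-off cost of bi-directional back office interfaces
annual_maintenance = 40_000  # ongoing web/back office upkeep

shifted = {ch: vol * online_takeup for ch, vol in transactions.items()}
gross_saving = sum(vol * (unit_cost[ch] - unit_cost["web"]) for ch, vol in shifted.items())
first_year_net = gross_saving - interface_capital - annual_maintenance
ongoing_net = gross_saving - annual_maintenance

print(f"Gross annual saving:   £{gross_saving:,.0f}")
print(f"Net saving, year one:  £{first_year_net:,.0f}")
print(f"Net saving thereafter: £{ongoing_net:,.0f}")
```

Change the assumed take-up from 30% to 20% and the first-year ‘saving’ disappears entirely, which is rather the point: the output is only as good as the guesses fed in.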

However, one could deny the citizen any face-to-face or telephone access to make the savings, as has been done in the private sector. But look what happens when the computer plays up – NatWest, RBS and Ulster Bank all opening up for extra hours.

Are all the costs of going digital truly in there?


Six stage digital engagement

May 27, 2012

Thanks to GovTech for pointing me to CivicPlus’s attempt to sell web services to government by telling them there are six stages to engagement. It’s actually a US company, so the questionnaire involved is focused on the needs of US citizens, but even so it is quite amusing in its assumptions. I thought I’d complete it as a citizen (one of the choices), and after a few minutes I had done it! If only life were that easy…

CivicPlus label the six stages static, emerging, active, receptive, participatory and fully-engaged, and I state again: if only matters were that simple…


Good progress

May 6, 2012

It has recently been down to UK Member of Parliament Michael Dugher to try and determine the state of the G-Cloud and the Greening Government IT Strategy. In a list of questions and (sort of) answers published in Hansard that will have amused journalists by their vacuity, the Minister for the Cabinet Office, Francis Maude, effectively responds by saying that all will be revealed in the near future in the annual reports. I do know that the Green IT Strategy was in preparation when I was last in conference with the Green Development Unit in March 2012, but the bigger wait is for the ICT Strategy annual report itself.

The major revelation from the questioning was that, at a cost of £4.93 million, the G-Cloud is expected to save an estimated £340 million, which is amazing! I wonder whether this saving includes that from the Public Sector Network (PSN), or is it purely from the cloud? And over what period will that saving be made – five years, ten years, twenty years? However, when Mr Dugher asked Mr Maude about the number of data centres government maintained in May 2010, March 2011, September 2011 and March 2012, all Mr Maude could say was that “In February 2012, Cabinet Office collected baseline information on the number of data centres maintained by Departments in order to progress commitments to consolidate and rationalise data centres to help save energy and costs in line with Government ICT Strategy. This information will be published alongside ICT Strategy annual update report, due shortly. Information on the number of data centres across Government prior to this February 2012 is not available.”

However, back in May 2011 some figures were provided by the Cabinet Office to the Public Administration Committee, dated 30 March 2011, in a written answer stating “A survey commissioned by CIO Council during June 2010 identified 220 Data Centres across Central Government”, which I suspect was an underestimate, since I clearly remember someone, possibly Andrew Stott, quoting a figure nearer 400 to the Local CIO Council a couple of years ago.
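As a back-of-the-envelope illustration of why the period matters, here is a trivial sketch. The periods are my own guesses, and it naively assumes the quoted cost and saving figures are directly comparable and evenly spread.

```python
# Naive illustration: spread the quoted £340m saving over assumed periods.
cost_m = 4.93      # quoted G-Cloud cost, £ million
saving_m = 340.0   # quoted estimated saving, £ million

for years in (5, 10, 20):  # assumed periods; the written answers don't say
    per_year = saving_m / years
    print(f"Over {years:2d} years: ~£{per_year:.0f}m a year, "
          f"{saving_m / cost_m:.0f}x the quoted cost overall")
```

Whatever the period, the claim is a saving of roughly sixty-nine times the stated cost, which rather demands the workings be published alongside it.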

When Michael Dugher asks the Minister for the Cabinet Office what progress he has made on the implementation of G-Cloud computing, the response is a resounding “The G-Cloud programme is making good progress”. I’m sure you’ll all be pleased to know…


Channel choice

April 17, 2012

A recent paper in Government Information Quarterly 29 (2012) by Christopher G. Reddick & Michael Turner is appropriate to the UK debate. The paper is entitled “Channel choice and public service delivery in Canada: Comparing e-government to traditional service delivery” and it looks at some of the excellent work done in recording citizen satisfaction and other metrics in a range of Canadian jurisdictions. I’m a little confused by the definition of e-government, since they state on page 9 “Through a survey of citizens across Canada there was evidence that e-government has really taken hold as the dominant contact channel, with 55% of Canadian residents surveyed used the Web or email to contact government for a service or information, which rivals the phone at 51%” – the inclusion of email, which is little better than quick ‘white mail’, muddies the definition. However, the paper then goes on to state that “the data indicates that citizens actually received the most satisfaction by receiving a service or information in a government office”, which is probably the same in the UK.

Interestingly, it also states “There appears to be a digital divide in access to e-government in Canada and it is centered on age and gender, but its cause may not be attributable to simply differences in access. The digital divide can be mitigated if there is greater citizen satisfaction with e-government”, which I can’t disagree with, although the divide in gender terms is marginal in the UK. A further conclusion is that “governments should realize that citizens use many contact channels, and often several in a single interaction or transaction with government, with some of them being better suited for certain tasks than others. However, governments should realize that citizens receive less satisfaction with the phone [and that] they must find better ways to integrate contact channels as one method to move e-government forward, ensuring that the information received through use of different channels is consistent and service responses are of equivalent quality. Then, where citizens have multiple choices to contact government, they can use the channel that best suits their needs”.

Once all the channels are being measured for satisfaction and re-tuned as a result, there will be, as stated, “a positive view of all contact channels [which] leads to a positive overall view of public services, so governments will need to continue focusing on service channel improvement to improve overall views of public service” – the very model I have been promoting for some years. However, as a warning to some of those preoccupied with benchmarking services, the paper concludes that “collecting aggregate survey data is limited because of its inability to discern nuances in the data which can better be teased out with more direct methods of observing citizen behavior”, so be warned!
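As a tiny illustration of that last point, with entirely invented figures, an aggregate satisfaction score can look healthy while hiding a group for whom the channel is failing:

```python
# Invented figures: an aggregate satisfaction score hides how one group is faring.
responses = {
    "under 35": [5, 5, 5, 4, 5, 5, 4, 5],
    "over 65":  [2, 2, 3],
}

all_scores = [s for group in responses.values() for s in group]
print(f"Aggregate : {sum(all_scores) / len(all_scores):.1f}")

for group, scores in responses.items():
    print(f"{group:9s} : {sum(scores) / len(scores):.1f}")
```

The aggregate of roughly 4.1 looks fine; the over-65 average of 2.3 does not, which is exactly the nuance a benchmarking league table will miss.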