NI14 – the latest!

May 21, 2009

The IDeA have published a four-page NI14 update, IDeA May 2009, on their Community of Practice web site – the one built around National Indicator 14.

The reason for the publication is that the closing date for submissions from councils was 30th April 2009, and 350 have apparently submitted. As they are kind enough to highlight, the Audit Commission has advised that NI14 is a non-comparable indicator, i.e. not a TARGET.

They also state that “The IDeA will continue to gather evidence of both improvements to customer experience and efficiency savings resulting from NI14 data being used as a lever for service improvement and capacity building.”

I’m examining the use being made of NI14 in my own survey, while also looking at other options.


Why don’t you listen? You might learn something!

March 4, 2009

Hot off the e-press comes a report from the Improvement & Development Agency (IDeA) entitled “Information needs”. This may have some guidance for those in the public sector, in particular chapter 9, which highlights the public’s confusion about roles and tasks, and may encourage us in signposting services when considering the different modes or channels of delivery.

http://www.idea.gov.uk/idk/core/page.do?pageId=9436080

There’s also a new document from the Department of Health which may be worth sharing across the public sector. The title is “Listening, Responding, Improving – A guide to better customer care”. They’ve published a quick guide along with the main document, which might save some eye strain. As with the outcomes of my research, much of it is ‘common sense’: we need to actively listen to the public’s comments about services and, importantly, actively respond, either by changing the services accordingly or explaining why they are the way they are! It is all about moving the culture from one of counting complaints to one of responding to feedback and treating it as something of value.

http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_095408


Going critical!

November 25, 2008

It’s just out and, despite its few pages, is quite a large download (>1Mb). I haven’t read the final version yet.

Yes, it’s “insight: understanding your citizens, customers and communities” – the report from RSe commissioned by the IDeA, which incorporates the ‘wholesome’ bits from the IDeA Community of Practice online conference in the summer, plus added feedback and examples from those who had anything to provide.

The picture of a “babushka” on the cover coincides nicely with the other document I was linking it to: “Critically Classifying: UK E-Government Website Benchmarking and the Recasting of the Citizen as Customer” by Benjamin Mosse and Edgar A. Whitley of the Information Systems Group at LSE. The version I’ve just read is from the latest “Info Systems Journal”, but I’ve since found a working paper on the LSE web site and a conference paper from 2004 or thereabouts. It is not easy reading, even for me with a first degree in philosophy and many hours spent working on Heidegger and his continental brothers and sisters. In line with the “babushka”, Mosse & Whitley use the onion-skin analogy to describe Heidegger’s Ge-stell theory – the selective or uncritical representation of the real world – and how web site benchmarking can become caught up in this! They have also picked up on the danger of the citizen-as-customer metaphor, which was a bee in my metaphorical bonnet throughout the IDeA online conference, although I was deriving my argument from older philosophers, the Greeks, and have also employed Hirschman’s theory of exit, voice and loyalty, among other sources! They pointed me to a lot more reading on the customer versus citizen debate, too.

I do hope the IDeA report is easier going…


Researching local government, Web 2.0 and Service-oriented architecture

September 2, 2008

Adrian Barker of the IDeA posted the following on his blog on 30th August 2008:

“Practical, up-to-date research?

Read a fascinating article [1] today which uses a mixture of statistical techniques and semi-structured interviews to show that inspection on its own has no impact on performance, that a strategy of innovation does, and that supportive inspection can enhance an innovative strategy.

Good as this is, it was published in 2008; the article was submitted in May 2006, based on fieldwork in 2002–3. I don’t blame the researchers – that’s the way the system works – but there must be innovative ways to combine current practitioner insights with academic rigour to produce practically useful research with rapid turnaround. Sounds like a job for LARCI.

[1] Rhys Andrews, George Boyne, Jennifer Law and Richard Walker, ‘Organizational Strategy, External Regulation and Public Service Performance’, Public Administration, Vol. 86, No. 1, 2008 (185-203).

http://www3.interscience.wiley.com/journal/119395659/abstract?CRETRY=1&SRETRY=0”

I actually got to it later at:

http://www3.interscience.wiley.com/cgi-bin/fulltext/119395659/PDFSTART

In response I posted the following on 1st September 2008:

“Adrian

Interesting post and I’ll read the paper once I’m at home – work system doesn’t like cookies and Interscience forces one!

As a current practitioner/researcher (as you know) there are reasons for this that LARCI won’t influence:

My current research questionnaire (13 questions) has managed around 31 responses – hardly statistically significant, but my supervisors consider that good for local government! In fact, when I circulated the link to the questionnaire at a meeting, the President of Socitm stated that he didn’t respond to academic research because ‘once they’d got you they never let you alone’. So researchers find getting feedback out of government is like getting s**t out of a rocking horse! Not a popular topic.

Papers take some time to turn around with peer review unless you are lucky! If a reviewer recommends major changes it can take ages and then has to go through the process again! Luck has a lot to answer for. I submitted a paper in April this year and the conference is in September – that’s pretty fast! I’m currently drafting abstracts for 12 months ahead – you also need to be psychic!

I know people get bothered by undergraduate or MBA researchers, but the only way to train people is to let them loose! I am provisionally presenting work through ESD-toolkit and EIP to get it out to the practitioners, along with the blog – any other suggestions? I’m not sure the CO, DCLG or AC want to know the truth, otherwise they might assist, but I’m always impressed by Audit Scotland and CIPFA, who circulate some good stuff!”

Research was never easy, but practical research in the government community is a cross only a few demented people seem to choose!

Dave Briggs circulated the following:

“Just a quick note to inform you all about an event I’m running with Peterborough City Council for local government types to find out about what’s going on in the sector with social media, web 2.0 and whatnot.

More info at http://davepress.net/2008/09/01/readwritegov/ with booking at http://readwritegov.eventbrite.com/”

To which I responded:

“Thanks Dave, I’ll circulate to colleagues – is it in competition with the Socitm event? 😉
http://www.socitm.gov.uk/socitm/Events/Web+2+seminar+10+September+2008.htm
By the way, I was reading up on the history of SOA (service-oriented architecture), which was posited by a Gartner consultant (Yefim Natis, in 1996), and there is a recent Gartner paper suggesting that Web 2.0 is distracting from SOA, which should be the real concern. It’s one of those front versus back office dialogues. This is in the general business sector.

For the public sector, to confuse matters, I’m trying to develop a Citizen Oriented Architecture, which is a mix of front office and performance tools that could then meet with the back-office SOA.

Any views on SOA versus Web 2.0?”

And I’ll ask here, too – any views on SOA versus Web 2.0 – is it the cart before the horse or what? Of course one does need to have done one’s system/process work before implementing SOA, but scraping, blogging and mashups are very much front-end tools!
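To make that front versus back office point concrete, here is a minimal sketch – the service names, the “track my request” mashup and the data involved are all hypothetical, not any real council system – of why the SOA groundwork has to come first: the Web 2.0 front end is just a thin composition over whatever back-office service contracts already exist.

```python
from dataclasses import dataclass


@dataclass
class ServiceRequest:
    reference: str
    service: str   # e.g. "missed bin collection"
    status: str    # the back-office workflow state


def crm_lookup(reference: str) -> ServiceRequest:
    """Stand-in for a CRM service exposed through the back-office SOA layer."""
    return ServiceRequest(reference, "missed bin collection", "scheduled")


def waste_system_eta(reference: str) -> str:
    """Stand-in for a line-of-business system behind the same SOA layer."""
    return "next working day"


def track_my_request(reference: str) -> str:
    """The Web 2.0 'mashup': a thin front end composing the two services above."""
    request = crm_lookup(reference)
    return f"{request.service} ({request.status}) - expected: {waste_system_eta(reference)}"


if __name__ == "__main__":
    print(track_my_request("REQ-0001"))
```

The point of the sketch is that the mashup is only ever as good as the contracts beneath it – hence the suggestion that Web 2.0 can distract from, rather than replace, the SOA work.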


IDeA NI14 Guidance and GovMetric

August 2, 2008

Public Sector Forums have made a great deal of the fact that the IDeA guidance on NI14 promoted GovMetric, and only GovMetric, as a possible solution.

I’ll declare some interests here: I have met with rol, the company that produces GovMetric, and over a year ago had an academic discussion with them about the whole concept of customer satisfaction and channel migration. The council I work for currently employs the Socitm solution for web site evaluation, which partially employs a tool produced by rol, who are working with Socitm on service benchmarking. I am also a Socitm member, a member of my regional Socitm executive, and on the Local Government Chief Information Officer Council, which Socitm were recruited by central government to create.

I like the concept of GovMetric and haven’t seen anything other than built-in CRM tools to match it, and of course those don’t all come with templates for web sites or a complete, designed-for-purpose suite of tools. There is Opinion-8, which I believe doesn’t work quite the same way either.

I’ll agree that it was daft for the IDeA to nominate one tool, though I don’t think they could have avoided promoting the ESD-Toolkit, since it’s their child! However, I have yet to find anything conceptually up to GovMetric. We asked our web developers to build a tool into the web site CMS to collect feedback and they wanted a lot of money; that money would probably have contributed to buying GovMetric, which isn’t cheap, and tied in the other channels too!

What’s the solution? Horses for courses, I suspect: by the time people get around to trying to collect NI14 data manually they’ll realise what a time-waster it is and plump for an electronic tool. What is needed in collecting the data is rigour, and an awareness that NI14 is not the answer; the answer is feedback from staff and citizens about the systems we use, whether they deliver answers by the web, telephone or face-to-face. We need to collect that feedback and act upon it, while at the same time supplying the required indicator.
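To illustrate the sort of rigour I mean – and this is only a sketch under my own assumptions, not the official NI14 methodology or any particular tool – each contact gets recorded consistently by channel, the free-text feedback is kept so it can be acted upon, and the indicator falls out of the same records as a by-product:

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class ContactRecord:
    channel: str        # "web", "telephone" or "face-to-face"
    avoidable: bool     # the judgement any indicator would be built on
    comment: str = ""   # the feedback that actually drives improvement


def summarise(records: list[ContactRecord]) -> dict[str, float]:
    """Proportion of contacts judged avoidable, per channel."""
    totals: dict[str, int] = defaultdict(int)
    avoidable: dict[str, int] = defaultdict(int)
    for record in records:
        totals[record.channel] += 1
        if record.avoidable:
            avoidable[record.channel] += 1
    return {channel: avoidable[channel] / totals[channel] for channel in totals}


if __name__ == "__main__":
    sample = [
        ContactRecord("telephone", True, "Rang twice - the web form gave no reference number"),
        ContactRecord("web", False),
        ContactRecord("face-to-face", True, "Sent here from the contact centre"),
    ]
    print(summarise(sample))   # the figure for the return
    for record in sample:      # the feedback to act upon
        if record.comment:
            print(record.channel, "-", record.comment)
```

The percentages satisfy the indicator; the comments are where the improvement actually comes from.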

Why do we need to do that? To instil confidence in the public that we mean to change, to transform. We do mean to do this, of course, but we need to demonstrate it! We also need to placate the Minister!