Open and shut cloud

October 16, 2011

Following on from the previous post about cloud and open source, I have discovered a number of further reports with varying opinions.

First of all, the September/October 2011 edition of ITU – UKauthorITy in Use has a report by Dan Jellinek of a live debate on Open Standards in the public sector. In it Bill McCluggage states that the UK government is set to publish a draft list of ten or twelve open standards, likely to include ones such as HTTPS, Unicode UTF-8 and word processing formats. McCluggage also disagrees with the PASC recommendation that government “should omit references to proprietary products and formats in procurement notices”, which I find rather strange, since this is clearly a useful, if limited, way forward.

In contrast, a story in the MIT Technology Review by Michael Fitzgerald dated 6 October 2011, entitled “Can an Open Cloud Compete?”, agrees with an earlier opinion, stating “most cloud services are proprietary, and the technology used to run them is kept secret. Once a company signs up for one cloud service, it can be difficult to move to another provider”. It then goes on to describe the OpenStack conference in Boston, USA, and some other new projects developing both open cloud software and hardware.

A further opinion supporting open cloud appears in the 17 May 2011 Linux Journal, in a piece by Bernardo David entitled “Open Source Cloud Computing with Hadoop”, describing the large number of major users of Hadoop, including Google, Amazon and others. The risks involved in cloud are highlighted in an article in Guardian Government Computing of 10 October 2011 by Mike Small, explaining that “the cloud is not a single model, but covers a wide spectrum from applications shared between multiple tenants to virtual servers used by one customer”, something I have described before.

With these contrasting opinions on whether cloud can be open, or even whether it remains open when using open source, the only option for the user is to ensure that any procurement notice or contract guarantees they are not locked into any form of proprietary software or hardware by the supplier, patentee or owner of any intellectual property involved. A veritable legal minefield until the dust cloud settles.


Social media mining – getting it wrong

August 28, 2011

The MIT Technology Review reports in an article entitled ‘When Social Media Mining Gets it Wrong’ how placing too much reliance on social network data can produce the wrong answers, particularly when one is employing facial recognition software. The researchers reported that on a third of occasions they were able, using face recognition, to reach the research subjects’ Facebook pages and from there determine part of their social security numbers, along with other facts. As the paper recognises, getting a third correct means that two-thirds are incorrect, and on this basis it discourages assumptions being made from such searches. The researchers even developed a mobile phone app to do such work – imagine if the police were to employ such technology openly, the number of wrongful arrests that might be made!

The MIT paper also reveals other research done on Facebook pages, but again recommends avoiding the use of such data for crucial decisions – you have been warned!

History lesson

May 11, 2011

Two recent stories consider how we arrived (indirectly) at e-government. The first was posted on the MIT Technology Review and reveals some of the background behind the Internet’s development, less a case of a military requirement than one of sharing resources and developing open source.

The second was a posting on Richard Heeks’ blog ICTs for Development entitled ‘The First e-Government Research Paper’, which reveals a paper by W. Howard Gammon published in Public Administration Review in 1954. The paper, whilst it couldn’t actually use the term ‘e-government’, which only appeared in the 1990s, was about government ICT. Richard picks out eight examples of advice that Gammon gave to those considering government ICT, all of which remain appropriate today, particularly now we have the DWP doing its own thing around Universal Credit!

The highlights include:

  • knowing what must be done rather than the technology to do it
  • employing trials
  • looking at the system first
  • top management support
  • hybrid management
  • procedural, economic and social problems must be resolved first
  • impact assessment needs total cost of ownership

In April 2008 I wrote a post about ‘History repeating itself’; one would hope humankind and its politicians, and enforcers, will someday learn from history.

Why we need to involve the “local” end users

September 30, 2009

It doesn’t matter how we design systems, there are cultural considerations to be taken into account, as Bill Moggridge, founder of IDEO, outlined at EmTech@MIT 2009 and as described in the MIT Technology Review, highlighting some of the difficult considerations that go into designing for a connected world. Two examples are provided of cultural constraints that may exist when employing technology.

However, this is an international perspective, and we need to be aware of constraints on using services at all levels of employment. How best can we do this? Well, we could watch the locals, as Moggridge demonstrates, an academic practice known as ethnography, which can be quite time-consuming. Another method might simply be to involve the “locals” in the design, implementation and testing of any application, electronic or otherwise.

Due to the constant change in the adoption and use of different communications media, this consultation needs to be ongoing, and it is not necessarily about technology; it’s primarily about the processes behind the wires and chips.