Whitehall's G-Cloud: Hype or hope?

Big-swinging reform or buzzword bingo?

Try before you buy

As always, along with the hype comes the prospect of customers being oversold. As government customers stumble out of the shadow of centralised IT purchasing into the bright light of choice, what's to stop them getting ripped off?

Maxwell, government director of ICT futures, told me at the recent Intellect event in London that the government's terms and conditions would provide protection against this: "You test them out beforehand to see. They are getting accredited. Any proven IT director is going to test these out, government or non-government."

Ferrar agrees the phrase "cloud" has become overplayed and says he stands by the National Institute of Standards and Technology (NIST) definition published in late 2011 (PDF). He thinks Microsoft plays into this with Office 365 and Windows Azure, both of which are listed on CloudStore. "The cloud term is globally getting to the point where nobody knows what it is but everybody kind of thinks what they know it might be," he says.

There is another challenge in the rush to the cloud, however: data. Right now, users around the world are blindly shovelling terabytes of data into different cloud services simply because it is so much easier to write, develop and deploy apps that run on servers you don't need to manage. Amazon is the one everybody is chasing, and it's a data sponge: the number of data objects held in its S3 storage service grew nearly 200 per cent in 2011 to 762 billion, making it Amazon's fastest growth year yet.

A cloud provider will let you trickle in data in gigabytes or terabytes for months at a time; then you hit a year and then you have no capability to move. – Tom Hughes-Croucher, Joyent

But what if you decide you want that data back, or want to change suppliers? The latter is a distinct possibility in government contracts, which have fixed-length terms and where suppliers change.

It's easy to become trapped by a discussion about standards for portability – about the way cloud providers lock you in using data formats or APIs that don't interoperate with each other. According to Maxwell, the terms and conditions required to get onto the G-Cloud suppliers' list, combined with "open standards", will solve the data lock-in problem.

Visit the Cabinet Office's current site on standards for data transfer in the G-Cloud, however, and you won't find much there to guide your purchasing decision.

In February the government started a round of consultations on open standards covering data, software interoperability and document formats. This was its second attempt, after the government controversially scrapped its first open-standards definition – apparently after coming under lobbying pressure from, ahem, Microsoft and the Business Software Alliance (BSA). The Cabinet Office, which runs the G-Cloud initiative, has denied this was the reason, however.

"We are at the beginning of it," Maxwell said of the new consultation. "We are saying we need to take control of the various components in your network infrastructure and end user devices, application development, service management and systems integration that now rests in departments."

Microsoft's Ferrar reckons you can overcome the standards issue by moving data with OData, authored by Microsoft, the W3C's RDF, and the EPUB format.
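To make that concrete, here is a minimal sketch of what supplier-neutral export over OData can look like, in Python. The endpoint URL and collection name are hypothetical, invented purely for illustration; real services differ in the query options and response envelopes they support.

import requests

# Hypothetical OData endpoint and collection, used only for illustration.
SERVICE_URL = "https://example.gov.uk/odata/CaseRecords"

# $top and $format are standard OData system query options.
response = requests.get(
    SERVICE_URL,
    params={"$top": 100, "$format": "json"},
    timeout=30,
)
response.raise_for_status()

# The payload arrives as plain JSON (the envelope key varies by OData
# version), so another supplier's tooling can read it without
# reverse-engineering a proprietary dump.
records = response.json().get("value", [])
print(len(records), "records exported")

The point is simply that the data leaves in an openly documented shape rather than a vendor-specific one, which is the gist of Ferrar's argument.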

The real problem isn't formats, though: it's the sheer volume and completeness of the data.

For example, neither Amazon nor Microsoft charges a penny for putting your data into S3 or Windows Azure, but going the other way there's a toll. Amazon's exit charge starts at $0.15 per gigabyte, with discounts for volume; Microsoft's charges start at $0.12 per GB. Microsoft doesn't say how many objects are sitting in Azure, but Amazon – as we know – is home to more than 760 billion.
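For a sense of what those rates mean in practice, here is a back-of-the-envelope Python calculation at the headline prices quoted above – illustrative only, since both providers apply volume tiers and the list prices change.

# Entry-tier egress rates quoted above; volume discounts apply beyond this.
AWS_RATE_PER_GB = 0.15     # $ per GB out of Amazon S3
AZURE_RATE_PER_GB = 0.12   # $ per GB out of Windows Azure

def egress_cost(terabytes, rate_per_gb):
    """Flat-rate cost of pulling a given number of terabytes back out."""
    return terabytes * 1024 * rate_per_gb

for name, rate in [("S3", AWS_RATE_PER_GB), ("Windows Azure", AZURE_RATE_PER_GB)]:
    print(f"12 TB out of {name}: ${egress_cost(12, rate):,.0f}")

At those entry rates, pulling a fairly modest 12TB archive back out of S3 runs to roughly $1,800 before any discount – free to check your data in, not free to walk away with it.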

Tom Hughes-Croucher, chief evangelist at cloud software fabric specialist Joyent – one of the West Coast companies recently visited by Cabinet Office minister Francis Maude – told me that bandwidth is the real choke point, and the one the UK government should really be concerned about. Hughes-Croucher was previously an evangelist and technical lead at web giant Yahoo!

"Getting the data isn't a big deal," Joyent's man told The Reg. "The big deal is a cloud provider will let you trickle in data in gigabytes or terabytes for months at a time; then you hit a year and then you have no capability to move the project because there's no bandwidth to move."

The last war

Hughes-Croucher's former Yahoo! colleague, Facebook performance and efficiency expert Carlos Bueno, tag-teams with him on this subject in this slideshow. Both agree that it's the network, not the data format, that's the real lock-in factor in the cloud.

"Data formats are like fighting the last war. There are enough programmers that any weird binary XML could be reverse-engineered pretty quickly. That's not a big deal in my opinion. It's not strategic lock-in any more," Bueno says.

To give you an idea of scale: it took Yahoo! four weeks, using 700Mbps pipes, to move 10-20TB of data from servers run by Xoopit – which Yahoo! bought in 2009 – to its own servers. Moving 12TB from Amazon S3 to Windows Azure takes about 40 hours, and from S3 to Rackspace just under a week, according to this lock-in survey on GigaOm.
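Some rough arithmetic shows why those figures look the way they do. This Python sketch assumes the link runs flat out at line rate with no protocol overhead – something real transfers never achieve, which is exactly the point.

def transfer_days(terabytes, megabits_per_sec):
    """Days needed to move a given volume at line rate, with zero overhead."""
    bits = terabytes * 8e12                       # decimal terabytes to bits
    seconds = bits / (megabits_per_sec * 1e6)
    return seconds / 86400

for tb in (12, 20):
    print(f"{tb} TB over a 700Mbps pipe: {transfer_days(tb, 700):.1f} days at line rate")

Even at the top end of Yahoo!'s 10-20TB range, a 700Mbps pipe should manage the move in under three days at full tilt; the four weeks it actually took shows how far real-world throughput falls short of the theoretical line rate – which is precisely Hughes-Croucher's point.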
