There’s a trick we use at Freeform Dynamics when trying to figure out the true significance, if any, of the latest Big Thing being promoted by IT vendors and pundits. We ask ourselves what will be left when the marketers get bored with the current buzz words and move on to the next Big Thing, as they inevitably will.
As an example, consider Web 2.0. It is a term most of us quickly became sick of because it was so often used as a label to push product. When we teased it apart, though, it was clear that two underlying developments were important: the internet becoming a more interactive place; and web interfaces becoming a lot richer.
Those developments have had a big impact on how businesses exploit the web and the way in which developers design applications to run over it.
With service-oriented architecture, another term that has gone out of fashion, what stayed with us was software componentisation and open APIs as the norm when designing and building software.
Whether people are using service bus technology, publishing catalogues of business services and transforming their operations as the marketeers envisaged is secondary to this fact.
Road to nowhere
Coming to the present, we ended up with some interesting conclusions when we looked beyond the messaging and posturing that surround cloud computing.
We might baulk at the top-line claims of some of the evangelists who proclaim that we are heading towards a total shift from on-premises to hosted computing.
We might also grow weary of the messaging contortions of technology vendors attempting to explain how even the most mundane products can enable your “journey to the cloud”. An Ethernet port on a storage device does not a personal cloud make.
But unless you are totally closed to new ideas, you cannot help but recognise that there are some interesting developments underpinning all the cloud bluster. There have been some big strides in areas such as infrastructure virtualisation, software architecture and systems management over the past decade, and many of the lines between what is possible, what is practical and what is commercially viable have shifted.
Options that would not have been considered sensible for mainstream businesses five or 10 years ago because of their prohibitive cost or complexity are now there for the taking.
As an example, think back to a few years ago when we were all being bombarded with hype about grid computing, one of the Big Things of that era. Back then the idea of creating pools of compute power to boost hardware utilisation, facilitate smoother growth and cope better with fluctuations in demand was pretty neat.
So too was the notion of utility computing, in which compute cycles could be consumed on demand just like electricity. The problem was that a lot of the solutions and services were incomplete, immature or extremely expensive. You needed a lot of money, specialist skills and courage to go for it.
That old black magic
While a lot of the messages about cloud computing we hear today are arguably just grid and utility computing redux, the big difference is that IT vendors and service providers can now deliver on the promise.
Sure, we are missing some standards in places, and some of the licensing and subscription models are still, shall we say, a work in progress. But fundamentally it all works. Provided you select the appropriate solution for the problem at hand, bearing in mind your environment and constraints, you don’t need black magic skills and stupid amounts of money to take advantage of it any more.
This brings us to one of the two significant things about cloud that will remain with us once the marketeers move on: increased choice.
If you have a requirement today for a new application, for example, you can elect to deploy it in a traditional manner on its own dedicated stack of hardware and platform software or on a private cloud that you have created by pooling server and storage resources in your data centre.
You then also have various hosted-service options, from co-location through individual virtual servers or virtual private clouds right up to the software-as-a-service (SaaS) model, in which application functionality is delivered on tap.
For the foreseeable future, all these forms of deployment and delivery will co-exist, and the majority of mainstream organisations will mix and match them as needs dictate.
You might choose to keep your next ERP deployment in-house on dedicated hardware, for example, but resort to SaaS for your email and collaboration requirements because that makes it easier to deliver a rich experience to mobile and remote users while maintaining security.
Even then, you might elect to keep some office tools on the desktop rather than moving everything into a hosted environment.
Such an architecture illustrates the hybrid approach, in which you combine multiple delivery mechanisms within the same notional system. Another simple example might be keeping your email server in-house but using an online archiving service to ease the headache of long-term retention.
There is no absolute right or wrong. Instead, there are a lot more options that you can consider, based on functionality requirements, access needs, practical constraints, the type of data being handled and, not least, your preferences.
The consequence of this – and it is already happening in organisations that get the significance of increased choice – is a change in mindset within IT from deploying and managing systems to delivering services to the business. The focus of IT decision-making is moving from how capability is delivered to what is delivered.
The second significant thing that will remain after the cloud hype disappears is therefore the emergence of a source-agnostic approach to IT delivery, in which decisions are made according to business needs and constraints, rather than technology ones. ®
Dale Vile is CEO of Freeform Dynamics