Comment Desktop virtualisation isn't a panacea that will slay infections and slash costs - it's a problem-filled journey that may not even solve the problems you started out wanting to fix. That's one lesson attendees at IDC's Desktop Virtualisation Conference 2010 in London learnt.
The easy desktop virtualisation pitch is that you replace physical PCs and laptops - each with its own copy of Windows and applications to maintain, and each putting a costly load on helpdesks - with virtual desktops running on hopefully cheaper hardware and using a centrally maintained, standardised O/S and applications.
That way you get out from under the estimated $2,000 a year cost of having a real PC on each user's desk and replace it with a virtual one delivered from storage and servers in the data centre. O/S migrations become a snap and, untroubled by desktop problems, users' productivity soars. However, reality can differ.
There are various problems, and the more individual the users, the greater the issues. First of all, taking several hundred users and putting their software environments into the data centre puts a huge extra burden on the data centre's processing, storage and networking environments. Alex Tanner of EMC's Global Services said EMC has found it can be useful to put bootable images on storage area network (SAN) storage - perhaps using its own EFDs (Enterprise Flash Disks) for their read speed - and user data files on filers such as EMC's own Celerra.
The last thing users will accept is a lower performance level than they are used to, so the data centre storage and servers involved must be fast enough, and have the capacity needed, to deal with the load. Tanner said that it is often necessary to use storage techniques such as thin provisioning and deduplication to get storage capacity costs down to an acceptable level. In other words, you need sophisticated data centre storage processes to underpin desktop virtualisation.
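To see why deduplication matters here, consider that hundreds of virtual desktop images share most of their O/S blocks. The sketch below is illustrative only - real arrays such as EMC's deduplicate in the storage layer, and the block size, hashing scheme and names used here are assumptions, not vendor code - but it shows how counting unique blocks across near-identical images is what yields the capacity saving:

```python
import hashlib

def dedup_ratio(images, block_size=4096):
    """Estimate the block-level deduplication ratio across desktop images.

    Splits each image into fixed-size blocks, hashes them, and compares
    the total block count with the number of distinct blocks. A ratio of
    10.0 would mean roughly 10:1 capacity savings.
    """
    total_blocks = 0
    unique_hashes = set()
    for image in images:
        for offset in range(0, len(image), block_size):
            block = image[offset:offset + block_size]
            total_blocks += 1
            unique_hashes.add(hashlib.sha256(block).digest())
    return total_blocks / len(unique_hashes)

# Two hypothetical desktop images sharing a common base O/S image,
# differing only in a small amount of per-user data:
base_os = b"windows-block" * 1000
image_a = base_os + b"user-a-data" * 10
image_b = base_os + b"user-b-data" * 10
print(f"dedup ratio: {dedup_ratio([image_a, image_b]):.1f}:1")
```

In practice the ratio grows with the number of desktops built from a common image, which is why the storage techniques Tanner describes are central to keeping capacity costs acceptable.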
Res Software's Grant Tiller talked about the need to copy existing users' desktop environments onto the virtualised desktop; his company's PowerFuse software can do that, adding another layer of software to the desktop virtualisation stack.
This stack is costly and complicated enough that, in Tanner's view, if the starting point for desktop virtualisation was a wish to reduce desktop PC hardware and software costs, that goal might not be achieved. It is better to focus on lowering desktop admin and support burdens and reducing the number of desktop outages, as that is where the total cost of ownership (TCO) reductions will come from. That makes a positive outcome less certain, because the level of future benefit is a gamble.
Desktop virtualisers need to find a sub-group of their users who are good candidates - possibly using a consultancy organisation to do so - clarify and define their goals, and work with all the company stakeholders involved, such as the helpdesk organisation, to ensure they are on board and supportive. Desktop virtualisation brings its own set of problems, which need to be carefully understood and dealt with. The whole thing is a major project that takes a long time and can go wrong.
There is no instant fix for a desktop PC admin and maintenance nightmare - instead, it is more like planning a sustained and complicated course of treatment that will hopefully restore the patient to health, but may not. The formulation and targeting of that treatment are absolutely vital, and neither simple nor cheap. ®