The Desktop Computing era has put computing power into the hands of users, but it has left them dependent on IT to provision the back-end infrastructure such as networks, servers and firewalls. Maintaining that infrastructure in-house tends to be daunting and very costly.
What's more, catastrophe can strike at any time: drive failures, viruses, corrupt databases, failed server patches, and the list goes on. You also need to pay for all the hardware and for a team to manage it.
Because application servers tend to be driven by departmental budgets, IT infrastructures often end up as over-provisioned mishmashes of equipment, processes and technology, entailing excessive cost and inefficiency, with servers typically running at 15-25% of capacity. Cloud servers, by contrast, run at 75-90% of capacity. The result is less office space, less hardware, fewer staff and lower power requirements, which saves a great deal of money and benefits the environment.
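To make the arithmetic behind those utilisation figures concrete, here is a minimal Python sketch. Only the utilisation percentages come from the paragraph above; the workload size and per-server capacity are invented purely for illustration.

```python
# Rough illustration of why average utilisation matters so much.
# The 15-25% and 75-90% utilisation figures come from the article;
# the workload size and per-server capacity are made-up assumptions.

def servers_to_provision(workload_units: float, avg_utilisation: float,
                         units_per_server: float = 100.0) -> float:
    """Servers needed to carry a workload at a given average utilisation."""
    return workload_units / (avg_utilisation * units_per_server)

workload = 1_000.0  # hypothetical steady workload, in arbitrary "units"

in_house = servers_to_provision(workload, avg_utilisation=0.20)  # ~20% utilisation
cloud = servers_to_provision(workload, avg_utilisation=0.80)     # ~80% utilisation

print(f"Servers provisioned in-house: {in_house:.1f}")  # 50.0
print(f"Servers provisioned in cloud: {cloud:.1f}")     # 12.5
```

In other words, at one quarter of the utilisation you end up buying roughly four times the hardware, with the space, power and staff that go with it.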
Fundamental to the Cloud Computing argument is that software is rented rather than bought outright. Finance directors will immediately compare the two routes and point out that after typically 2.5 to 3 years the rental payments on exactly the same resources appear to exceed the capital cost; on that basis a rental agreement would seem to make little sense.
While that break-point may look right at first glance, Alex Parker of Commensus argues that there are important considerations to take into account. "It assumes that any equipment purchased is being fully utilized from the outset. If a company has acquired IT solutions with the capacity to take it forward three or five years, for example, it is paying for resources on which it cannot generate a return on capital. Changed circumstances may mean that the capacity is never fully taken up."
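A rough sketch of that rent-versus-buy calculation, showing the finance director's naive break-even alongside Parker's utilisation point, might look like the following. All the prices and the usage ramp are assumptions invented for the example, not Commensus figures.

```python
# A minimal sketch of the rent-vs-buy break-even argument. All figures
# (purchase price, monthly rent, usage ramp) are assumptions made up
# for illustration only.

purchase_price = 36_000.0   # assumed one-off capital cost of the in-house kit
monthly_fee = 1_000.0       # assumed monthly rental for equivalent cloud capacity

# Naive finance-director view: break-even is simply price divided by rent.
naive_break_even = purchase_price / monthly_fee
print(f"Naive break-even: {naive_break_even:.0f} months")  # 36 months (~3 years)

# Parker's point: bought kit is sized for year three to five, so only part
# of it is used early on. If the cloud bill scales with actual usage, the
# early rental payments are much smaller than the naive view assumes.
usage_ramp = [0.4, 0.7, 1.0]   # assumed fraction of full capacity used in years 1-3
rental_spend = sum(12 * monthly_fee * u for u in usage_ramp)

print(f"3-year rental spend at actual usage: £{rental_spend:,.0f}")   # £25,200
print(f"3-year cost of buying up front:      £{purchase_price:,.0f}") # £36,000
```

Under these assumed numbers the break-even point slips well beyond three years, which is exactly the effect Parker describes.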
Cloud Computing offers the prospect of moving most IT expenditure from the balance sheet to the profit and loss account. It removes capital expenditure, cuts operational expenditure and gives small firms the budget predictability they need. IT departments can then focus on the front-end issues that enable the organisation to survive and grow.
With Cloud Computing, instead of making one capital commitment to buy the hardware and another to acquire expensive software, firms in effect rent both, paying only for the resources they actually use. Nothing is paid when services are not needed, which does away with over-provisioning resources to cover unpredictable spikes in demand. Businesses can go from 20 workstations to 80 and back to 50 again in the time it takes to authorise the online paperwork. This "pay-as-you-grow, save-if-you-shrink" model works out far cheaper in the long run.
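To illustrate the "pay-as-you-grow" point, the sketch below compares paying per seat against provisioning for the peak over a year in which seat counts go from 20 to 80 and back to 50. Only those workstation counts come from the paragraph above; the per-seat price and the timeline are assumptions for the example.

```python
# Illustrative comparison of paying per seat per month versus provisioning
# for the peak. The 20 -> 80 -> 50 workstation figures come from the article;
# the per-seat price and the monthly timeline are made-up assumptions.

monthly_seats = [20] * 4 + [80] * 4 + [50] * 4   # one year: spike, then settle back
price_per_seat = 30.0                            # assumed monthly cost per workstation

pay_as_you_go = sum(seats * price_per_seat for seats in monthly_seats)
provision_for_peak = max(monthly_seats) * price_per_seat * len(monthly_seats)

print(f"Pay-as-you-grow for the year:  £{pay_as_you_go:,.0f}")      # £18,000
print(f"Provisioned for peak all year: £{provision_for_peak:,.0f}") # £28,800
```

The gap widens the spikier the demand is, because capacity bought for the peak sits idle the rest of the time.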
In the past it might have taken a business six to eight weeks to commission an application server. Now computing power and storage are becoming commodities, purchased as needed and scaled up on demand. This dynamic resource management lets organisations respond more quickly to market shifts and gain an advantage over their rivals. It is this agility and scalability that persuades most companies to venture into the cloud.
But Cloud Computing is more than an IT deployment. Moving into the cloud is a cultural shift as well as a technology shift. For IT staff, and particularly the chief technology and chief information officers, it requires a rethinking of their roles. The 70% of time previously spent on operational maintenance and upgrades becomes available for business strategy, allowing the company to take advantage of new opportunities to innovate and grow.
The above guest post was by Jack Wilson of Commensus PLC, who specialise in Cloud Computing services for the UK.
# # #