The word ‘overhyped’ always comes up in conversation when discussing the Cloud. Being English, and hence innately conservative and cynical, I have a natural tendency to view anything in vogue as overhyped. Indeed, those purveyors of wisdom, Gartner, even have ‘Cloud’ at the top of the ‘hype cycle’, about to plunge into the ‘trough of disillusionment’ that awaits all technology innovations that, at least in the short term, fail to live up to much-vaunted expectations.
For our friends across the ocean the Cloud is very much an established principle, and they are simply looking for ways to leverage it effectively. But even in the US it is generally accepted that you have to dig a little deeper to get through the hype. And without a doubt there is a lot of hype around the Cloud at the moment; but to judge whether something is ‘overhyped’, you first need to understand its potential impact and benefits.
Go back a few years and I recall working with a household name that had thousands of servers and an average CPU utilisation in the single digits. Any new project had to have its own hardware, so a lot of projects didn’t get the go-ahead because the infrastructure cost was too high. This was particularly true where additional hardware would require an expensive update to the data centre or, more significantly, a new one.
Virtualisation technology now resolves this dilemma, and it has proven to be the key factor in turning such a server farm into what is now a ‘Private Cloud’: virtual servers are isolated in software rather than hardware. The benefits of this approach are clear – underutilised resources can be reallocated to new tasks that need them, significantly increasing server utilisation and reducing the unit cost attributable to each project. The ultimate benefit, beyond cost reduction, is that more projects have a viable return.
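The consolidation argument is simple capacity arithmetic. The sketch below illustrates it with assumed figures (the function, the 70% packing target and the example numbers are illustrative, not from any real deployment):

```python
import math

def hosts_needed(n_servers, avg_util, target_util):
    """Physical hosts required once workloads are virtualised and packed
    up to a target utilisation (CPU-only capacity-planning sketch)."""
    return math.ceil(n_servers * avg_util / target_util)

# Echoing the article's scenario: thousands of servers running at
# single-digit CPU utilisation, consolidated to an assumed 70% target.
print(hosts_needed(2000, 0.08, 0.70))  # 2000 servers at 8% -> 229 hosts
```

Even with generous headroom, an order-of-magnitude reduction in physical hosts is plausible, which is precisely why per-project unit costs fall so sharply.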
But a lot of the real excitement (and hype) around the Cloud centres on the ‘Public Cloud’: third-party virtualised server farms that make compute resources available in a truly ‘on demand’ model. The innovator in this area has been Amazon Web Services (AWS), which provides as little or as much compute resource as required, charged at an hourly rate – a single ‘server hour’ can be bought for a few pence, hundreds or thousands of servers can be bought for years, or anything in between. And it is not just hardware that can be purchased on this subscription model. Increasingly, software is offered on this basis as well, and there will be considerable pressure on software vendors to move down this route. The Public Cloud offers immediacy, elasticity and scalability in a manner that all but a very few organisations would struggle to replicate.
It takes a while for the implications of this to sink in but let’s think of the benefits available:
- You can set something up without Capex or significant commitment
- You can do it immediately
- It is in someone else’s data centre, so no power, cooling or support resources
- If you want to try something you can spend just a few pounds to do so
- You can start small for development
- You can scale out testing
- You can deploy rapidly
- You can elastically scale to meet demand peaks and troughs
- You only pay for what you use
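The pay-as-you-go arithmetic behind these benefits is simple enough to sketch. In the snippet below the 5p-per-server-hour rate is an illustrative assumption based on the “few pence” figure above, not an actual AWS price:

```python
def on_demand_cost_pence(server_hours, rate_pence_per_hour):
    """Total bill when paying only for the server-hours actually consumed.

    Working in whole pence keeps the arithmetic exact."""
    return server_hours * rate_pence_per_hour

# Assumed rate of 5p per server-hour (hypothetical, not a quoted price).
# Trying out an idea: one server for ten hours costs 50p.
print(on_demand_cost_pence(10, 5))

# Meeting a demand peak: 100 servers for 4 hours, then scale back to zero.
print(on_demand_cost_pence(100 * 4, 5))
```

The key point is the absence of any fixed term in the formula: when the servers are released, the spend stops, which is what makes small experiments and short-lived peaks affordable.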
Projects that previously lacked a business case become viable. Projects can be terminated quickly without leaving an infrastructure legacy. Data centres need not expand. ‘Big data’ problems can be solved cost-effectively and quickly, making them accessible to organisations of all sizes. The list goes on.
Consider this a little further and you realise that this environment can facilitate a paradigm shift in the whole way IT services are provisioned and supported. The risks associated with the process are considerably reduced, to the extent that IT can focus on improving the performance of the business rather than managing the overhead of hardware and software infrastructure. IT and business agility becomes a reality rather than a methodology buzzword!
Many IT functions see the Cloud as a considerable threat, and in some ways they are right: traditional ways of operating are genuinely under threat, and adapting to a new way of working will prove painful for some. But it is also a significant opportunity for all to reap the rewards a far more flexible model can provide.
So, in answer to the original question: the Cloud is certainly hyped at the moment, possibly overly so given current levels of adoption, but those companies that embrace it will find that there is, indeed, a new paradigm awaiting them.
This blog was written for us by John Coppins, Senior Vice President, Kognitio Cloud.
With over 20 years’ experience working with Information Management solutions, including 12 years of business intelligence, data warehousing and OLAP knowledge, John has extensive experience across many industry sectors, including chemicals, financial services, retail, FMCG, manufacturing and utilities.
Specialising in Business Intelligence consultancy and software, John is familiar with many Business Intelligence tools and platforms having solved a range of business problems from small, tactical issues to creating complex, enterprise class strategic solutions.
John is now focused on cutting through the confusion that currently surrounds the Cloud by delivering a world-class in-memory analytical platform in the Cloud for Kognitio, the leading in-memory analytics company.
If you would be interested in writing a blog for one of our future newsletters please get in touch at email@example.com