There’s no question that Public Cloud services are continuing to build momentum and acceptance in organisations throughout the world. In fact, global spending on cloud infrastructure is expected to approach US$16.5 billion in 2015, according to Gartner*. Local investments such as the introduction of VMware’s vCloud Air platform in Australia, alongside other recent openings of onshore Public Cloud solutions from heavyweights such as Microsoft and other large-scale providers, have reinforced the idea that Public Cloud is becoming mainstream and is no longer the risky proposition it was in the early days.
Despite this, there is still significant reluctance among businesses to move their full production environments or mission-critical applications and workloads to the cloud. Instead, a Hybrid Cloud approach is more common: production workloads are kept on-premises or in Private Clouds, while less critical services, such as test and development environments, run on Public Cloud.
A closer look at the core uses of Public Cloud finds that three types of workloads are the most common:
These particular workloads are all highly suited to the benefits offered by Public Cloud, such as agility, scalability, flexibility and, in some cases, cost. These same benefits would also potentially suit production workloads – so why the hesitation?
An ITnews poll from last year on the reasons for delaying the adoption of Public Cloud listed the top three reasons as:
These reasons, though, stem from more traditional limitations of Public Cloud solutions – offshore data centre locations and shared connections to the data centre. The launch of in-country Public Cloud services from vendors such as Microsoft and VMware has alleviated these concerns.
Application integration issues have also largely been addressed given that Public Cloud workloads can now be configured as an extension of the on-premises environment.
So that leaves data. One concern the Data#3 team hears from customers relates to the size of the data sets linked to production workloads. Without drifting onto the topic of big data, the collection, storage and manipulation of large data sets has historically been difficult with Public Cloud.
This has been primarily due to bandwidth issues. However, with dedicated, secure, private connections and local Australian data centres, this issue now causes less concern. As a result, organisations with data sets and workloads that fluctuate in load (such as spikes in compute, memory and disk for end-of-month processing) are now considered good candidates for Public Cloud.
Another example from our customer conversations is when a large data set originates outside the company firewall – for instance, sensor data from field-based equipment in utilities or mining organisations. In these cases it actually makes more sense to collect and store the data in a Public Cloud, where it can grow and scale more easily, rather than bring it in-house. This separation of data in the cloud from the application on-premises is helping organisations “test the waters” ahead of further cloud-based consolidation.
After all these considerations, though, we see the biggest barrier to cloud adoption as the change itself – change is always difficult. However, with the right guidance, Data#3 customers can benefit from lower-risk use cases that allow testing of new operational procedures before reaching a tipping point for “cloud first” adoption.
* Gartner (2015), Gartner Says Worldwide Cloud Infrastructure-as-a-Service Spending to Grow 32.8 Percent in 2015, accessed May 2015 from http://www.gartner.com/newsroom/id/3055225