Cloud technology is now so ubiquitous that it can be difficult to recall what a revolutionary idea it was when it first appeared. Often, purchasing enterprise software also meant adding one or more expensive servers that would never be used at full capacity to your datacenter. If you built hosted applications yourself or opened access to your information outside your organization, you’d need to invest in a far more robust infrastructure whose running costs grew along with adoption.
Many of you reading this may have warily side-eyed the door to your own datacenter as you read that. Though Cloud technology is now well-established and widely adopted, there’s still quite a bit of innovation left to realize as more organizations continue to adapt their strategies and infrastructures to make the most of this foundational technology. In fact, of the more than 600 respondents to TIBCO’s 2018 CXO Innovation Survey, 41.9 percent named “replacing legacy technology” as a key tactic in their innovation strategy. Adapting to the Cloud is one of the main ways organizations are accomplishing this evolution.
The Perception of Cloud Security Is Improving
Running your own datacenter provides a certain sense of security, even when the actual security of those services is weak. Many large organizations in traditionally conservative industries, like finance and government, waited to adopt the Cloud for fear of losing some amount of control. At the end of the day, the Cloud is simply a set of servers hosted on someone else’s machines. A common fear in the early days of Cloud adoption was that those servers, which by definition must be available over publicly accessible networks, would be ripe for bad actors to hack and steal data.
Though there have been some well-publicized reports of such incidents, in a majority of those cases the cause was not unique to the servers being in the Cloud. Poor application design and lackluster data security will cause exploitable vulnerabilities wherever the servers are hosted, and many data breaches stem from local copies of databases stored on laptops stolen from employees.
With only 29.52 percent of survey respondents listing “security concerns” as a barrier to effective innovation, it would seem most organizations have concluded that, while security still deserves scrutiny when adopting Cloud technologies, it is no longer a serious barrier to Cloud adoption.
Cloud Changed the Cost Equation
Perhaps the biggest advantage of adopting Cloud technologies is the cost savings of providing web applications and services at scale. Most Cloud infrastructure providers have adopted a “pay for what you use” model for their servers. That said, web applications that have been “lifted and shifted” directly to the Cloud have not realized the same cost improvements as those built with modern approaches like containerization, microservices architecture, and functions as a service. This is largely because those older applications were not optimized to make the most of a single server’s available compute power. For some of those applications, an entire server instance needed to be spun up for what amounted to a rather small computing footprint.
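The cost difference between the two models is easy to see with a back-of-envelope calculation. The sketch below is illustrative only; the hourly rate, demand profile, and function names are made-up assumptions, not actual vendor pricing:

```python
# Illustrative comparison of fixed provisioning vs. "pay for what you use".
# All rates and demand figures are hypothetical, for illustration only.

HOURS_PER_MONTH = 730

def fixed_cost(peak_demand_servers, cost_per_server_hour):
    # Fixed model: you must provision for peak demand and pay for
    # every server all month, even when it sits idle.
    return peak_demand_servers * cost_per_server_hour * HOURS_PER_MONTH

def pay_per_use_cost(hourly_demand, cost_per_server_hour):
    # Cloud model: pay only for the server-hours actually consumed.
    return sum(hourly_demand) * cost_per_server_hour

# Hypothetical workload: a steady baseline of 2 servers, plus a
# 10-server spike lasting 20 hours (e.g., a holiday sale).
demand = [2] * (HOURS_PER_MONTH - 20) + [10] * 20
rate = 0.10  # dollars per server-hour (made-up figure)

print(f"Fixed provisioning: ${fixed_cost(max(demand), rate):,.2f}")
print(f"Pay per use:        ${pay_per_use_cost(demand, rate):,.2f}")
```

Under these assumptions the fixed model pays for peak capacity around the clock, while the pay-per-use model pays only for the brief spike, which is the cost equation the paragraph above describes.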
When it comes to dynamically scaling operations as demand increases, even for short bursts of time, the Cloud has proven itself as a time and cost saver. Rather than purchasing additional servers that sit idle in the datacenter waiting for burst traffic and otherwise go largely unused, Cloud infrastructures allow organizations to rapidly adapt to short-term, high-demand traffic. For example, retailers can scale up for increased holiday traffic, then rapidly spin down and stop paying for those services when the demand subsides. While this is wonderful for cost control during planned periods of increased demand, it can also save a business when an unexpected surge drives traffic to its servers. Rather than returning a series of server errors, organizations can spin up a few more server instances to address the demand in a matter of minutes.
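As one concrete sketch of this kind of burst scaling, a Kubernetes HorizontalPodAutoscaler can add replicas automatically when load rises and remove them when it subsides. The deployment name, replica counts, and CPU threshold below are assumptions chosen for illustration, not a recommendation:

```yaml
# Hypothetical autoscaling policy: scale a web storefront between 2 and 20
# replicas based on average CPU utilization. Names and limits are examples.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: storefront-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: storefront        # hypothetical deployment name
  minReplicas: 2            # baseline capacity that runs year-round
  maxReplicas: 20           # ceiling for holiday-scale bursts
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # add replicas above 70% average CPU
```

With a policy like this, the extra replicas exist, and are billed, only while the traffic spike lasts.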
Cloud May Be the Foundational Technology
Along with “adopting solid business intelligence (BI) and analytics practices” and “establishing strong application integration processes”, Cloud technologies were identified by survey respondents as among the top foundational technologies for innovation. Considering many of the most popular BI and integration packages on the market today are managed online, Cloud technology may be the main foundational technology as you approach your innovation strategy.
As always, though, the technology is only one part of the process.
Adopting a Cloud approach to host vendor-provided software alongside your own applications and services requires a cultural shift both within your IT department and across your entire organization. While IT used to be in charge of a majority of technology purchasing decisions, the Cloud makes it possible for non-technical subject matter experts to have greater control over the technology they use. IT’s role, therefore, must shift from that of gatekeeper to consultant, helping business users leverage the available technology and become more independent. With the right processes and education in place, every member of your organization can contribute to the success of your company’s innovation strategy.
To learn more about how your company can drive innovation and disruption with Cloud technologies, read the complete TIBCO 2018 CXO Innovation Survey.