Trending news in computing technology
One of the most interesting parts of working in IT is the rapid pace of change. Tools are constantly evolving or being developed to meet new business needs, and what is at the heart of your business one year might be considered legacy just a few years down the road.
It’s hard to believe that could ever happen to the cloud. It is increasingly the backbone of modern business, used for everything from lowering costs to increasing agility.
Despite its rapid entrance into our work lives, though, modern cloud computing has only existed since the mid-2000s. In its early days, most companies moved to the cloud for cost reasons – and while costs scaling with consumption is still a big factor, other benefits are now just as much of a draw, if not more.
The cloud increases agility and flexibility. It makes the collection and analysis of massive data sets feasible; connects systems, people and customers; increases response rates and lowers time to market. In short, companies in the cloud have a significant edge over their competitors.
Evan Klein, head of product marketing at hybrid cloud management platform Scalr (a sponsor of Computing's Cloud & Infrastructure Live event this week), said that businesses often use the cloud differently depending on their size:
“SMEs, due to the nature of their smaller infrastructure footprint and, in some cases, lack of IT capabilities, have found that cloud services have allowed them to concentrate on their core business. The adoption of SaaS applications, which do not require the up-front investment and ongoing costs for infrastructure, increases their bottom line and improves business agility.
“For larger enterprises, the legacy of building physical data centres and heavy investment in infrastructure means cloud adoption has needed to be phased in. This usually starts with private cloud environments…[but] public cloud adoption has started to increase recently, with enterprises using public cloud for areas such as test and dev and disaster recovery.”
Many firms, especially those large enterprises that began with on-premises private clouds, are now moving to hybrid and multi-cloud environments. This isn’t only an issue of cost, but of feasibility. Applications on legacy infrastructure are notoriously difficult to migrate to the public cloud, and some simply aren’t built to work in that environment: they need to be refactored before a firm can even begin the migration process.
However, most companies will eventually find themselves operating in the hybrid cloud, if only because different teams adopt technologies at different rates. This can add its own challenges, like decentralised management and network complexity.
One answer is the idea of a ‘hyper cloud’: an abstraction layer that sits above multiple cloud providers. A hyper cloud partly mitigates the problem of decentralised management: companies can use it to monitor cloud usage; set controls on the type of infrastructure that is provisioned; set corporate policy, security and compliance controls; and gain visibility into costs.
However, most current solutions lend themselves poorly to abstraction. “Rather than thinking about whether everything should be abstracted or not, it may be better to break down that thinking into what lends itself well to abstraction and what does not”, said Klein. “For example, service authentication does lend itself well, as does the library/SDK layer. That is one of the reasons infrastructure as code has been so popular.”
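Klein's point about the SDK layer abstracting well can be illustrated with a minimal sketch. Everything here is hypothetical – the provider name, method signatures and the approved-instance policy are illustrative, not any real cloud SDK – but it shows the shape of the idea: calling code depends on one common interface, while policy controls (like the approved instance types a hyper cloud might enforce) live in the adapter.

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Common interface that each provider adapter implements."""

    @abstractmethod
    def authenticate(self, credentials: dict) -> bool:
        ...

    @abstractmethod
    def provision(self, instance_type: str) -> str:
        ...

class ProviderA(CloudProvider):
    # Policy control: only approved instance shapes may be provisioned.
    ALLOWED_TYPES = {"small", "medium"}

    def authenticate(self, credentials: dict) -> bool:
        # Stand-in for a real token exchange with a provider's API.
        return "api_key" in credentials

    def provision(self, instance_type: str) -> str:
        if instance_type not in self.ALLOWED_TYPES:
            raise ValueError(f"'{instance_type}' is not an approved instance type")
        return f"provider-a/{instance_type}/instance-001"

# Callers depend only on CloudProvider, so adding a second provider
# adapter requires no change to the code that provisions infrastructure.
provider = ProviderA()
assert provider.authenticate({"api_key": "secret"})
print(provider.provision("small"))
```

Swapping in a second adapter class is then invisible to the calling code – which is the property that makes this layer, unlike higher-level services, a good candidate for abstraction.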
Security is often perceived as a challenge of using multiple clouds, but Klein sees the association of decreased security with multi-cloud environments as a fallacy; risks are more often the result of bad processes, which themselves stem from insufficient visibility.
The GDPR, which makes data sovereignty such an important part of any data strategy, is also increasing the importance of visibility. The regulation is a particular issue for large enterprise-scale firms with multiple data centres and a global presence. Control of data is crucial in such an environment.
“If organisations require data sovereignty it is important that the provider used has the resources to offer local retention of data. If there is no control over the users’ access to the cloud service providers they choose, there is a danger that data can be stored outside of the designated region.
“This can be mitigated by using a private cloud infrastructure, but for those using public cloud infrastructure it can also be mitigated by controlling access to the data centres that are used by public cloud providers.
“The number one problem companies face here is lack of visibility into what developers are doing. Without that, they cannot have control, much less compliance. Effective management of cloud environments can dramatically reduce the risk of data being mismanaged and enterprises falling out of compliance.”
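The data sovereignty control Klein describes – making sure data never lands outside the designated region – can be reduced to a simple allow-list check. The sketch below is purely illustrative (the region names and bucket inventory are assumed, and a real audit would pull this inventory from a provider's API), but it shows the kind of automated check a cloud management layer might run to surface violations.

```python
# Assumed set of regions where data retention is permitted.
EU_REGIONS = {"eu-west-1", "eu-central-1"}

def check_residency(buckets: dict, allowed: set) -> list:
    """Return the names of storage buckets held outside the allowed regions.

    `buckets` maps a bucket name to the region it is stored in.
    """
    return [name for name, region in buckets.items() if region not in allowed]

# Hypothetical inventory: one compliant bucket, one stored out of region.
violations = check_residency(
    {"customer-data": "eu-west-1", "backups": "us-east-1"},
    EU_REGIONS,
)
print(violations)  # ['backups']
```

Running such a check continuously, rather than at audit time, is one way to get the visibility into developer activity that Klein identifies as the number one problem.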