Cloud Insights: The Most Common Myths about the Public Cloud (part 1)

Cloud is not a new topic. This nebulous buzzword has been with us since the mid-2000s, and the idea behind it goes back even further. Today, it is considered indispensable for digital transformation.

BLOG - March 2021: For many people, cloud is still difficult to fathom. But what are the most common myths, and what is the real deal?

Against this backdrop, one might think that the cloud has long since arrived across the board and is no longer a mystery. But hand on heart: do you have a definition ready off the top of your head? We can reassure you: most people don't. In their daily work, our consultants are confronted with a number of misconceptions and pitfalls, which we have collected and scrutinised. Here are our answers to the most common myths about the public cloud.



The most common myths about the public cloud (part 1 of 2)

1. The public cloud is more expensive than an on-premise solution

Companies that compare their ongoing infrastructure costs with the prices of public cloud providers are often disappointed at first, because the direct savings potential is not always clear at first glance. A closer look, however, reveals various cost-cutting levers with which you can save money.

A tip: Avoid migrating your previous locally operated on-premise solution into a hyperscale scenario via lift-and-shift.
Lift-and-shift is a method in which your previous design is retained 1:1 in the cloud. As tempting as this approach seems due to its simplicity, in most cases the savings are zero. It is crucial that you do not act hastily and instead take the time to develop a comprehensive strategy in advance. An important keyword here is Total Cost of Ownership, or TCO for short: when weighing up your options, consider the total operating costs, for the entire project and over its entire lifetime.

From a technical point of view, the first lever is the rightsizing of your machines: local infrastructure is usually planned for three to five years and is therefore often oversized. When moving to the public cloud, you can correct this and dynamically adjust your capacity at any time.
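To illustrate, here is a minimal sketch of rightsizing with the AWS SDK for Python (boto3); the region, instance ID and target type are hypothetical placeholders, and other providers offer equivalent tooling:

```python
# Minimal rightsizing sketch, assuming AWS and boto3.
# Instance ID and target type are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-central-1")

INSTANCE_ID = "i-0123456789abcdef0"  # hypothetical instance

# An instance must be stopped before its type can be changed.
ec2.stop_instances(InstanceIds=[INSTANCE_ID])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[INSTANCE_ID])

# Downsize from an oversized type to one matching actual utilisation.
ec2.modify_instance_attribute(
    InstanceId=INSTANCE_ID,
    InstanceType={"Value": "t3.medium"},
)

ec2.start_instances(InstanceIds=[INSTANCE_ID])
```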

Ask yourself whether you really need your instances around the clock, 24 hours a day. Development and test environments in particular are suited to on-demand use: you run them only when needed and can switch them off at night or at weekends, for example. You then only pay for what you actually use, because the charges for compute and RAM stop accruing while an instance is stopped. Savings of more than 50% can be achieved on the resources concerned.
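As a rough illustration, the following sketch, again assuming AWS and boto3, stops all running instances carrying a hypothetical Schedule=office-hours tag; a real setup would run it on a schedule, for example as a cron job or Lambda function:

```python
# Off-hours shutdown sketch, assuming AWS and boto3.
# The tag key/value used for selection is a hypothetical convention.
import boto3

ec2 = boto3.client("ec2", region_name="eu-central-1")

# Find running instances tagged for office-hours operation only.
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:Schedule", "Values": ["office-hours"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
if ids:
    ec2.stop_instances(InstanceIds=ids)  # compute/RAM charges stop accruing
```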

Project implementation is often mainly about providing the necessary infrastructure, which can take a long time with local solutions. With automation tools such as Terraform, Puppet and Ansible, cloud infrastructures can be deployed within minutes, and costs for external service providers can be reduced as well.
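As a simple illustration of such automation, here is a sketch that drives the Terraform CLI from Python; the directory layout is a hypothetical example, and a production pipeline would add plan review, state locking and error handling:

```python
# Scripted provisioning sketch around the Terraform CLI.
# The configuration directory is a hypothetical placeholder.
import subprocess

def deploy(config_dir: str) -> None:
    # Download providers/modules, then create the infrastructure
    # non-interactively.
    subprocess.run(["terraform", "init"], cwd=config_dir, check=True)
    subprocess.run(
        ["terraform", "apply", "-auto-approve"], cwd=config_dir, check=True
    )

deploy("./environments/dev")
```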

Are there services that you could obtain more cheaply as Platform-as-a-Service, such as email or database services? A classic example here is Exchange Online, because in very few cases is it advantageous to keep operating your own Exchange Server 1:1 in a virtual machine.

New purchasing models also open up cost-saving potential: most providers offer various options to obtain instances at particularly favourable conditions, for example AWS Reserved Instances and Spot Instances, or Azure Low-Priority Virtual Machines. Virtual instances can be reserved long-term at a fraction of the list price, and depending on the term you choose, one year or three years, the discount increases significantly.
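To get a feel for the discount, the following sketch, assuming AWS and boto3, queries recent spot prices and compares them with a hypothetical on-demand reference price:

```python
# Spot price inspection sketch, assuming AWS and boto3.
# Instance type and on-demand reference price are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="eu-central-1")

history = ec2.describe_spot_price_history(
    InstanceTypes=["m5.large"],
    ProductDescriptions=["Linux/UNIX"],
    MaxResults=5,
)["SpotPriceHistory"]

ON_DEMAND_PRICE = 0.115  # USD/hour, hypothetical reference value
for entry in history:
    spot = float(entry["SpotPrice"])
    print(
        f"{entry['AvailabilityZone']}: {spot:.4f} USD/h "
        f"({spot / ON_DEMAND_PRICE:.0%} of on-demand)"
    )
```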

Last but not least, we can expect steady price reductions from the hyperscalers: the three major public cloud providers are competing fiercely for the best prices. In the recent past, prices fell by double-digit percentages each year, and this trend is expected to continue.

As you can see, a full TCO analysis includes many aspects that we do not think about at first. There are also licensing costs, for SQL Server for example, which are often already included in public cloud prices. And you can save on classic resources such as electricity, air conditioning and floor space. It may well be worth taking a closer look.
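A back-of-the-envelope comparison can make this concrete. The following sketch uses entirely hypothetical figures; substitute your own data for a real TCO analysis:

```python
# Rough TCO comparison over a five-year horizon.
# All figures are hypothetical placeholders.
YEARS = 5

on_premises = {
    "hardware_and_refresh": 120_000,  # servers, storage, network
    "licences": 40_000,               # e.g. SQL Server
    "power_cooling_space": 35_000,    # electricity, air conditioning, space
    "admin_effort": 150_000,          # staff and external services
}

public_cloud = {
    "compute_rightsized": 90_000,     # after rightsizing and off-hours stops
    "paas_services": 30_000,          # licences largely included
    "admin_effort": 60_000,           # automation reduces manual work
}

print(f"On-premises TCO over {YEARS} years: {sum(on_premises.values()):,} EUR")
print(f"Public cloud TCO over {YEARS} years: {sum(public_cloud.values()):,} EUR")
```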

So our final answer is: the public cloud is not necessarily the more expensive model. It depends on your concept and the individual parameters of your usage.

2. Data is more secure in my local environment than in a public cloud

Security concerns like this are still among the top inhibitors when choosing the public cloud. Bear in mind: the cloud providers know this too. Their most important asset is your trust, and their existence depends on securely hosted data. Their investments in IT security are correspondingly large. Microsoft, for example, runs its own Security Operations Centre and holds more than 90 compliance certifications. 50 of these apply specifically to global regions and countries, and 35 compliance offerings are tailored to the needs of key industries such as healthcare. They include the GDPR, C5, ISO 27001 and CSA STAR, among others.

[Figure: Microsoft Azure compliance offerings]

Source: azure.microsoft.com/mediahandler/files/resourcefiles/microsoft-azure-compliance-offerings/Microsoft%20Azure%20Compliance%20Offerings.pdf


Certifications are very expensive. As a result, most companies are unable to obtain them in the quantity and quality that the large cloud providers do.

Hyperscalers also do a lot to help you with data security: white papers, best-practice guides and comprehensive interactive security reference architectures, like this one from Microsoft:

[Figure: Microsoft Cybersecurity Reference Architecture]

Source: www.microsoft.com/security/blog/2018/06/06/cybersecurity-reference-architecture-security-for-a-hybrid-enterprise/


The efforts of cloud providers are also reflected in the Cloud Monitor 2020: many public cloud users confirm an increase in data security, and overall they registered more security incidents in their own IT than in the public cloud. This is partly because cloud providers are subject to strict data protection controls and policies and regularly undergo independent external audits. The providers' teams are well trained, monitor infrastructure threats around the clock and respond to them. In addition, distributed storage and redundancy systems that ensure availability and performance contribute to greater data security.

On closer inspection, the assumption that data is more secure on one's own systems is therefore unfounded. The opposite is likely to be the case.

3. Once the data is in the cloud, the company loses its data sovereignty

The vendor lock-in effect makes many companies cautious, because the opinion often prevails that once a company has decided on a cloud, it is bound to that provider for all time.

However, the opposite is the case. Data sovereignty remains with you without restriction, and with it the topics of data management: backup, disaster recovery, legally compliant archiving and compliance. Even if the providers supply best practices, you remain responsible for your data.

Nowadays, all data can easily be exported or migrated out of a public cloud again. Note, however, that exiting is often more expensive than getting started, because providers typically charge for outbound data transfer. It is therefore best to work out an exit strategy right at the beginning that takes this aspect into account.
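To make the exit cost tangible: outbound data transfer (egress) is usually billed per gigabyte. The following sketch uses a hypothetical rate and volume; check your provider's current price list:

```python
# Rough egress cost estimate; both figures are hypothetical placeholders.
DATA_VOLUME_GB = 10_000      # 10 TB to migrate out
EGRESS_PRICE_PER_GB = 0.09   # USD/GB, hypothetical rate

print(f"Estimated egress cost: {DATA_VOLUME_GB * EGRESS_PRICE_PER_GB:,.2f} USD")
```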

Now you might still ask what the situation is with your data in the public cloud when the provider is subject to US law. As we know, under the US CLOUD Act (Clarifying Lawful Overseas Use of Data Act), such providers are required by law to give US authorities access to the data they hold, regardless of where the servers are located. With the right design of your architecture, you will not lose your data sovereignty even in this case: with encryption components such as Microsoft Azure Key Vault, you can secure your data so that the hyperscaler itself has no access to it. Even if your data were handed over, the US authorities would lack the key needed to exploit it.
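As one possible pattern, here is a minimal sketch of client-side encryption with a key held in Azure Key Vault, using the azure-identity and azure-keyvault-keys packages; the vault URL and key name are hypothetical, and RSA-OAEP is only suitable for small payloads such as data encryption keys:

```python
# Client-side encryption sketch with a Key Vault key.
# Vault URL and key name are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient
from azure.keyvault.keys.crypto import CryptographyClient, EncryptionAlgorithm

credential = DefaultAzureCredential()
key_client = KeyClient(
    vault_url="https://my-vault.vault.azure.net", credential=credential
)

key = key_client.get_key("data-protection-key")  # hypothetical key name
crypto = CryptographyClient(key, credential=credential)

# Encrypt before the data leaves your control; only the key holder
# can decrypt the result.
result = crypto.encrypt(EncryptionAlgorithm.rsa_oaep_256, b"sensitive payload")
ciphertext = result.ciphertext
```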

Our conclusion: we cannot attest to 100% security, but your data is certainly no less secure than in your own data centre, provided the design is right.

4. If I have a problem with my infrastructure, I can simply open a support ticket with the provider

As part of your service level agreement with the cloud provider, you are generally entitled to open support tickets. Depending on which support plan you have booked, the service includes different features. But they all have one thing in common: they deal exclusively with questions about the configuration of your infrastructure and its availability. If you notice a malfunction in a SaaS service such as Exchange Online, for example, you will get the support you need.

However, the situation is different for company-specific problems with your Active Directory or a specialist application. In this case, your ticket will most likely be closed unprocessed, because the cloud providers are not available for these kinds of problems. You remain responsible for their error-free operation.

The rumour that hyperscalers take over the patching of instances also persists. Providers typically only patch the base images of your instances, not running instances. For SaaS applications, however, the statement is true.

Read more about common Cloud myths in part 2 of this blog (published next week).