[Banner: Private cloud - a computing weather-machine!]

Disclaimer: Kindly be advised that this blog post predominantly centres around the ever-evolving realms of law and technology, with a specific focus on the Scandinavian perspective. It is of utmost importance to stay up to date on the latest advancements in both these domains, given their interrelated nature. Therefore, be wary of drawing hasty conclusions based on this blog post (or any other source on the internet, for that matter) and ensure a comprehensive grasp of your local legal and technical landscape.

Note: The terms ‘private cloud’, ‘private cloud infrastructure’, ‘private cloud computing’, and ‘private cloud environment’ are used interchangeably throughout this blog post; they all refer to the same concept.

When it comes to cloud computing, private cloud infrastructure is becoming increasingly popular among companies looking for a secure and cost-effective solution to their data storage and processing needs, and there is even a movement away from the public cloud (see Why we’re leaving the cloud and Why Is Cloud Migration Reversing From Public To On-Premises Private Clouds?). For companies operating within the EU, particularly with regard to privacy regulations (GDPR), private cloud environments offer compelling advantages.

Whether your interests lie in business or technology, you may find yourself intrigued by the concept of constructing private cloud environments. In this blog post, I will delve into the realm of private clouds and touch upon Kubernetes. My focus in this post is primarily on the “why”, while the “how” may be explored in a future post. I will explore the advantages of private clouds over public clouds, but keep in mind that public clouds have their own set of benefits, which may be the subject of a future blog post.

The rise of cloud computing

The term “cloud computing” was coined in 1996 by Compaq in an internal presentation, and the talk on cloud really kicked off on August 25, 2006, when Amazon Web Services (AWS) launched its Elastic Compute Cloud (EC2) service to the public. This service allowed people to rent virtual computers and run their own applications on AWS’ infrastructure. It marked a significant milestone in the history of cloud computing, paving the way for other cloud service providers to enter the market. Since then, the number of suppliers has grown (mainly based in the US and China) and the offerings have become increasingly advanced and popular; finally, the COVID pandemic made the term ‘cloud’ mainstream. But we should not neglect that the concept of cloud computing goes back to the days of the mainframe (everything goes in circles).

The market for cloud computing providers is constantly evolving, with major tech players like Microsoft (Azure), Amazon (AWS), Google Cloud, and IBM Cloud leading the way. These companies, all based in the United States (US), heavily invest in software and infrastructure, both organically and through acquisitions. To meet the growing demand for storage and computing power and reduce latency, new data centres are sprouting up worldwide. It’s a dynamic landscape where innovation knows no bounds. These companies offer a service known as public cloud computing, providing easily accessible computing resources for everyone to use. All you need is an account and a credit card to get started.

“A cloud” can be described simply as a data centre filled with identical hardware that is never physically accessed, except for initial setup and disposal when necessary. Throughout its lifespan, every deployment, update, investigation, and management task is automated, ensuring a seamless and efficient process.

Three reasons for companies to do cloud computing

There are numerous reasons why companies are drawn to cloud computing. In my opinion, the three most significant are as follows:

Firstly, cloud computing offers substantial cost savings, both in terms of hardware and personnel. The idea is that you no longer need your own servers or staff to maintain them. Instead, you can simply swipe your credit card and consume the cloud. However, this is only true if you have personnel who are knowledgeable about the cloud and how to utilize it effectively.

Secondly, the cloud provides scalability and flexibility that traditional computing cannot match. Clouds consist of interconnected hardware, often distributed across different locations. The cloud system makes it easy to consume computing resources based on demand, which is particularly beneficial for businesses with fluctuating workloads.

Finally, the cloud enables organizations to launch products and services more quickly. This is achieved through ready-to-use services and resources, as well as by creating a layer of abstraction in relation to servers and hardware.

To fully harness the benefits of these reasons, certain preconditions must be met. One of the most obvious prerequisites is a comprehensive understanding of the cloud throughout the organization. This understanding is essential for leveraging the cloud’s full potential.

Sadly, not even great computing power comes without great responsibilities.

With great power comes great responsibility - GIF source: https://giphy.com/gifs/MCZ39lz83o5lC

The home of the big public clouds (AWS, Azure, etc.) is the US, and this can lead to problems when it comes to personal data. Transferring data from the European Union (EU) to the United States (US) presents significant challenges for companies, particularly when it comes to complying with stringent data protection regulations. The EU’s General Data Protection Regulation (GDPR) mandates that companies ensure the highest levels of privacy and protection for user data. However, transferring data to the US, where privacy laws are less strict, poses a risk to compliance. The EU-US Privacy Shield, which was designed to facilitate safe data transfers, was invalidated by the European Court of Justice in 2020, leaving companies uncertain about the legal aspects of data transfers. Moreover, the potential exposure to surveillance by US intelligence agencies raises serious concerns for European companies. These issues necessitate careful planning and robust security measures when implementing cloud solutions that involve data transfer between the EU and the US.

Max Schrems, an Austrian lawyer and privacy advocate, has played a crucial role in shaping data transfers between the EU and the US. His legal challenges against Facebook, alleging that the personal data of EU citizens was at risk due to Facebook’s data transfer practices, have resulted in significant court decisions. Notably, his case led to the European Court of Justice invalidating the Safe Harbour agreement in 2015, followed by the invalidation of the EU-US Privacy Shield in 2020. These agreements had previously allowed for relatively unrestricted data transfers across the Atlantic. Schrems’ work has underscored the need for stronger data protection measures, prompting businesses to reassess their data transfer practices and explore alternative solutions to ensure compliance with EU regulations. It has also highlighted the contrasting approaches to privacy and data protection between Europe and the United States.

In December 2022, the European Commission launched a third attempt to establish a legitimate framework for data transfers from the EU to the US, after previous efforts such as the Safe Harbour agreement and the EU-US Privacy Shield had been invalidated by the Court of Justice of the European Union (CJEU). The new initiative, known as the EU-US Data Privacy Framework, aims to address these concerns and was approved by the European Commission on 10 July 2023. However, Max Schrems and his organization NOYB express skepticism, arguing that it does not adequately resolve the issues surrounding US surveillance. Legal challenges are expected, with the potential for another CJEU ruling. As a result, businesses are advised to prepare for ongoing uncertainty regarding data transfers between the EU and the US.

Dealing with these legal challenges can be a real hassle for EU-based companies hosting data. However, this headache is not limited to EU-based companies alone. US-based companies also encounter similar issues as soon as they start hosting personal data of EU citizens.

Besides the GDPR, upcoming EU regulations are making it even more challenging to buy in on public clouds, where the location and control of data are uncertain. With the Cyber Resilience Act (CRA), the revised Directive on Security of Network and Information Systems (NIS2), and the EU Cybersecurity Act, the requirements for building and running software are growing to new heights.

Private cloud: One strategy to navigate the infrastructure maze!

In addition to the public cloud, there are two other common types of clouds: private clouds and hybrid clouds.

A ‘private cloud’ is a form of cloud computing that offers similar benefits to the public cloud, such as scalability and self-service, but with a proprietary architecture. Unlike public clouds, which serve multiple organizations, a private cloud is dedicated to the specific needs and objectives of a single organization. The infrastructure of a private cloud can be hosted either on-premises or at a third-party site (often called co-located or managed).

Utilizing a private cloud can address several of the legal concerns mentioned in the section above. With a private cloud, you have control over the location and accessibility of your data, as well as the components included in your cloud infrastructure. This empowers you to make informed decisions while ensuring data security and compliance.

A ‘hybrid cloud’ combines the best of both worlds. It involves connecting your private cloud with a public cloud. For example, you may choose to store data in your private cloud and perform anonymized computation in public clouds. VPN connections are used to seamlessly integrate the two types of clouds, making the separation invisible.

Most public cloud providers also offer a hybrid technology called Virtual Private Cloud. It allows you to move some of your computing resources out of the public cloud, but it should be noted that it is only virtual and does not provide the full functionality of a private cloud.

Affordability: the compelling case for private cloud computing.

Public cloud services often employ a pay-as-you-go pricing model, which can prove advantageous for businesses seeking flexibility and scalability without significant upfront investment. However, costs can rapidly escalate due to increased data transfer, storage requirements, and high-demand compute resources. Furthermore, premium features like enhanced security, data redundancy, or tailored support can contribute to overall expenses.

Many have found that operating on public clouds can be notably more expensive than managing their own hardware (e.g. Total Cost of VM with Block Storage and Bandwidth – Public Cloud vs Managed Private Cloud and Hidden Costs of Public Cloud: In-Depth Analysis of Public vs. Private Cloud TCO (note: Daniel Virassamy represents Mirantis, which is a private cloud provider)). Additionally, they may encounter performance issues that, although solvable, necessitate the purchase of more costly hardware from public cloud providers. It is often seen that when you scale your utilization of public cloud services, the cost grows faster than the value of the added performance. The term “cost at scale” is often used to describe the uncontrolled spending that can occur in public clouds.

In contrast, private clouds require a substantial upfront investment in hardware, software, security and setup. Nevertheless, these costs can be more predictable in the long term, as businesses retain control over resource allocation. However, expenses related to maintenance, upgrades, and staff training should not be underestimated. Furthermore, since the entire infrastructure is owned and managed by the business, they bear the full cost of any underutilized resources.
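To make the difference between the two cost profiles a bit more concrete, here is a small back-of-the-envelope sketch in Python. All figures are purely illustrative placeholders (not quotes from any provider), and the yearly growth factor is just one simple way to model “cost at scale”:

    # Back-of-the-envelope comparison of public (pay-as-you-go) vs private
    # (upfront + running costs) cloud spending over a number of years.
    # Every number below is an illustrative placeholder, not a vendor quote.

    YEARS = 5

    # Hypothetical public cloud: the monthly bill grows with usage.
    public_monthly_start = 8_000      # EUR/month in year 1
    public_yearly_growth = 0.30       # 30% usage growth per year ("cost at scale")

    # Hypothetical private cloud: large upfront investment, flatter running costs.
    private_upfront = 250_000         # EUR for hardware, setup and training
    private_monthly = 6_000           # EUR/month for staff, power and maintenance

    public_total = 0.0
    monthly = public_monthly_start
    for year in range(YEARS):
        public_total += monthly * 12
        monthly *= 1 + public_yearly_growth

    private_total = private_upfront + private_monthly * 12 * YEARS

    print(f"Public cloud over {YEARS} years:  {public_total:,.0f} EUR")
    print(f"Private cloud over {YEARS} years: {private_total:,.0f} EUR")

The point of the sketch is not the exact numbers but the shape of the curves: pay-as-you-go starts cheap and compounds, while the private cloud front-loads the cost and stays comparatively flat.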

The political enigma of the cloud

There is no doubt that the world has changed over the last few years, with regard to technology as well as legislation and world peace (at the time of writing, the Doomsday Clock stands at 90 seconds to midnight). So maybe it is time for all of us to review our suppliers, consider whether they are still the right choice, and ask whether there is a local alternative to our global partners.

The fact that there is no real European alternative to the big US cloud providers puts European organizations in a dilemma: they want the ease that the public cloud provides, while they need to comply with EU legislation. The EU is working on a cloud initiative known as Gaia-X, a quite ambitious project that aims to create a competitive, secure, and trustworthy data infrastructure for Europe. The initiative was launched in 2020, and it signifies Europe’s endeavour to reduce dependency on American and Asian IT giants and strengthen digital sovereignty.

The goal of Gaia-X is not to build yet another cloud provider, but to create a federated system linking many cloud service providers and users together in a transparent environment, making it easy to exchange data while keeping control and compliance. That is at least the goal.

Another delicate issue with the non-EU-based cloud providers is their contribution to the EU tax system, or rather the lack of it. In general, putting tax on cloud services is a hard nut to crack. Most non-EU tech companies choose Ireland as their European headquarters due to the country’s attractive corporate tax rate. With a rate of just 12.5%, one of the lowest in the European Union, Ireland provides a financially beneficial base for operations. Besides (some might say “because of”) heavy investment in infrastructure all over the EU, the local divisions of these cloud providers show red numbers year after year, and therefore pay little or no tax and contribute little to society. One can ask oneself why they keep up the effort if they are losing money.

A possible technical structure of a private cloud

To create a thriving private cloud environment, there are numerous technology options available. However, when it comes to cloud computing, Kubernetes (also known as K8s) has risen to prominence as a highly favoured platform.

Kubernetes (see The Illustrated Children’s Guide to Kubernetes) is a platform that allows you to automate the deployment, scaling, and management of containerized applications. It was originally developed by Google, but is now open source and serves as a graduated project under the Cloud Native Computing Foundation (CNCF).

Kubernetes, often regarded as the ‘operating system for the cloud’, transforms the cloud infrastructure into a single cohesive system. Just like an operating system manages the hardware and software of a computer, Kubernetes coordinates and harmonizes the diverse services and tools in a cloud environment. From handling networking between containers, distributing resources, maintaining storage consistency, to scaling and healing applications, Kubernetes essentially provides a unified interface for cloud-native development. By abstracting the complexity of underlying infrastructure, it allows developers to deploy applications seamlessly across various cloud environments, reinforcing its reputation as the ‘operating system for the cloud’.
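As a small, concrete illustration of that unified interface, the sketch below uses the official Kubernetes Python client (pip install kubernetes) to ask the cluster for every running pod in a single call. It assumes you have a kubeconfig pointing at a reachable cluster; nothing in the code is specific to any particular provider:

    # Minimal sketch: talking to a cluster through the Kubernetes API using the
    # official Python client. Assumes a kubeconfig file for a reachable cluster.
    from kubernetes import client, config

    config.load_kube_config()        # reads ~/.kube/config by default
    v1 = client.CoreV1Api()

    # One API call gives a cluster-wide view, regardless of which physical
    # machines the workloads actually run on.
    for pod in v1.list_pod_for_all_namespaces().items:
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")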

The capabilities of Kubernetes can be expanded by incorporating various types of workloads, often accessible as open-source projects. This enables the replication of functionalities found in public clouds, such as diverse forms of Platform-as-a-Service (PaaS) and Function-as-a-Service (FaaS).
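Many of these add-ons extend the Kubernetes API through so-called CustomResourceDefinitions (CRDs). As a rough sketch, with the same client setup and assumptions as above, listing the CRDs gives you an idea of how a given cluster has been extended:

    # Sketch: listing the CustomResourceDefinitions installed in a cluster.
    # Open-source add-ons (ingress controllers, FaaS frameworks, databases, etc.)
    # typically extend the Kubernetes API through CRDs, so this output is a
    # rough picture of the extensions a cluster has picked up.
    from kubernetes import client, config

    config.load_kube_config()
    ext = client.ApiextensionsV1Api()

    for crd in ext.list_custom_resource_definition().items:
        print(crd.metadata.name)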

At the core of Kubernetes lies the fundamental concept of software developers encapsulating their applications in containers. These containers bundle all the details necessary to run the software. Kubernetes, in turn, assumes the responsibility of orchestrating these containers, ensuring they run according to the specifications outlined in descriptive files. This process is known as container orchestration.
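Here is a minimal sketch of that declarative idea, again using the Python client: we describe the desired state (three replicas of a containerized app) and leave it to Kubernetes to make reality match. The names and the nginx image are placeholders I picked for illustration:

    # Sketch of the declarative model: describe the desired state and let the
    # control plane keep the cluster in sync with it. Names and image are
    # placeholders; assumes the same client setup as the earlier snippets.
    from kubernetes import client, config

    config.load_kube_config()
    apps = client.AppsV1Api()

    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="demo-app"),
        spec=client.V1DeploymentSpec(
            replicas=3,
            selector=client.V1LabelSelector(match_labels={"app": "demo-app"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "demo-app"}),
                spec=client.V1PodSpec(
                    containers=[
                        client.V1Container(name="demo-app", image="nginx:1.25")
                    ]
                ),
            ),
        ),
    )

    apps.create_namespaced_deployment(namespace="default", body=deployment)

The same desired state is more commonly written as a YAML manifest and applied with kubectl; the Python version above is just one way to show that it is a description of an end state, not a sequence of imperative steps.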

Kubernetes consists of two main components: a control plane and worker machines, also known as nodes. Together, they form a Kubernetes cluster. While it is possible to have a single-server installation for internal testing and development purposes, it is recommended to distribute the control plane across multiple servers and have more than one node. This approach enhances reliability, scalability, and overall performance.
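If you want to see what a cluster is actually made of, the control plane will tell you. Below is a small sketch, under the same assumptions as the earlier snippets, that lists the registered machines and whether they report themselves as ready:

    # Sketch: inspecting the machines that make up a cluster. Each item is a
    # node (worker or control-plane machine) registered with the control plane.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    for node in v1.list_node().items:
        ready = next(
            (c.status for c in node.status.conditions if c.type == "Ready"),
            "Unknown",
        )
        print(f"{node.metadata.name}: Ready={ready}")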

To host a Kubernetes cluster, you have several options. You can set it up on your own servers, but this comes with the hassle of managing the hosting software yourself. An alternative is to find a local (co-located) provider that specializes in Kubernetes clusters. Additionally, many public clouds offer managed Kubernetes instances. One of the remarkable features of Kubernetes is its standardization: even though it comes in different flavours, you have the flexibility to select any preferred supplier while reducing the risk of vendor lock-in.
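That portability is easy to demonstrate: because the API is standardized, the same client code can talk to clusters from different suppliers simply by switching kubeconfig contexts. The context names below are hypothetical placeholders:

    # Sketch: the same code targeting two different clusters (e.g. one on-prem,
    # one managed) by switching kubeconfig contexts. Context names are made up.
    from kubernetes import client, config

    for ctx in ("onprem-cluster", "managed-cluster"):
        config.load_kube_config(context=ctx)
        v1 = client.CoreV1Api()
        print(ctx, "->", len(v1.list_namespace().items), "namespaces")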

It is important to recognize that Kubernetes is a sophisticated system that requires diligent attention and careful management. Furthermore, there is presently a significant need for proficient expertise in Kubernetes, which can present challenges in its adoption as it introduces additional costs upfront for recruitment or upskilling.

Conclusion

Public clouds are not inherently bad; they simply represent one of many options available. Above the public cloud lies the sun, obscured but essential, with an invisible veil of vendor lock-in in-between. It is crucial to consider this hidden barrier before fully committing to the public cloud. Moreover, storing personal data of EU citizens in the public cloud poses challenges.

By harnessing Kubernetes to construct a private cloud, you can establish a secure and scalable infrastructure for your company’s data and applications. This approach allows you to address many of the concerns associated with the public cloud. However, before immersing yourself, it is vital to be cognizant of the advantages and disadvantages of any given technology. This holds true for private clouds as well, as the responsibility for your cloud infrastructure will once again fall squarely on your shoulders as in the good old on-prem server days.

In many cases, adopting a multi-cloud strategy that combines the strengths of both public and private clouds proves to be the optimal solution. Nonetheless, it is important to note that this approach may deviate from the principle of keeping things simple (KISS).