Cloud repatriation came about as an answer to today's challenging economic landscape. IT managers can save millions of dollars in subscription fees and other costs by moving their projects and data from the public cloud to private data centers.
However, while potentially profitable, it has a high cost of entry and a steep learning curve. In this article, we explain the idea behind cloud repatriation, its pros and cons, as well as our suggestions for how best to introduce it in your company.
Cloud repatriation is a strategic decision to migrate workloads from public clouds to private data centers. It is sometimes referred to as “de-clouding,” “unclouding,” or “reverse cloud migration.”
Cloud repatriation is one method of optimizing the costs of running a digital product. This solution also enhances security and control over your resources.
In 2024, organizations around the world are still looking for ways to cut costs and survive, and cloud repatriation has become popular as part of those efforts. Small and medium-sized businesses, which often find a cloud-only approach too expensive and ill-suited to their needs, are leading the trend as they look to escape vendor lock-in.
Below, we have listed some of the benefits that this move can bring to your business:
Cost efficiency
As of February 2024, there are only a few well-documented cases of successful cloud repatriation. The most notable one is Dropbox. In 2016, they moved 90% of their data to a private infrastructure, which cost them 53 million dollars to build. Over the following two years, this move brought them savings of nearly 75 million dollars.
Another notable example is Ahrefs, an SEO software company. In 2020, they moved some of their resources into a shared space within a data center in Singapore. They estimate that by doing so, they saved $400 million over three years, the amount they would otherwise have paid for comparable cloud capacity.
Smaller businesses, such as Basecamp, estimate that de-clouding (their name for cloud repatriation) will save them 1.4 million dollars annually. Their sub-brand, HEY, already went through cloud repatriation in 2022 and cut 7 million dollars in operating costs as a result.
Think of cloud services as property: owning your cloud infrastructure, much like owning a home, might be more cost-effective than renting through public cloud subscriptions.
Owning and maintaining your own cloud infrastructure might be cheaper than relying on public cloud services, but only in the long run.
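To make that "long run" concrete, here is a minimal back-of-the-envelope break-even calculation. All figures in it (the monthly cloud bill, the upfront investment, the on-premises running costs) are hypothetical placeholders rather than numbers from the case studies above; plug in your own estimates before drawing any conclusions.

```python
# Back-of-the-envelope break-even estimate for cloud repatriation.
# All numbers are hypothetical placeholders; substitute your own figures.

monthly_cloud_bill = 250_000        # current public cloud spend (USD/month)
upfront_investment = 4_000_000      # servers, networking, data center fit-out (USD)
monthly_on_prem_opex = 120_000      # power, space, staff, maintenance (USD/month)

monthly_savings = monthly_cloud_bill - monthly_on_prem_opex

if monthly_savings <= 0:
    print("On-prem running costs exceed the cloud bill: repatriation never breaks even.")
else:
    breakeven_months = upfront_investment / monthly_savings
    print(f"Estimated break-even point: {breakeven_months:.1f} months")

    # Net position over a five-year horizon, after recovering the upfront cost
    horizon_months = 60
    net_savings = monthly_savings * horizon_months - upfront_investment
    print(f"Estimated net savings over 5 years: ${net_savings:,.0f}")
```

With these placeholder numbers, the investment pays for itself after roughly 31 months, which is exactly why the savings only materialize in the long run.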
The growing focus on solutions that help organizations save money and use their resources more effectively has also had an impact on cloud providers like Google. They now support FinOps, a practice aimed at optimizing cloud costs and providing better financial management of cloud resources. This approach helps businesses manage their budgets more efficiently while still getting the services they need. Some companies go as far as to create separate teams or departments dedicated to increasing cost-effectiveness through technology.
Enhanced control and security
To mitigate the risk of data breaches, businesses are keen on taking security into their own hands. In fact, after cost-effectiveness, this is the second most common incentive to conduct cloud repatriation.
The truth is, private data centers don't come with a guarantee of improved security; they only give you better control over the protection of your data. This creates an opportunity to deliver higher security standards than popular cloud providers offer, but only if the business has the tools and skills to do so.
Reduced vendor dependency
Whether the cloud is public or private, certain technologies, tools, or services are hard to replace. Some degree of vendor lock-in is to be expected.
When an organization opts for cloud repatriation, it gains more control over its data and infrastructure. The balance of power shifts in its direction, as it is no longer locked in with a single cloud provider. This allows it to choose its own hardware and software and stay free of the long-term contracts that public cloud providers typically require.
Minimal changes to your operating model
Moving from public clouds to private data centers usually means adjusting your internal processes, systems, and technology. In other words, your operating model.
This creates an entry barrier, as software engineers need time to adjust to the new way of working. However, that is true only for some businesses: those relying on Kubernetes will find the transition much easier.
Kubernetes serves as a stable platform for application deployment, whether it's in the public cloud or an on-premises data center. Its consistency across different environments reduces the learning curve for developers, streamlining the migration process and minimizing operational disruptions.
Companies that set up their public cloud workloads on a Kubernetes platform can continue to use it after cloud repatriation (see the sketch after the list below). In this setting, Kubernetes provides:
- A common, unified abstraction layer.
- A consistent approach.
- A familiar operating model.
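As a quick illustration of that unified abstraction layer, the sketch below applies one and the same Deployment to a managed public-cloud cluster and to an on-premises cluster, switching nothing but the kubeconfig context. It assumes the official `kubernetes` Python client and a kubeconfig that already contains both contexts; the context names and the container image are made up for the example.

```python
# Minimal sketch: the same Deployment spec targets a public cloud cluster and
# an on-prem cluster; only the kubeconfig context changes.
# Assumes `pip install kubernetes` and a kubeconfig with both contexts defined.
from kubernetes import client, config

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "web"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="web", image="nginx:1.25")]
            ),
        ),
    ),
)

# Hypothetical context names: one managed cloud cluster, one private data center.
for context in ("gke-production", "on-prem-production"):
    config.load_kube_config(context=context)   # point the client at the next cluster
    apps = client.AppsV1Api()
    apps.create_namespaced_deployment(namespace="default", body=deployment)
    print(f"Deployment 'web' applied in context '{context}'")
```

The manifest, the tooling, and the day-to-day workflow stay identical on both sides, which is why teams already standardized on Kubernetes feel little of the migration.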
Other cost-related benefits of cloud repatriation
Financial savings from switching to a private cloud come from multiple areas. Below, we have listed factors that contribute to cloud repatriation’s cost-effectiveness:
- Predictable costs.
- Cost-optimized reliability.
- Compatibility with low-cost hardware.
- A curated set of internally managed services with a known cost structure.
As of February 2024, cloud repatriation is still considered a novelty. Like other tech trends, it might explode in popularity or fizzle out quietly.
The businesses that decide to go through with it have to face the possibility that in a few years, they might be left with an expensive data center infrastructure and very few specialists on the market who will know how to maintain it.
And even if the cloud repatriation trend catches on, there are other risk factors to consider, such as:
Drop in performance
Public clouds offer resources on a scale beyond what most private data centers are able to match. When deciding on cloud repatriation, you need to accept higher latency and a drop in performance.
Attracting the right talent
Right now there are very few experts in this area, as most specialists are reshaping their skills toward modern cloud-native technologies. Recruiting them adds to the initial investment in on-premises infrastructure. Unless cloud repatriation becomes mainstream, adopting it carries a risk of generating additional costs.
Complexity of cloud repatriation
Cloud repatriation is a complex process. Migrating the data, setting up the infrastructure, and maintaining data centers require know-how, time, and effort. A sensible course of action would be to start with an audit of your organization and see if you have sufficient resources and manpower.
Risk of data loss
While there are steps you can take to protect data during migration, losing some of it is still a plausible scenario.
Most companies mitigate that risk by hiring a technology partner with the right expertise. VirtusLab has a proven track record of successful data migration projects. Contact us to find out how we can help you.
High initial cost
Hiring new specialists, buying physical servers, and renting new space all add to the initial cost of conducting cloud repatriation. Cost savings will emerge over time; however, be prepared for a substantial initial investment.
New responsibilities and challenges for the management
After a successful cloud repatriation, your management will face new responsibilities and challenges. Public cloud providers usually include supply chain, hardware, and incident management in their services. With a private data center, your business needs to develop strategies to manage these functions in-house.
Also, some specialists might be skeptical about working with a private cloud. Dealing with a potential decrease in developer satisfaction is another challenge your management must prepare for.
The success of cloud repatriation comes down to careful planning and preparation. Our engineers categorize necessary tasks into high-level and low-level steps. Here's what businesses should consider before switching to private data centers:
High-level steps you should consider before conducting cloud repatriation
- To enable data-driven decision-making, your business needs to begin with a migration plan. It should outline your expectations, the system’s architecture, data flow, etc. The plan should include a timeline that sets realistic expectations for the entire process.
- Additionally, your plan should account for assembling a dedicated data migration team that will keep your day-to-day business running smoothly during the migration.
- Unless your business is already using the Kubernetes platform, you will need to prepare a new operating model. You want to avoid a situation in which cloud repatriation forces a significant change in the way your teams work, as this risks creating confusion.
Low-level steps required for cloud repatriation
- Your business needs to analyze which teams would benefit from cloud repatriation the most and create a funnel based on your findings. It should take into account the degree of cloud dependency, team maturity, and whether the team is customer-facing (a rough scoring sketch follows this list). The analysis should also help you establish the order in which the teams should move to private data centers.
- Prepare your business to effectively manage data centers. For example, physical servers require maintenance and technical know-how to run efficiently. It’s a completely different set of skills compared to running a cloud-only infrastructure.
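To illustrate how such a funnel could be put together, here is a rough scoring sketch based on the criteria mentioned above: degree of cloud dependency, team maturity, and customer exposure. The teams, weights, and scales are invented for the example; a real assessment would use your own criteria and data.

```python
# Illustrative only: rank teams for a cloud repatriation funnel.
# Lower cloud dependency, higher maturity, and no customer-facing traffic make
# a team a safer early candidate. All weights and data below are hypothetical.
from dataclasses import dataclass

@dataclass
class Team:
    name: str
    cloud_dependency: int   # 1 = few managed services ... 5 = deeply cloud-native
    maturity: int           # 1 = new team ... 5 = strong operational practices
    customer_facing: bool

def migration_priority(team: Team) -> float:
    """Higher score = better candidate to migrate early."""
    score = (5 - team.cloud_dependency) * 2.0 + team.maturity * 1.5
    if team.customer_facing:
        score -= 3.0        # move customer-facing workloads later in the funnel
    return score

teams = [
    Team("internal-tools", cloud_dependency=2, maturity=4, customer_facing=False),
    Team("checkout", cloud_dependency=4, maturity=5, customer_facing=True),
    Team("data-platform", cloud_dependency=5, maturity=3, customer_facing=False),
]

for team in sorted(teams, key=migration_priority, reverse=True):
    print(f"{team.name}: priority score {migration_priority(team):.1f}")
```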
The future landscape of the IT industry will decide whether the cloud repatriation trend catches on. Below, we have listed three tendencies that we believe will be influential:
- There's a lot of buzz about AI right now, and companies are taking an interest in adopting it as widely as they can. Unfortunately, private data center infrastructure may face challenges in supporting AI workloads.
For example, OpenAI's models and services run on Azure's cloud infrastructure and are optimized for development and access through cloud platforms rather than on-premises data centers.
As AI development continues to be cloud-centric, businesses that focus solely on a private cloud strategy may potentially be limiting their own growth and innovation potential.
- Technologies like Anthos, Azure Arc, and AWS Outposts are gaining traction. With these solutions, cloud providers aim to consolidate the market by bridging private and public clouds. The three are currently competing for dominance, and the future of cloud repatriation may largely depend on the outcome of that rivalry.
- Edge computing is becoming increasingly popular. Wider adoption of this paradigm could help bridge the gap between private and public clouds. Perhaps this could become the ideal approach: seamless private and public clouds with no distinction between the two.
In reality, it is unlikely that full-scale cloud repatriation is the answer. Gartner calls it more of "an exception to the rule," and we agree. There are other ways for companies to cut their technology expenses. For example, some businesses have started focusing on FinOps, mentioned earlier, to optimize cloud costs and gain better financial control over their cloud resources, and some go as far as to create separate teams or departments dedicated to increasing cost-effectiveness through technology.
Cost-effectiveness aside, the public cloud has its benefits too. For example, if Netflix moved to a private cloud, we wouldn't be able to access every TV show instantly anywhere in the world, due to the distance from the physical servers.
The right answer depends on the unique characteristics of each business. Right now our view is that the hybrid approach offers the best of both worlds.
Cloud repatriation has a long way to go. In 2024 we could see it rise in popularity, but we might as well see it fall.
Curated by Krzysztof Radzik