Companies’ long-established dependence on public clouds has recently declined due to spiralling costs, security problems and outages. Businesses across sectors and geographies are reconsidering private, dedicated infrastructure to host their key applications.

In the latest instalment of our Cloud for Business article of the week series, where we highlight key themes from the Raconteur report distributed in The Sunday Times, we take a look at why businesses may be turning their backs on public cloud and bringing data and applications back in-house.


For well over a decade, businesses of all sizes have steadily placed their digital systems onto platforms run by large cloud providers. Everything from CRM and data storage to new applications and analytics has ended up on the servers of major cloud providers, which are shared between multiple businesses.

While executives often have strong expectations about saving money and ensuring elastic scalability, the realities of using public cloud services can be more complex. In many cases, businesses endure spiking costs, unexpected add-on expenses, security problems and little influence over the technology being used. A recent outage at one of the world’s largest cloud providers saw thousands of businesses’ core email and collaboration systems go offline for several hours.

“The buzz around public cloud provision has been really big, and to some degree that’s understandable given its scalability. But businesses rarely understand what they are actually getting into,” explains Jake Madders, co-founder and director of managed hosting provider, Hyve.

Typically, migrations to the public cloud are kicked off by well-intentioned developers, who see its capacity to support important and growing applications without having to buy new servers. But prices are rising as the number of providers in the market dwindles. Companies also experience ‘bill shock’ when they have surges in network traffic, which can happen for reasons as diverse as a successful marketing campaign or a DDoS attack.

There is another hidden cost. “Typically, businesses moving to the public cloud quickly find they need consultancies to help them manage the technology on a daily basis, given the complexity of choices in front of them at every stage. These consultancy costs often equal or exceed the actual cloud provision costs, and these challenges have prompted many businesses to rethink their approach. There has been a strong trend towards cloud repatriation, which means pulling data and apps out of the public cloud and back into controlled, secure private cloud setups,” explains Madders.

Private cloud gives businesses their own managed, dedicated servers, stronger security, better disaster recovery and backup options, consistent support and much more control over costs and technology choices. Businesses are choosing and running powerful, cost-effective and secure private clouds that meet local needs, including around low latency, data sovereignty and on-the-ground support.

“Choosing servers, processing power and a myriad of other options is incredibly complex, particularly from a cost and performance perspective. As businesses look to retake control of their cloud infrastructure, moving away from a reliance on a single provider, they are turning to the private cloud. Doing so empowers them in mission-critical areas – including high-performance computing and artificial intelligence – massively increasing their business capabilities and efficiencies.”

M247 thoughts: 

Businesses looking at the headline costs of public cloud might understandably be weighing up the financial benefits of repatriation – not just to private cloud, but back to private servers. The cost of networking, storage and compute hardware has dropped over the past decade, and in certain cases it will make economic sense to move workloads back to traditional data centres.

But businesses also need to consider why their public cloud costs might be so high. Chances are, there are some inefficiencies that can be remedied.

At the height of the pandemic, many businesses were forced into lightning-quick cloud migration. They took a lift-and-shift approach, and had little time to refactor workloads, applications, data and systems for the cloud.

Consequently, they haven’t been able to take advantage of the native capabilities of the public cloud that would have driven operational and cost efficiencies. Auto-scaling, security and storage management functionality has gone largely underused by many businesses, and they have ended up paying extra for the cloud provider to manage the complexity.

Retrospective refactoring can be time-consuming and costly though, and it may make sense for some of these businesses to repatriate some of their workloads. If certain workloads and data just need to be stored – untouched – for a period of time, repatriation to physical servers may make sense. But if the data is to be processed for business intelligence, it may make more sense to repatriate it to a private cloud setup, where the business can still harness the advanced insights and analytics that support growth-driving initiatives.

It’s also worth noting that, in the intervening years between migration and now, some workloads will have become public-cloud dependent, with specialised and advanced IT services involved. In these cases, repatriation to private cloud or on-premise hardware could prove a costly and complex process, or even impossible.

Businesses will need to give careful consideration to all these factors when deciding whether repatriation – and which type – makes the best sense for them.

To download your complete copy of the 2023 Cloud for Business report and read more articles like this, click here.
