When you think about climate change and the global efforts to reduce greenhouse gas emissions, most of us can probably name the ‘usual suspect’ industries most often singled out for their detrimental impact.
Until recently, the role of technology in the ongoing warming of our planet has been largely overlooked. Although billions of computing devices around the world draw on mains electricity – and therefore contribute to fossil fuel consumption – that demand has mostly been lumped in with general domestic and business energy use.
With the rise of the independent data centre, things have started to change. With their great banks of servers, routers and switches, plus their state-of-the-art cooling and security systems, data centres consume electricity on an industrial scale which sits very obviously apart from general domestic and business use. And this has led to questions about the potential impact they, and technology in general, are having on the environment at large.
As we’ve seen the world transition to a data economy, we’ve seen data centres become key hubs in a new kind of trade and exchange. From the growth of digital giants like Amazon, Google and Facebook to the emergence of the Internet of Things, plus the explosion in use of mobile internet and Cloud computing, data drives everything. TechUK states, “data centres enable and drive this new type of economy the way that heavy industry drove the manufacturing economies of the past.”
Yet just as there was with heavy industry, there is an environmental price to pay for the growth of data centres, too. Back in 2016, it was reported that the world’s data centres consumed more energy than every home and business in the UK combined, and that data centre consumption was predicted to treble in the course of a decade. More recently, it has been suggested that data centres contribute 2 per cent of all carbon emissions, which puts them on a par with the much-maligned aviation industry.
The truth is, much like air travel, we are not going to turn back the clock on the digital revolution and revert to a time before the internet, smartphones and cloud computing, however pressing the claims of climate change are. The world has long passed the point of no return in digital adoption, and it needs the infrastructure – with data centres at the heart of it – to support the use of technology. Moreover, all the predictions point to data consumption continuing to grow at a rate of knots, which also means more data centres, more banks of servers – and more electricity consumed.
What the technology industry cannot be, of course, is complacent about the impact that data centres are having on climate change. In an ideal world, all electricity would be generated from renewable sources, which would solve the issue at a stroke. Yet despite claims from the likes of Apple and Google that they are already sourcing 100% of their energy requirements from renewable sources, the reality is that fossil fuels are likely to play a role in energy production for decades to come. The onus falls, then, on data centre operators and service providers to find ways to make data centres significantly more energy efficient and thus reduce their carbon footprint.
Thankfully, this is a challenge already being taken up by the global technology community. Here are some of the ways the industry is working to make data centres greener and more sustainable, so everyone can stay online while prioritising how we protect our planet.
Carbon neutral cooling
As much as 40% of the energy consumed by a data centre goes into powering cooling systems. That is because servers, like most types of electrical equipment, generate heat. When you run hundreds or even thousands of servers in close proximity to one another, all of this combined heat becomes a serious threat to equipment reliability and safety, and it has to be removed somehow.
The traditional approach of installing fans and aircon-type evaporation temperature control systems creates a huge additional burden on energy consumption. One solution is to locate data centres in colder places where low ambient environmental temperatures will take more of the heat, so to speak. Well-known examples of this in action include the Hamina data centre in Finland and the Green Mountain facility in Norway, both of which also make use of cold sea water in their cooling systems. Microsoft has taken this concept a step further by experimenting with small server banks submerged under the sea off Orkney.
Of course, it would be impractical to locate all of the world’s data centres in colder latitudes just for the sake of cooling, particularly with some countries, including China and several African nations, passing laws that require private data to be stored in domestic data centres. Operators are therefore keen to find solutions that will enable cooling systems to run more efficiently even in the hottest locations. One promising possibility is building liquid cooling systems directly into servers – developers of a pioneering ‘wet’ server claim reductions in energy consumption for cooling of between 80% and 97% in any environment. Energy efficiency of servers can be further improved through choice of components and case design.
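To make the scale of those claimed savings concrete, here is a rough back-of-envelope sketch in Python. The 1,000,000 kWh annual baseline is purely hypothetical; the 40% cooling share and the 80–97% reduction range are the figures quoted above.

```python
# Hypothetical baseline: what an 80-97% cut in cooling energy would mean
# for a facility where cooling accounts for ~40% of total consumption.
total_kwh = 1_000_000           # hypothetical annual consumption
cooling_kwh = 0.40 * total_kwh  # ~40% of the total goes to cooling

for reduction in (0.80, 0.97):
    saved_kwh = cooling_kwh * reduction
    new_total_kwh = total_kwh - saved_kwh
    print(f"{reduction:.0%} cut: saves {saved_kwh:,.0f} kWh, "
          f"new total {new_total_kwh:,.0f} kWh")
# 80% cut: saves 320,000 kWh, new total 680,000 kWh
# 97% cut: saves 388,000 kWh, new total 612,000 kWh
```

In other words, under these assumptions a 'wet' server approach could cut a facility's overall consumption by roughly a third, before any other efficiency measures are considered.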
Hyperscale data centres
In an industry as high-tech as data centres, it seems only right that one of the main solutions for making our IT infrastructure greener should be found in cutting-edge technologies. One of the trends viewed as driving greater energy efficiency is the growth of so-called hyperscale data centres – truly massive facilities housing tens of thousands of servers. While the simple laws of consolidation and economies of scale offer some explanation as to why such out-sized centres are more efficient than running a number of separate individual facilities, the true potential of hyperscale facilities comes from their sophisticated design and application of state-of-the-art technologies.
As this article on ZDNet explains, the concept of hyperscale is less about size than scalability. The reason the likes of Facebook, Amazon and Google have been able to pioneer truly massive data centre facilities is that they have taken a modular and automated approach to data centre architecture that borrows a lot from the principles of lean manufacturing. If you break down the way a data centre operates into discrete units, make each unit as efficient as possible and then piece each module together so resources are allocated evenly and equally, you can maximise overall yield while minimising waste.
The thinking is, then, that the development of hyperscale data centres will not just be about more and more huge facilities being built, but about an ultra-efficient, ultra-scalable design approach being applied to data centres of all sizes. This will be supported by the growth of Artificial Intelligence (AI) to deliver the next generation of intelligence-led automation. According to IDC, by 2022 AI will be embedded in around half of all data centre components, providing real-time dynamic control of power consumption, temperature and humidity as well as predictive maintenance and efficiency monitoring.
For operators like M247, reducing our carbon footprint is about keeping up to date with the latest technologies – the most efficient servers, the best low-energy cooling solutions, the most cutting-edge innovations in design and infrastructure – and making them available so our customers can enjoy first-class IT and connectivity services without concerns over their impact on the environment.
M247 data centres
Data centres are set to be among the biggest global energy consumers over the next 5-10 years, so energy consumption – and ways to reduce it – is becoming increasingly important. Data centre operators and designers are constantly innovating and adjusting systems to cut energy use: free-cooling systems, higher-efficiency UPSs, turning off idle equipment, and even small measures such as fitting blanking plates or raising temperature setpoints.
Whilst keeping customers up and running is M247’s number one objective, we are also conscious of the impact this zero-downtime industry has on the environment. The impact is amplified in the summer months, when higher temperatures push up air conditioning energy consumption to keep the data centres cool. M247 is proud to say our data centres in both the UK and Romania have green credentials: our Power Usage Effectiveness (PUE) averages 1.4, and it falls further in the winter months when free cooling is available.
To put this in context: “big operators, such as hyperscale cloud companies and big colos regularly claim annual or design PUE figures between 1.1 and 1.4. It is an industry success story — a response to both higher power prices and concerns about carbon dioxide emissions.” (Uptime Institute Journal: https://journal.uptimeinstitute.com/is-pue-actually-going-up/)
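PUE is simply total facility energy divided by the energy delivered to IT equipment, so a figure of 1.4 means 0.4 units of overhead (cooling, power distribution, lighting) for every unit of useful IT load. A minimal sketch, with hypothetical monthly figures:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 would mean every watt reaches the IT equipment; higher values
    indicate more overhead spent on cooling, power conversion, etc.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical month: 700 MWh drawn in total, 500 MWh reaching IT equipment
print(round(pue(700, 500), 2))  # 1.4
```

Because the IT load is the denominator, a site can improve its PUE either by cutting cooling overhead or, in winter, by exploiting free cooling – which is why the average figure quoted above drops in the colder months.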
M247 consider ourselves an innovative data centre operator, and we employ a wide array of energy-saving measures to reduce our environmental impact, including:
(1) Eco-cooling – an evaporative (adiabatic) cooling solution that uses a water curtain to cool the air being drawn into the data centre.
(2) Chilled water with free-cooling – a system that utilises low ambient temperatures to cool the water without the need for energy-hungry compressors.
(3) Expert engineering teams – trained in energy efficiency and adopting best practices for cooling and power.
These are just some of the measures we use; together they reduce the need for air conditioning, cutting both energy consumption and costs. Data centre environments are actively monitored using advanced Building Management Systems (BMS), which by default run a mix of eco/free-cooling and air conditioning; if temperatures reach a point where performance may be affected, automated actions are taken to restore optimal climate conditions.
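The kind of threshold logic a BMS applies can be sketched roughly as follows. The setpoints, thresholds and mode names here are illustrative assumptions for the sketch, not M247’s actual configuration:

```python
def select_cooling_mode(ambient_c: float, room_c: float,
                        setpoint_c: float = 24.0,
                        free_cooling_max_ambient_c: float = 18.0) -> str:
    """Pick a cooling mode the way a BMS might: prefer free cooling when
    outside air is cold enough, and fall back to mechanical air conditioning
    when the room climbs too far above its setpoint. Thresholds are
    hypothetical examples."""
    if room_c > setpoint_c + 2.0:
        return "mechanical"    # room too warm: run the compressors
    if ambient_c <= free_cooling_max_ambient_c:
        return "free-cooling"  # cold outside air can do the work alone
    return "mixed"             # blend eco/free cooling with some AC

print(select_cooling_mode(ambient_c=10.0, room_c=23.0))  # free-cooling
print(select_cooling_mode(ambient_c=30.0, room_c=27.0))  # mechanical
```

A real BMS would of course also weigh humidity, chilled-water temperatures and equipment health, and would ramp systems gradually rather than switching modes outright, but the principle is the same: use the cheapest cooling available that keeps the room within its safe envelope.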