AI and the demand for cloud computing: Tipping the scale towards a more sustainable future
Tue 6 Jun 2023
In this opinion piece, Matt Hawkins discusses the environmental impact of artificial intelligence (AI) and the potential of sustainable cloud computing to offer a more environmentally friendly future. Matt explores the burgeoning problem of energy-intensive data centres that cater to AI demands and suggests potential solutions, such as distributed cloud computing, to address these sustainability challenges.
– – – – – –
We are living in a world dominated by technology. Machine learning and artificial intelligence’s rapid evolution – particularly generative AI – has opened up a world of possibilities for developers, such as leveraging open-source large language models to create AI-driven tools.
But this progress has come at a cost. Soaring demand for AI has triggered a boom in the construction of energy-intensive data centres. And this is causing untold environmental damage.
A Gold Mine – But At What Cost?
Oil and data are not dissimilar. You mine and refine both – and are left with a highly lucrative commodity. AI’s market value is estimated to be almost $100bn, with forecasts predicting this will grow twentyfold by 2030.
Mirrored values, however, aren’t the only trait they share. Pursuit of both often leads to a crippling impact on the earth. The need for unrivalled computing power – not to mention the storing, processing, moving, and managing of data – has led to the growth of vast data centres. And these are driving the conversation around how sustainable this model can be for our future.
Substantial amounts of electricity are required to power these data centres, with larger facilities consuming as much electricity as a medium-sized town. The UK currently hosts 75 data centres – with 11 more under construction. These could account for over a quarter of the nation’s electricity output by 2029.
Thirsty Work for AI
OpenAI’s ChatGPT – the foremost generative AI tool – has dominated headlines courtesy of its ability to respond to inputs with human-like language. However, the chatbot consumes a staggering amount of water in the process. A conversation of between 20 and 50 questions could see the chatbot ‘drink’ up to a 500ml bottle of water.
Energy-intensive air conditioning units ensure data centres operate at optimal capacity. But irrespective of whether air is blown throughout the building or different floor layouts are used, the result remains the same: significant energy investment is a must.
Training GPT-3 in Microsoft’s data centres may have consumed 700,000 litres of water alone – enough to produce 370 BMW cars. What’s more, GPT-4, the latest iteration, requires even more computing power and even more water. Staying with the car analogy, training new AI models can emit more than 283 tonnes of carbon dioxide. This equates to almost five times the lifetime emissions of the average American car, including manufacturing.
The status quo isn’t sustainable, either long-term or in the immediate future. With data centres’ massive environmental impact and the widespread adoption of AI, sourcing eco-friendly alternatives for cloud computing should be top of mind.
The Role of Sustainable Cloud Computing in Reducing AI’s Environmental Impact
Distributed cloud computing as a solution
Technology is the great enabler when it comes to sourcing a sustainable solution.
Sustainable cloud computing technology – specifically the distributed variant – offers an opportunity to address climate challenges whilst simultaneously enhancing efficiency. The ability to monetise spare cloud resources through distributed cloud computing can also be a powerful cost-cutting measure. The global cloud computing market was valued at $483bn in 2022, and is expected to expand at a compound annual growth rate of 14.1% from 2023 to 2030.
Think of distributed cloud computing as the Airbnb of the tech landscape. Its shared-economy model allows individuals and organisations to capitalise on others’ unused computing resources, promoting efficient resource utilisation while generating additional revenue.
Spreading AI workloads across multiple data centres via distributed cloud computing services can be the stepping stone to advancing our sustainable efforts. Reducing individual facilities’ energy consumption and minimising the need for physical infrastructure is an effective counter to the sustainability issues plaguing the current centralised cloud market.
Amazon Web Services (AWS), Google Cloud Platform (GCP) and Microsoft Azure are all hyperscale providers. Migrating appropriate workloads to a decentralised cloud that can leverage a vast network of devices otherwise sitting idle – ranging from the 8,000 data centres globally to gaming PCs and consoles in the future – can reduce the over-reliance on individual providers whilst tackling the rising tide of e-waste. This can also mitigate the risk of disruptive and damaging outages.
Turn up the heat
The round-the-clock operation of data centres generates massive amounts of heat – traditionally considered a waste product. But innovative approaches to this problem can repurpose and recycle this heat to fuel other projects.
Exmouth Leisure Centre has installed a miniature data centre immersed in oil. Capturing the excess heat and deploying it to warm the centre’s swimming pool for 60% of the time is a novel way to conserve energy.
Similarly, QScale – a data centre development firm in Canada – has partnered its data centres with greenhouses. This strategy, which enables the company to cultivate over 80,000 tonnes of tomatoes annually, demonstrates the potential for productive and sustainable utilisation of data centres.
Individually, these initiatives would struggle to make a big impact in resolving AI’s exponential demand for a broader and deeper pool of compute power. However, as part of a distributed cloud framework they can be combined to address much larger projects, delivering environmental and commercial rewards for both the innovative supplier and the user.
Addressing generative AI’s substantial environmental impact isn’t a simple matter. It requires collaboration between AI researchers, data centre operators and policymakers. Engaging in open dialogue and cooperation can drive responsible practices and ultimately, support tech innovations with sustainability at the core.
About the Author
Matt Hawkins is a tech entrepreneur who founded C4L in 2000, one of the fastest-growing ISPs in the UK and winner of many awards, including the Times Tech Track 100, Deloitte UK Top 50 and Deloitte EMEA Fast 500. C4L was acquired in 2016, allowing Matt to start CUDO to make better use of the world’s computing capacity, leading the company to a top-eight placing in Deloitte’s Tech Fast 50 2022.
If you have an opinion you’d like to share, please contact our Editor, Stuart Crowley, at [email protected].