
AI and Sustainability: Is There a Problem?

Dive into the natural tension between sustainability goals and AI innovation, and how flash storage can help you strike a balance.


Introduction

By Patrick Smith, VP, EMEA Field CTO, Pure Storage

AI can do more and more. Pick any topic and an AI or genAI tool can effortlessly generate an image, a video, or text. Yet the environmental impact of that generation is often forgotten. Generating a single image with AI, for example, consumes about as much power as charging a mobile phone, a relevant fact when you consider that more and more organizations are betting on AI.

After all, training AI models requires huge amounts of data, and massive data centers are needed to store it all. There are estimates that AI servers, in an average scenario, could consume in the range of 85 to 134 TWh of electricity annually by 2027, roughly the annual electricity consumption of the Netherlands.

The message is clear: AI consumes a great deal of energy and will therefore have a real impact on the environment.

Does AI Have a Sustainability Problem?

Creating a useful AI model requires a number of things, including training data, sufficient storage space, and GPUs. Each component consumes energy, but the GPUs consume by far the most. According to researchers at OpenAI, the amount of compute used in the largest AI training runs has doubled roughly every 3.4 months since 2012. That trajectory is likely to continue, given the popularity of AI applications of all kinds, and its environmental impact grows with it.
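To put that doubling rate in perspective, a bit of back-of-the-envelope arithmetic on the figure above: doubling every 3.4 months is about 3.5 doublings per year (12 / 3.4 ≈ 3.5), which works out to more than a tenfold increase in compute, and, all else being equal, in the energy behind it, every twelve months.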

Organizations wishing to incorporate AI should therefore carefully weigh its added value against its environmental impact. It is unlikely that a decision maker would shelve a project or initiative on those grounds alone, so this is about having your cake and eating it too: looking at the bigger picture and picking technology that meets both AI and sustainability goals. In addition, the underlying infrastructure and the GPUs themselves need to become more energy-efficient. At its recent GTC user conference, NVIDIA highlighted exactly this, paving the way for more to be achieved with each GPU at greater efficiency.

Reducing the Impact of AI on the Environment

Several industries play an important role in training and deploying AI models: the storage industry, the data center industry, and the semiconductor industry. To reduce AI's impact on the environment, each of these sectors needs to take steps to improve sustainability.

Pure Storage's Efficiency Helps Your IT Team and Protects the Environment

Our intelligent product design can reduce greenhouse gas emissions by up to 85% compared with other all-flash storage vendors.

Read the ESG Report

The Storage Industry and the Role of Flash Storage

In the storage industry, concrete steps can be taken to reduce the environmental impact of AI. All-flash storage solutions, for example, are significantly more energy-efficient than traditional disk-based storage (HDD); in some cases, all-flash solutions can deliver a 69% reduction in energy consumption compared to HDD. Some vendors are even going beyond off-the-shelf SSDs and developing their own flash modules, allowing the array's software to communicate directly with the flash. This makes it possible to maximize the capabilities of the flash and achieve even better performance, energy usage, and efficiency; in practice, data centers require less power, space, and cooling.
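As a rough illustration of what a 69% reduction means (simple arithmetic on the figure above, not a sizing exercise for any specific environment): a disk-based storage estate drawing 100 kW would draw roughly 31 kW on comparable all-flash, and the saving compounds further through reduced demands on cooling and floor space.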

Data Center Power Efficiency

Data centers can take a sustainability leap with better, more efficient cooling techniques and by making use of renewable energy. Many organizations, including the EU, look at Power Usage Effectiveness (PUE) as a metric: the ratio of the total power entering a data center to the power actually used by the IT equipment inside it. While reducing PUE is a good thing, it is a blunt and basic tool that doesn't account for, or reward, the efficiency of the technology installed within the data center.
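For reference, the metric is simply PUE = total facility energy / IT equipment energy. A PUE of 1.0 would mean every watt entering the facility reaches the IT equipment; a PUE of 1.5 means that for every watt of IT load, another half watt goes to cooling, power distribution, and other overheads. Nothing in that ratio says whether the IT equipment itself is doing useful work efficiently, which is exactly why it rewards better facilities but not better technology choices inside them.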

Semiconductor Industry

The demand for energy is insatiable, not least because semiconductor manufacturers, especially those making the GPUs that form the basis of many AI systems, are making their chips increasingly powerful. For instance, 25 years ago a GPU contained one million transistors, was around 100 mm² in size, and did not use much power. Today, newly announced GPUs contain 208 billion transistors and consume 1,200 W each. The semiconductor industry needs to become more energy-efficient. This is already happening, as highlighted at the recent NVIDIA GTC conference, where CEO Jensen Huang noted that, thanks to advances in the chip manufacturing process, GPUs are doing more work per watt and so are more efficient despite the increased power consumption.
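Taken at face value, those figures mean transistor counts have grown by a factor of roughly 200,000 (208 billion versus one million) over 25 years, while power draw per GPU has grown far more modestly. That is why the work delivered per watt keeps improving even as the absolute consumption of each chip, and of the data centers full of them, continues to climb.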

Conclusion

It has been clear for years that AI consumes huge amounts of energy and can therefore have a negative environmental impact. The demand for AI-generated programmes, projects, videos, and more will keep growing in the coming years. Organizations embarking on an AI initiative need to measure the impact of their activities carefully. Especially with increased scrutiny on emissions and ESG reporting, it is vital to understand the repercussions of AI's energy consumption in detail and mitigate them wherever possible.

Initiatives such as moving to more energy-efficient technology, including flash storage, or improving data center capabilities can reduce that impact. Every sector involved in AI can and should take concrete steps towards a more sustainable course. It is important to keep investing in the right areas to combat climate change.


