The Energy Ethics Of Using ChatGPT

December 17, 2024

One of the less discussed conversations about the ethics of using Artificial Intelligence (AI) and generative AI platforms like ChatGPT concerns the energy needed to power these tools.

While we’re able to use AI to truly push the boundaries of creativity and innovation, there are major tradeoffs in the process. Experts predict AI will consume 3.5% of global energy by 2030, which leaves users with an interesting conundrum: Can the efficiency gains ever justify the environmental impact?

ChatGPT surpassed 180 million users within 18 months of its launch, and that wide array of account holders turns to the platform for a myriad of reasons. From generating a travel itinerary to ghostwriting a screenplay, the tool delivers results within a few clicks.

These prompts and responses come at a cost. 

When ChatGPT searches the internet to deliver results, it is estimated to consume roughly ten times as much electricity as a Google search. Large amounts of water are also used to keep the servers responsible for this process cool: every “conversation” (20-50 prompts) requires approximately half a liter of water. Multiply half a liter by 180 million users, and it starts to paint a worrisome picture of just how taxing these tools can be on natural resources.
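As a rough back-of-the-envelope illustration (assuming, purely for scale, that each of those 180 million users holds just one such conversation), the numbers work out as follows:

```python
# Back-of-envelope estimate of ChatGPT's water footprint, using the figures above.
# Assumption (for illustration only): each of the 180 million users holds one
# 20-50 prompt "conversation", at roughly half a liter of cooling water each.
liters_per_conversation = 0.5
users = 180_000_000

total_liters = liters_per_conversation * users
print(f"{total_liters:,.0f} liters")                # 90,000,000 liters
print(f"{total_liters / 1_000:,.0f} cubic meters")  # 90,000 cubic meters of water
```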

The Impact of AI Model Training 

While the amount of energy used for prompts and queries gives cause for concern, it’s not the main issue. Before a generative AI platform can deliver contextually accurate responses, the underlying model has to be trained, and that training process is where much of the energy goes.

Training large language models, like the Generative Pre-trained Transformer (GPT) family, requires supercomputers that process huge amounts of text data from the internet at extraordinary speed. According to research from Stanford University, training GPT-3 required an estimated 1,287 megawatt-hours (MWh) of electricity.

The impact on resources doesn’t stop there. 

As developers at OpenAI work to retain their popularity and dominance in the industry, each new iteration of the technology demands more energy than the last. Gartner took a closer look at this trend and projected that by 2030, AI will consume as much energy as France.

This has a direct effect on climate change, with the massive energy consumption increasing greenhouse gas emissions. In the U.S., 16% of the energy powering AI servers is still generated from coal. Training GPT-3’s language model generated 502 tonnes of carbon dioxide, the equivalent of 50 Belgian cars driving to the moon and back. According to Gartner, “The negative impact of AI needs to be mitigated. Business leaders should be aware of the growing environmental impact of AI and take action.”

Experts urge AI companies to be mindful of their resources and opt for wind, solar, or nuclear energy. Google has emerged as a leader in this regard, running its operations solely on renewable energy since 2015. The company has also highlighted the importance of finding alternative cooling solutions to minimize the water consumption of AI data centers.

Fluctuating carbon dioxide emissions

AI is a relatively new technology, and the industry has grown significantly over the past five years. Looking at generative AI specifically, it’s still early days: Microsoft’s Copilot and X’s Grok were only released in 2023, while DALL-E, the leader in text-to-image AI, was first introduced in 2021. Even though the industry is still in its infancy, the data centers used to train these tools already account for 1% of global energy consumption.

The location of these companies’ servers, and the energy mix of the host country, make a real difference to their carbon footprints.

One school of thought takes into account the fact that different countries, and even regions within a country, face different environmental pressures, and advocates for geographical load sharing. Essentially, developers would distribute AI training across data centers, identifying environmentally stressed areas and shifting workloads to centers with cleaner, more plentiful energy.

In practice, this would mean placing servers in areas that are not water-stressed and remotely deploying training to regions where renewable energy sources are already in use.

Another innovative solution, “following the sun,” has gained traction in the industry. Instead of training AI in a single location, companies shift training workloads between data centers throughout the day, following the patterns of sunrise and sunset. This allows solar energy to be used effectively and reduces the environmental impact of AI training.

BLOOM, a large language model developed as an open-source project, has demonstrated this. BLOOM’s technology is comparable to ChatGPT’s, yet its training generated only around 25 tonnes of carbon dioxide, thanks in large part to the largely carbon-free nuclear energy that powered it. GPT-3’s carbon footprint is estimated to be 20 times higher, roughly the equivalent of 300 return flights between Paris and New York.
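As a quick sanity check, the two estimates line up using only the figures quoted in this article:

```python
# Consistency check using only the figures quoted in this article.
bloom_tonnes_co2 = 25    # BLOOM's reported training footprint
gpt3_multiplier = 20     # GPT-3 estimated at roughly 20x BLOOM

gpt3_tonnes_co2 = bloom_tonnes_co2 * gpt3_multiplier
print(gpt3_tonnes_co2)   # 500 tonnes, in line with the ~502 tonnes cited earlier
```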

AI supporting the planet

But it’s not all doom and gloom where AI and the environment are concerned. Artificial Intelligence can also generate powerful insights for combating climate change: meteorologists find AI helpful in predicting natural disasters and extreme weather events, and it can be used to optimize carbon-heavy industries to reduce their emissions.

As technology companies race toward AI dominance, public disclosure and transparency will be the challenge. No policy framework yet exists to hold the industry accountable, so information about where AI models are trained, and the carbon footprint they generate, is disclosed at each developer’s discretion. For now, the hope is that users will show discernment about when to turn to AI rather than generating hundreds of prompts for basic queries.

Conclusion

While the energy consumption and environmental impact of AI, especially models like ChatGPT, are undeniable, it is crucial to balance these concerns with the transformative potential of these technologies. AI can play a pivotal role in addressing climate change, from improving the accuracy of weather forecasts to optimizing energy-efficient industrial practices.

However, AI’s environmental footprint must be factored into how it is developed and used. By shifting to cleaner energy sources, optimizing data center locations, and increasing transparency about AI’s carbon costs, some of these impacts can be mitigated.

Ultimately, the responsibility falls on both developers and users to make informed decisions, ensuring that the benefits of AI do not come at the expense of the planet’s health. As the demand for AI grows, so must the commitment to sustainability in its deployment.
