
Courtesy of Microsoft
Inside a Chicago-area Microsoft data center. The center's construction required 2,400 tons of copper, 3,400 tons of steel, 26,000 cubic yards of concrete, and 190 miles of conduit (CNET).
ChatGPT is an AI assistant chatbot, but to many, it has become an essential tool and even a friend. Created by OpenAI in 2022, it is an artificial intelligence that uses natural language processing to engage in human-like dialogue. It is generative, meaning it produces new responses to prompts rather than retrieving stored answers; that is what the GPT stands for: Generative Pre-trained Transformer (TechTarget). The program is trained on a huge data set and uses its algorithm to locate patterns within it, then draws on those patterns to produce complex output (like a paragraph or a video), often by predicting the next word in a sentence, much like autocomplete in Google, or by favoring responses that appeal to a particular user. Users help train the chatbot by giving positive or negative feedback (the thumbs-up and thumbs-down buttons), which increases the frequency of "good" responses, or ones that receive a thumbs-up.
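The "autocomplete" idea described above can be illustrated with a toy sketch. This is only an illustration of the next-word-prediction task, using a made-up ten-word corpus; real models like GPT use neural networks with billions of parameters trained on enormous data sets, not simple word counts.

```python
from collections import Counter, defaultdict

# Toy illustration of next-word prediction: count which word most often
# follows each word in a tiny (invented) corpus, then "autocomplete" by
# suggesting the most frequent follower. The prediction task is the same
# one large models perform, but their method is vastly more sophisticated.
corpus = "the cat sat on the mat the cat ate the fish".split()

followers = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    followers[current_word][next_word] += 1

def predict_next(word):
    """Return the word most frequently seen after `word`, or None."""
    if word not in followers:
        return None
    return followers[word].most_common(1)[0][0]

print(predict_next("the"))  # prints "cat" (seen twice after "the")
```

Feedback buttons play a similar role at a much larger scale: thumbs-up responses are reinforced so the model produces them more often.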
However, these complex algorithms and data sets don't come out of thin air. They have to exist physically in order to be stored and used, just as the device you are holding right now is processing and storing data and algorithms. ChatGPT runs on huge computers called servers housed in massive facilities called data centers. Though different data centers draw power from different sources (coal, natural gas, hydroelectricity, etc.), they all require large amounts of energy, natural resources, and space, which makes them extremely expensive both economically and environmentally.
As the models improve, they use more and more electricity and resources, which translates to a hefty impact on carbon emissions. According to Boston University associate professor of computer science Kate Saenko via The Conversation, "creating the much larger GPT-3, which has 175 billion parameters, consumed 1,287 megawatt hours of electricity and generated 552 tons of carbon dioxide equivalent, the equivalent of 123 gasoline-powered passenger vehicles driven for one year." The centers also require a lot of water for cooling, which is especially worrisome for the many data centers located in drought-prone areas. "A mid-sized data center consumes around 300,000 gallons of water a day, or about as much as 1,000 U.S. households," Shehabi of Lawrence Berkeley National Laboratory told NPR.
This is a huge concern to Mr. Johnson, the Stone Ridge Technology teacher and specialist. “Water is probably the cheapest coolant solution, but I mean […] In my former career, I coordinated all the technology from DC public schools. One of the things that we learned is that you’ve never put your data or your network operating set close to water because water and electronics don’t mix. So that’s bad science right there.” He asserts that there are other accessible alternatives to cool data centers. “We have enough science out there to be able to cool with things…that can recirculate that are not water. The radiator in a car uses Ethylene glycol to cool the car. So we know how to do this.”
Though, according to TechTarget, the knowledge cutoff for ChatGPT is currently late 2023, OpenAI is constantly updating its models and developing new advancements. Significantly, AI technology is being introduced into search engines and integrated into high-traffic areas like shopping, banking, and, recently, even the US government. In most cases, the larger the AI model, the bigger its carbon footprint, so each update, and even each query, worsens the impact. According to The Association of Foreign Press Correspondents, cloud computing, or the use of these computing services over the internet (Microsoft), "currently accounts for about 0.5% of the world's total energy consumption, a figure expected to exceed 2% in the coming years."

With such rapid expansion, and at such high environmental cost, Mr. Johnson worries, too, that false or outdated information generated by AI poses another threat. “People aren’t looking at credible resources and verifying sources and getting back to science. [We need to be] getting back to those things that are scientifically proven over time.”
However, there are ways to develop more environmentally friendly models, and greener alternatives may already meet your AI needs. BLOOM, an open-access AI model similar in size to GPT-3 but with a starkly smaller footprint, consumed 433 megawatt-hours of electricity and generated 30 tons of carbon dioxide equivalent, stated Saenko. This shows that more earth-friendly, efficient modeling is possible at the same scale. Though one person choosing a greener AI bot may not make a big difference on its own, collectively the choice has a major effect on the environment. You can find some alternatives here.
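The training figures quoted in this article can be put side by side with a few lines of back-of-the-envelope arithmetic. This uses only the numbers reported above (Saenko via The Conversation) and compares tons of CO2-equivalent emitted per megawatt-hour of training electricity:

```python
# Rough comparison using only the figures quoted in this article.
# Units: tons of CO2-equivalent, megawatt-hours of electricity.
gpt3_tons, gpt3_mwh = 552, 1287      # GPT-3 training (Saenko)
bloom_tons, bloom_mwh = 30, 433      # BLOOM training (Saenko)

gpt3_intensity = gpt3_tons / gpt3_mwh     # ~0.43 tons CO2eq per MWh
bloom_intensity = bloom_tons / bloom_mwh  # ~0.07 tons CO2eq per MWh

print(f"GPT-3: {gpt3_intensity:.2f} t/MWh, BLOOM: {bloom_intensity:.2f} t/MWh")
print(f"Per unit of electricity, BLOOM's training emitted roughly "
      f"{gpt3_intensity / bloom_intensity:.0f}x less CO2")
```

By these reported numbers, BLOOM's training emitted roughly six times less carbon per megawatt-hour, a gap that largely reflects the cleaner energy sources powering its training.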
It is also crucial to continue improving AI models that already exist. Many AI companies do not reveal the sources powering or cooling their data centers, so societal pressure promoting transparency could encourage more reliance on clean energy. “By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available,” suggests Saenko, “emissions can be reduced by a factor of 30 to 40, compared to using a grid dominated by fossil fuels.”
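Saenko's suggestion of scheduling computation for greener hours can be sketched in a few lines. The hourly intensity numbers below are invented for illustration, not real grid data; a real scheduler would query live figures from a regional grid operator:

```python
# Sketch of carbon-aware scheduling: run a deferrable batch job at the
# hour when the grid's forecast carbon intensity (grams of CO2 per kWh)
# is lowest. These hourly values are made up for illustration only.
hourly_intensity = {
    0: 480,   # overnight: mostly baseload fossil generation
    6: 410,
    12: 210,  # midday: solar pushes intensity down
    18: 520,  # evening peak: fossil plants ramp up
}

def greenest_hour(intensity_by_hour):
    """Pick the hour with the lowest forecast carbon intensity."""
    return min(intensity_by_hour, key=intensity_by_hour.get)

best = greenest_hour(hourly_intensity)
print(f"Schedule the job for hour {best} "
      f"({hourly_intensity[best]} gCO2/kWh)")  # hour 12 in this sketch
```

Shifting flexible workloads this way, or routing them to regions with abundant renewables, is how the 30- to 40-fold reductions Saenko describes become possible.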
There are many benefits and limitations to AI like ChatGPT, but one of its most severe and urgent drawbacks is its sizable impact on the environment. As AI becomes progressively ingrained in business and daily life, the need for greener models and data centers powered by renewable resources grows increasingly pressing. In the meantime, "I really want everybody to be wise consumers of this technology," urges Mr. Johnson.