Current AI technology requires an immense amount of natural resources to process, maintain, and sustain its high computational needs. Data centers support the training and maintenance of AI models through facility and IT infrastructure as well as access to powerful energy sources (hydro, nuclear, wind, etc.). These centers also require additional maintenance and assets, such as cooling towers and power grid connections. Because these environmental resources are so closely tied to AI model investments, existing concerns over greenhouse gas (GHG) emissions, water usage, and land equity have grown within the larger discussion of this technology’s impact and ethical implications.
Environmental impact is difficult to report because each company, AI model, and version of that model can have its own specific energy demands. For example, in their 2021 study “Carbon Emissions and Large Neural Network Training,” researchers from Google and UC Berkeley estimated that training GPT-3 produced 552 tons of carbon dioxide equivalent. In its 2024 Environmental Report, Google reported that its “…total GHG emissions increased by 13%.” In its 2025 Environmental Report, the company shared that investments in clean energy projects raised its “carbon-free energy use from 64% to 66% on an hourly basis,” yet its emissions have risen by 51% since 2019, according to the Guardian article “Google’s emissions up 51% as AI electricity demand derails efforts to go green” (June 2025).
Data center reports can provide further information. The United States Environmental Protection Agency’s (EPA) Greenhouse Gas Reporting Program (GHGRP) requires a variety of facilities to submit these reports, although the program’s regulations and procedures are under review as of spring 2025. Nonetheless, a variety of research groups are investigating these concerns. For example, a joint study by researchers at Harvard University and the University of Pisa, “Environmental Burden of United States Data Centers in the Artificial Intelligence Era” (November 2024), found that “US data centers produced 105 million tons CO2e in the past year with a carbon intensity 48% higher than the national average.”
According to the Washington Post article “A bottle of water per email: the hidden environmental costs of using AI chatbots” (September 2024), which references research from the “Making AI Less ‘Thirsty’” study, using GPT-4 to generate a single 100-word email requires about 519 milliliters of water, slightly more than one standard bottle. Water usage also depends on a data center’s location: according to the article’s findings, generating the same 100-word email with a GPT-4 chatbot in Illinois uses about 464 milliliters of water. The growing interest in data centers has also prompted a search for land that can sustain these facilities; for example, the United States Department of Energy (DOE) has explored federal sites to support AI initiatives (DOE April 2025 press release). Rising energy demands can strain power grids, and data centers can affect local economies through higher utility rates and employment booms, as explored in a January 2025 report from Axios San Francisco.
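To put the per-email water figures above in perspective, the short Python sketch below scales them to larger numbers of queries. It is only a rough, back-of-the-envelope illustration: the 519 mL and 464 mL values are the figures cited in the article, while the function name and example email counts are hypothetical.

```python
# Back-of-the-envelope sketch scaling the per-email water figures cited above.
# The 0.519 L (overall estimate) and 0.464 L (Illinois) values come from the
# Washington Post / "Making AI Less 'Thirsty'" figures quoted in this guide;
# the function name and example query counts are illustrative only.

WATER_PER_EMAIL_L = {
    "overall estimate": 0.519,  # liters per 100-word GPT-4 email
    "Illinois": 0.464,
}

def estimated_water_liters(num_emails: int, location: str = "overall estimate") -> float:
    """Rough water-use estimate for a number of 100-word AI-generated emails."""
    return num_emails * WATER_PER_EMAIL_L[location]

if __name__ == "__main__":
    # Example: one email per week for a year (52 emails) across 1,000 users.
    for location in WATER_PER_EMAIL_L:
        liters = estimated_water_liters(52 * 1_000, location)
        print(f"{location}: ~{liters:,.0f} liters for 52,000 emails")
```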
This guide addresses only a few of these natural resources, but it is important to recognize that other resources, such as electricity and precious metals, are also affected by this technology. Additionally, some industry leaders and researchers have suggested that AI itself, along with further investment in more advanced equipment, could gradually help address these concerns, as noted in a United Nations Educational, Scientific and Cultural Organization (UNESCO) report (February 2024). Although these issues are not unique to this technology, the speed of deployment and demand, combined with the other ethical concerns, have made environmental impact a leading point of discussion.