Confronting the Environmental Costs of Large Language Models
As the world grapples with the existential threat of climate change, the emergence of generative artificial intelligence (AI) has ignited both excitement and concern. While proponents laud the potential of these powerful technologies to revolutionize industries and unlock new frontiers, a growing chorus of critics is sounding the alarm about the environmental toll exacted by large language models (LLMs) and other resource-intensive AI systems.
Through a comprehensive review of the available evidence, this article argues that the machine learning community must fundamentally reframe the scope of its research and development to prioritize carbon emissions, water usage, and other environmental factors across the entire AI lifecycle. Amid the hype surrounding generative AI, these crucial considerations have largely been overlooked, leaving the field vulnerable to the same pitfalls that have plagued the cryptocurrency industry.
Lessons from the Crypto Reckoning
Recall the frenzy of interest in blockchain technologies, from Bitcoin to metaverse applications. While proponents extolled these as paradigm-shifting innovations, critics quickly raised concerns about their outsized carbon footprint. As enthusiasm for crypto-based assets like non-fungible tokens (NFTs) accelerated, a vocal contingent took a stand against their environmental impact, revealing a moral hierarchy of energy uses.
Even as the crypto market has cooled, the hardware and infrastructures associated with cryptocurrency mining enterprises remain linked to rising emissions (Neumueller, 2022). This indicates that short-sighted, venture capital-backed speculative investments can have long-term repercussions for the planet.
Notably, the same level of dialogue about the ecological impact of computational approaches has not accompanied our collective embrace of generative AI, despite striking similarities in environmental toll. While researchers have consistently called out the various impacts of generative AI, and some policy measures have been introduced to regulate its growing footprint, there is still a prevailing hope that AI itself may offer climate solutions (Warso & Shrishak, 2024). Critics argue, however, that these policies rely too heavily on voluntary compliance and fall short of the meaningful change required to make AI less of an environmental threat (Crawford, 2024).
The Hidden Costs of Generative AI
LLMs have ignited widespread interest in AI due to their versatile capabilities across a range of applications. However, their production and deployment come at a significant cost in terms of carbon dioxide emissions and reliance on other vital resources, including water and land.
LLMs are not unique in their environmental impact; all information and communication technologies (ICTs) leave an indelible mark on the planet, from the extraction of metals for hardware to the water consumed by data centers and the electricity required to power an increasingly computerized world. But like cryptocurrency, LLMs are so computationally intensive that they accelerate the depletion of resources at a critical time.
Rather than being treated as part of the frontier of machine learning innovation, carbon impact is often set aside as out of scope. For instance, prominent AI figure Geoffrey Hinton recently argued that AI poses a more pressing existential threat to humanity than climate change, in part because “we know how to solve the problem of climate change” (Coulter, 2023). This framing positions the research agenda as orthogonal to the very climate impacts that make AI a source of risk to our collective survival (Kneese, 2024).
Uncovering the Environmental Toll of LLMs
Researchers have made significant strides in quantifying the environmental costs of LLMs, though much remains to be understood. A key finding is that integrating LLMs into search engines may increase the carbon footprint of a single internet search by as much as fivefold (Stokel-Walker, 2023). When scaled across global usage, the impact on overall carbon emissions could be devastating.
The computational demands of LLMs carry steep financial and environmental costs. Strubell et al. (2020) found that training the Transformer-based BERT model on GPUs has a carbon impact “roughly equivalent to a trans-American flight,” with these costs estimated to rise by an order of magnitude or more once model tuning and retraining are taken into account.
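To make such figures concrete, the following back-of-envelope sketch follows the standard operational accounting behind estimates of this kind: energy is GPU-hours multiplied by average power draw and data-center PUE, and emissions are energy multiplied by grid carbon intensity. Every number below is an illustrative placeholder, not a value taken from Strubell et al. (2020) or any other cited study.

```python
# Back-of-envelope estimate of operational training emissions.
# All inputs are illustrative placeholders, not figures from the cited studies.

def training_co2e_kg(gpu_hours: float,
                     avg_gpu_power_kw: float,
                     pue: float,
                     grid_kg_co2e_per_kwh: float) -> float:
    """Estimate operational CO2-equivalent emissions for one training run.

    energy (kWh) = GPU-hours x average draw per GPU (kW) x data-center PUE
    emissions    = energy (kWh) x grid carbon intensity (kg CO2e / kWh)
    """
    energy_kwh = gpu_hours * avg_gpu_power_kw * pue
    return energy_kwh * grid_kg_co2e_per_kwh


if __name__ == "__main__":
    # Hypothetical run: 64 GPUs for 80 hours, drawing ~0.3 kW each,
    # in a facility with a PUE of 1.5 on a grid averaging 0.4 kg CO2e/kWh.
    kg = training_co2e_kg(gpu_hours=64 * 80,
                          avg_gpu_power_kw=0.3,
                          pue=1.5,
                          grid_kg_co2e_per_kwh=0.4)
    print(f"Roughly {kg:.0f} kg CO2e for this hypothetical run")
```

Because hyperparameter sweeps, tuning, and retraining repeat this calculation many times over, the totals climb quickly, which is where the order-of-magnitude increases described above come from.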
While some researchers argue that the implementation of best practices, such as selecting more efficient models, will eventually lead to a plateauing of the carbon emissions tied to LLM training (Patterson et al., 2022), these findings do not account for the broader environmental footprint connected to data center construction and the entire AI lifecycle. Furthermore, adherence to best practices often depends on organizational and social factors, including the priorities of developers, managers, and the C-suite.
Strategies for Sustainable and Equitable AI
The emerging field of “green AI” examines strategies for not only measuring but also mitigating the climate impacts of AI systems. These include:
Tracking emissions across the AI lifecycle
Researchers have called for a comprehensive examination of the lifecycle impacts of LLMs, including the emissions tied to the manufacturing of the equipment used (Luccioni et al., 2023). This holistic approach is crucial, as the true environmental impact of AI production and use is connected to the larger supply chains and poor working conditions of the entire ICT industry.
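As a starting point for this kind of measurement, the minimal sketch below logs the operational emissions of a single workload using the open-source codecarbon package; this is one readily available tool rather than the methodology of Luccioni et al. (2023), and it covers only the use phase, so embodied emissions from hardware manufacturing still have to be accounted for separately.

```python
# Minimal operational-emissions logging with the open-source codecarbon
# package (pip install codecarbon). This captures only the use phase;
# embodied (manufacturing) emissions must be added from other sources.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="llm-finetune-demo")
tracker.start()
try:
    # ... training or inference workload goes here ...
    pass
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2e for the tracked span
    print(f"Estimated operational emissions: {emissions_kg:.4f} kg CO2e")
```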
Implementing carbon-aware software
Developers can ensure that machine learning training occurs at times and in regions where renewable energy is more readily available on the grid, considering the carbon intensity of the power sources (Dodge et al., 2022). However, developers do not always have full control over their working conditions, and managers may not prioritize green AI practices.
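A minimal sketch of such carbon-aware scheduling is shown below: a job waits until the local grid's carbon intensity falls below a threshold before launching. The intensity lookup is a hypothetical stand-in for a real data source such as WattTime or Electricity Maps, and the threshold is illustrative.

```python
# Carbon-aware scheduling sketch: delay a job until the grid is cleaner.
import time

CARBON_THRESHOLD_G_PER_KWH = 200   # illustrative cutoff, not a recommendation
POLL_INTERVAL_S = 15 * 60          # re-check the grid every 15 minutes


def get_grid_intensity(region: str) -> float:
    """Hypothetical stand-in: current grid carbon intensity in gCO2e/kWh.

    Replace with a call to a real carbon-intensity service
    (e.g., WattTime or Electricity Maps).
    """
    return 180.0  # stubbed value so the sketch runs end to end


def run_when_grid_is_clean(region: str, train_fn) -> None:
    """Block until the grid is below the threshold, then start training."""
    while get_grid_intensity(region) > CARBON_THRESHOLD_G_PER_KWH:
        time.sleep(POLL_INTERVAL_S)
    train_fn()


if __name__ == "__main__":
    run_when_grid_is_clean("us-west", train_fn=lambda: print("training..."))
```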
Balancing carbon and water costs
Along with energy, LLMs require massive amounts of water (Li, Yang, Islam, & Ren, 2023). Optimizing for reduced carbon emissions may inadvertently exacerbate water usage. Researchers call for a more holistic approach that considers both the carbon and water footprints of AI models, with a focus on ensuring environmental equity across different geographic regions.
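One way to picture that balancing act is the sketch below, which scores candidate data-center regions on both carbon and water intensity rather than carbon alone. The regions, figures, weights, and scaling are invented for illustration and are not drawn from Li, Yang, Islam, and Ren (2023).

```python
# Toy region-selection sketch that weighs water use alongside carbon.
# All values are invented for illustration.
from dataclasses import dataclass


@dataclass
class Region:
    name: str
    carbon_g_per_kwh: float   # grid carbon intensity (g CO2e/kWh)
    water_l_per_kwh: float    # combined on-site and off-site water (L/kWh)


def score(region: Region, carbon_weight: float = 0.5) -> float:
    """Lower is better; the weight trades carbon against water.

    The factor of 100 is an arbitrary normalization so the two quantities
    sit on comparable scales in this toy example.
    """
    return (carbon_weight * region.carbon_g_per_kwh
            + (1 - carbon_weight) * region.water_l_per_kwh * 100)


candidates = [
    Region("region-a", carbon_g_per_kwh=150, water_l_per_kwh=3.5),
    Region("region-b", carbon_g_per_kwh=400, water_l_per_kwh=1.2),
]
best = min(candidates, key=score)
print(f"Preferred region under this weighting: {best.name}")
```

Shifting the weight toward water flips the choice of region, which is exactly the kind of trade-off between carbon and water footprints described above.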
Auditing systems for environmental justice impact
Environmental justice and equity should be central to assessing the sociotechnical and environmental impacts of AI. Researchers argue for the involvement of civil society actors, grassroots movements, and local communities to understand how algorithmic systems disrupt material resource flows and affect livelihoods and ecosystem resilience (Rakova & Dobbe, 2023).
Beyond Hype and Speculation
Much of the current discourse around generative AI focuses on speculative futures, foregrounding either potential existential risks or prospective sites of financial investment. However, with an eye toward the recent rise and fall of crypto, technologists, researchers, and advocates must attend to the very real and already existing climate impacts of LLMs and other AI technologies.
The hype around generative AI must be tempered by a sober reckoning with its environmental costs. Just as the crypto industry was forced to confront its outsized carbon footprint, the machine learning community must incorporate carbon emissions, water usage, and other environmental factors as core design considerations. Responsibility in this regard lies with developers, designers, researchers, advocates, and policymakers to weigh the true costs and benefits of generative AI and to fully understand its repercussions for the planet and its people.
Conclusion: A Call to Action
The surge of interest in generative AI presents both opportunities and risks. As the field continues to evolve, it is crucial that the machine learning community take meaningful steps to address the environmental toll of large language models and other resource-intensive AI systems. This requires a fundamental shift in the scope of research and development, moving beyond narrow performance metrics to prioritize carbon emissions, water usage, and environmental justice across the entire AI lifecycle.
By embracing a more holistic, sustainable, and equitable approach, the machine learning community can ensure that the promise of generative AI is not overshadowed by its ecological costs. The time for action is now, as the impacts of climate change grow increasingly severe. Only by confronting the environmental realities of LLMs can the field of AI truly fulfill its potential to benefit humanity and the planet.
References
Chien, A. A., Lin, L., Nguyen, H., Rao, V., Sharma, T., & Wijayawardana, R. (2023). Reducing the carbon impact of generative AI inference (today and in 2035). In G. Porter & T. Anderson (Eds.), HotCarbon ’23: Proceedings of the 2nd Workshop on Sustainable Computer Systems (Article 11). ACM. https://doi.org/10.1145/3604930.3605705
Coulter, M. (2023, May 8). AI pioneer says its threat to world may be more urgent than climate change. Reuters. https://www.reuters.com/technology/ai-pioneer-says-its-threat-world-may-be-more-urgent-than-climate-change-2023-05-05/
Crawford, K. (2024, February 20). Generative AI’s environmental costs are soaring – and mostly secret. Nature. https://www.nature.com/articles/d41586-024-00478-x
Dodge, J., Prewitt, T., Des Combes, R. T., Odmark, E., Schwartz, R., Strubell, E., Luccioni, A. S., Smith, N. A., DeCario, N., & Buchanan, W. (2022). Measuring the carbon intensity of AI in cloud instances. In FAccT ’22: Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (pp. 1877–1894). ACM. https://doi.org/10.1145/3531146.3533234
Kneese, T. (2024, February 12). Measuring AI’s environmental impacts requires empirical research and standards. Tech Policy Press. https://www.techpolicy.press/measuring-ais-environmental-impacts-requires-empirical-research-and-standards/
Li, P., Yang, J., Islam, M. A., & Ren, S. (2023). Making AI less ‘thirsty’: Uncovering and addressing the secret water footprint of AI models. ArXiv. https://doi.org/10.48550/arXiv.2304.03271
Li, P., Yang, J., Wierman, A., & Ren, S. (2023). Towards environmentally equitable AI via geographical load balancing. ArXiv. https://doi.org/10.48550/arXiv.2307.05494
Luccioni, S., Jernite, Y., & Strubell, E. (2024). Power hungry processing: Watts driving the cost of AI deployment? In FAccT ’24: Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency (pp. 85–99). ACM. https://doi.org/10.1145/3630106.3658542
Luccioni, S., Viguier, S., & Ligozat, A. (2023). Estimating the carbon footprint of BLOOM, a 176B parameter language model. Journal of Machine Learning Research, 24(1), Article 253.
Neumueller, A. (2022, September 27). A deep dive into Bitcoin’s environmental impact. University of Cambridge Judge Business School. https://www.jbs.cam.ac.uk/2022/a-deep-dive-into-bitcoins-environmental-impact/
Patterson, D., Gonzalez, J., Hölzle, U., Le, Q., Liang, C., Munguia, L. M., Rothchild, D., So, D. R., Texier, M., & Dean, J. (2022). The carbon footprint of machine learning training will plateau, then shrink. Computer, 55(7), 18–28. https://doi.org/10.1109/MC.2022.3148714
Rakova, B., & Dobbe, R. (2023). Algorithms as social-ecological-technological systems: An environmental justice lens on algorithmic audit. In FAccT ’23: Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency (p. 491). ACM. https://doi.org/10.1145/3593013.3594014
Stokel-Walker, C. (2023, February 18). The generative AI race has a dirty secret. Wired. https://www.wired.com/story/the-generative-ai-search-race-has-a-dirty-secret/
Strubell, E., Ganesh, A., & McCallum, A. (2020). Energy and policy considerations for modern deep learning research. Proceedings of the AAAI Conference on Artificial Intelligence, 34(9), 13693–13696. https://doi.org/10.1609/aaai.v34i09.7123
Warso, Z., & Shrishak, K. (2024, May 21). Hope: The AI Act’s approach to address the environmental impact of AI. Tech Policy Press. https://www.techpolicy.press/hope-the-ai-acts-approach-to-address-the-environmental-impact-of-ai/