
Artificial Intelligence Becomes a Major Consumer of Energy and Water: How the Growth of Neural Networks Affects Climate, and What Risks and Opportunities It Creates for Investors and the Global Economy
The Rapid Growth of AI and Energy Appetite
The demand for computing power for AI has skyrocketed in recent years. Since the public launch of neural-network services such as ChatGPT in late 2022, businesses worldwide have accelerated the integration of AI models, which requires processing vast amounts of data. Industry estimates suggest that by 2024, AI could account for around 15–20% of total energy consumption in data centers globally. The power required to run AI systems may reach 23 GW by 2025, comparable to the total electricity consumption of a country like the United Kingdom. For comparison, this figure surpasses the energy consumption of the entire Bitcoin mining network, indicating that AI has become one of the most energy-intensive types of computation.
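To make the 23 GW figure more tangible, a back-of-envelope conversion into annual energy helps. The sketch below is illustrative only: it assumes the 23 GW load runs continuously all year, and the UK reference value of roughly 300 TWh per year is an approximate figure chosen here for comparison, not one taken from the article's sources.

```python
# Back-of-envelope: what a continuous 23 GW draw means in annual energy terms.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

ai_power_gw = 23.0                                    # projected AI power demand (from the article)
ai_energy_twh = ai_power_gw * HOURS_PER_YEAR / 1000   # GW * h = GWh; divide by 1,000 for TWh

# Rough reference point, used here for illustration only.
uk_electricity_twh = 300.0                            # approximate annual UK electricity consumption

print(f"23 GW running year-round is about {ai_energy_twh:.0f} TWh per year")
print(f"That is roughly {ai_energy_twh / uk_electricity_twh:.0%} of the assumed UK annual consumption")
```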
This explosive growth is driven by heavy investment in infrastructure by tech companies: new data centers open practically every week, and specialized machine-learning chips are launched every few months. Expanding this infrastructure directly increases electricity consumption, which is needed to power and cool the thousands of servers behind modern neural networks.
Emissions at the Level of a Metropolis
Such high energy consumption inevitably leads to significant greenhouse gas emissions as long as part of that electricity comes from fossil fuels. According to a recent study, AI could be responsible for 32–80 million metric tons of CO2 emissions annually by 2025. This puts the carbon footprint of AI on the level of an entire metropolis: New York's annual emissions, for example, are approximately 50 million tons of CO2. For the first time, a technology that seemed purely digital is demonstrating a climate impact comparable to that of large industrial sectors.
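Where such an emissions range can come from is easier to see with a rough calculation: emissions are essentially the electricity consumed multiplied by the carbon intensity of the grid supplying it. In the sketch below, the energy figure corresponds to the 23 GW estimate above (about 200 TWh per year), and both carbon-intensity values are assumptions chosen purely for illustration, not figures from the cited study.

```python
# Rough emissions sketch: CO2 = electricity consumed x carbon intensity of the grid.
ai_energy_twh = 200.0                       # approximate annual AI electricity use (from the 23 GW estimate)

carbon_intensity_kg_per_kwh = {
    "relatively clean grid": 0.20,          # assumed kg CO2 per kWh
    "fossil-heavy average grid": 0.40,      # assumed kg CO2 per kWh
}

for grid, kg_per_kwh in carbon_intensity_kg_per_kwh.items():
    kwh = ai_energy_twh * 1e9               # 1 TWh = 1e9 kWh
    million_tons_co2 = kwh * kg_per_kwh / 1000 / 1e6
    print(f"{grid}: about {million_tons_co2:.0f} million metric tons of CO2 per year")
```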
It is important to note that these estimates are conservative. They mainly account for emissions from generating the electricity that powers the servers, while the full lifecycle of AI, from manufacturing the hardware (servers, chips) to its disposal, adds a further carbon footprint. If the AI boom continues at its current pace, the associated emissions will rise quickly. This complicates global efforts to reduce greenhouse gases and confronts tech companies with a difficult question: how to reconcile the explosive growth of AI with their commitments to achieve carbon neutrality.
The Water Footprint of Neural Networks
AI has another, less visible resource appetite: water. Data centers consume vast amounts of water to cool servers and equipment; evaporative cooling and air conditioning rely heavily on water resources. Beyond direct consumption, substantial volumes of water are needed indirectly, at power plants that cool turbines and reactors while generating the very electricity the computing clusters consume. Experts estimate that AI systems alone could consume between 312 and 765 billion liters of water by 2025, a volume comparable to the total amount of bottled water consumed by humanity in a year. Neural networks thus generate an enormous water footprint that, until recently, went largely unnoticed by the general public.
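The same rough arithmetic applies to water: multiply the electricity consumed by the liters of water used per kilowatt-hour, counting both on-site cooling and the water evaporated at power plants. The per-kWh values in the sketch below are assumptions chosen for illustration, not figures from the cited estimates.

```python
# Rough water sketch: liters = electricity consumed x (on-site + off-site) water per kWh.
ai_energy_twh = 200.0        # approximate annual AI electricity use (see the energy sketch above)
onsite_l_per_kwh = 0.5       # assumed cooling water used on site, liters per kWh
offsite_l_per_kwh = 2.0      # assumed water used at power plants, liters per kWh

kwh = ai_energy_twh * 1e9    # 1 TWh = 1e9 kWh
total_billion_liters = kwh * (onsite_l_per_kwh + offsite_l_per_kwh) / 1e9

print(f"Estimated water footprint: about {total_billion_liters:.0f} billion liters per year")
```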
Official estimates often do not reflect the full picture. For example, the International Energy Agency cited a figure of approximately 560 billion liters of water consumed by all data centers worldwide in 2023; however, this statistic did not include water used at power plants. The real water footprint of AI could therefore be several times higher than the formal estimates. Major industry players have been slow to disclose details: a recent report from Google on its AI system explicitly stated that it does not account for water consumption at third-party power plants. This approach has been criticized, since a significant share of the water is used precisely to generate the electricity that AI consumes.
Already, the scale of water consumption is raising concerns in several regions. In arid areas of the U.S. and Europe, communities are opposing the construction of new data centers, fearing that they will draw scarce water from local sources. Corporations themselves are noting the growing “thirst” of their server farms: Microsoft, for instance, reported that global water consumption by its data centers rose by 34% (to 6.4 billion liters) in 2022, largely because of increased loads related to training AI models. These facts underscore that the water factor is rapidly coming to the forefront in assessments of the environmental risks of digital infrastructure.
Lack of Transparency Among Tech Giants
Paradoxically, despite the scale of AI's impact, very little data on its energy and water consumption is publicly available. Large technology companies (Big Tech) typically report aggregate figures for emissions and resource use in their sustainability reports without separately disclosing the share attributable to AI. Detailed information about data center operations, such as how much energy or water goes specifically to neural network computations, usually stays inside the companies. Information about “indirect” consumption, for instance the water used to generate electricity for data centers, is virtually absent.
As a result, researchers and analysts must act like detectives, piecing together the picture from fragmentary data: snippets from corporate presentations, estimates of the number of AI server chips sold, data from energy companies, and other indirect indicators. Such opacity complicates understanding the full scale of AI's environmental footprint. Experts are calling for the introduction of strict disclosure standards: companies should report on the energy consumption and water usage of their data centers, broken down by key areas, including AI. Such transparency would enable society and investors to objectively assess the impact of new technologies and encourage the industry to seek ways to reduce environmental burdens.
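As a purely hypothetical illustration of what such disclosure could look like, the sketch below defines a per-facility reporting record with energy and water broken out for AI workloads. The field names, units, and structure are invented for this example; no such reporting standard exists in the sources discussed here.

```python
# A hypothetical per-facility disclosure record; field names and units are invented for illustration.
from dataclasses import dataclass

@dataclass
class DataCenterDisclosure:
    facility: str
    year: int
    total_energy_mwh: float          # total electricity consumed by the facility
    ai_energy_mwh: float             # portion attributable to AI workloads
    onsite_water_megaliters: float   # water used on site, mostly for cooling
    offsite_water_megaliters: float  # estimated water used to generate the purchased electricity
    renewable_share: float           # fraction of electricity from renewable sources

example = DataCenterDisclosure(
    facility="example-dc-1",
    year=2024,
    total_energy_mwh=500_000,
    ai_energy_mwh=150_000,
    onsite_water_megaliters=400,
    offsite_water_megaliters=900,
    renewable_share=0.6,
)
print(example)
```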
Impending Environmental Risks
If current trends persist, AI's growing “appetite” may exacerbate existing environmental problems. Tens of millions of additional tons of greenhouse gas emissions each year will make it harder to achieve the climate goals set out in the Paris Agreement. Hundreds of billions of liters of freshwater will be consumed amid a global freshwater deficit that is projected to reach 56% by 2030. In other words, without sustainable development measures, the expansion of AI risks colliding with the ecological limits of the planet.
If no changes are made, such trends could lead to the following negative consequences:
- Accelerated global warming due to increased greenhouse gas emissions.
- Exacerbation of freshwater shortages in already arid regions.
- Increased pressure on energy systems and socio-environmental conflicts surrounding limited resources.
Local communities and authorities are already beginning to respond to these challenges. In some countries, restrictions are being imposed on the construction of “energy-hungry” data centers, or operators are required to use water recycling systems and purchase renewable energy. Experts note that without radical changes, the AI industry risks turning from a purely digital business into a source of very tangible environmental crises, ranging from droughts to derailed climate plans.
Investor Perspective: The ESG Factor
The ecological aspects of AI's rapid development are becoming increasingly important to investors. In an era when ESG (environmental, social, and governance) principles are coming to the forefront, the carbon and water footprints of technologies directly affect company valuations. Investors are asking whether a “green” shift in policy could raise costs for companies investing in AI. For example, stricter carbon regulation or the introduction of water-use fees may increase expenses for companies whose neural network services consume large amounts of energy and water.
On the other hand, companies that invest now in mitigating the environmental impact of AI may gain a competitive advantage. Moving data centers to renewable energy, improving chips and software for greater energy efficiency, and implementing water reuse systems reduce risks and enhance reputation. The market values progress on sustainability: investors worldwide are increasingly including ecological metrics in their business valuation models. For technology leaders, the critical question is therefore how to keep scaling AI capabilities while meeting societal expectations for sustainability. Those who find a balance between innovation and responsible resource management will win in the long term, both in reputation and in business value.
The Path to Sustainable AI
Despite the magnitude of the problem, the industry has opportunities to steer the growth of AI towards sustainable development. Global tech firms and researchers are already working on solutions capable of reducing AI's environmental footprint without stifling innovation. Key strategies include:
- Improving the energy efficiency of models and equipment. Developing optimized algorithms and specialized chips (ASICs, TPUs, etc.) that perform machine learning tasks with lower energy consumption.
- Transitioning to clean energy sources. Powering data centers with electricity from renewable and other low-carbon sources (solar, wind, hydro, and nuclear power) sharply reduces the carbon emissions of AI operations. Many IT giants are already signing “green” contracts to procure clean energy for their needs.
- Reducing and recycling water consumption. Implementing new cooling systems (liquid, immersion) that require significantly less water, as well as reusing technical water. Selecting sites for data centers based on water availability: prioritizing regions with cooler climates or adequate water supplies. Research shows that a judicious choice of location and cooling technologies can reduce the water and carbon footprints of data centers by 70–85%.
- Transparency and accounting. Introducing mandatory monitoring and disclosure of energy consumption and water usage by AI infrastructure. Public accountability encourages companies to manage resources more effectively and enables investors to track progress in reducing ecological impact.
- Using AI for resource management. Paradoxically, AI itself can help solve the problem. Machine learning algorithms are already being used to optimize data center cooling, forecast loads, and schedule tasks so as to smooth peak demand on power grids and improve server utilization (a simplified scheduling sketch follows this list).
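As a simplified illustration of that last point, the sketch below shows one widely discussed idea, carbon-aware scheduling: a deferrable workload such as a training job is shifted into the hours when the grid's forecast carbon intensity is lowest. The forecast values and the function are hypothetical and do not describe any specific company's system.

```python
# Minimal sketch of carbon-aware scheduling: run a deferrable job (e.g., model training)
# during the hours when the grid's forecast carbon intensity is lowest.
# The forecast values below are invented for illustration.

def pick_greenest_hours(forecast_g_per_kwh: list[float], hours_needed: int) -> list[int]:
    """Return the indices of the lowest-carbon hours in the forecast, in chronological order."""
    ranked = sorted(range(len(forecast_g_per_kwh)), key=lambda h: forecast_g_per_kwh[h])
    return sorted(ranked[:hours_needed])

# Hypothetical 12-hour forecast of grid carbon intensity (grams of CO2 per kWh)
forecast = [420, 390, 350, 300, 260, 240, 250, 310, 380, 430, 460, 440]

hours = pick_greenest_hours(forecast, hours_needed=4)
avg = sum(forecast[h] for h in hours) / len(hours)
print("Run the deferrable job during hours:", hours)
print(f"Average carbon intensity in those hours: {avg:.0f} g CO2/kWh")
```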
The next few years will be crucial for building sustainability principles into the core of the rapidly growing AI sector. The industry stands at a crossroads: it can either proceed by inertia and risk running into environmental barriers, or turn the problem into an impetus for new technologies and business models. If transparency, innovation, and responsible resource management become integral to AI strategies, the “digital mind” can develop hand in hand with care for the planet. That balance is what investors and society as a whole expect from this new technological era.