What's actually true about AI data centers?
AI data centers use real electricity and water, but the useful debate depends on location, cooling, grid impact, waste heat, and who benefits.
I hear the same worries about AI data centers a lot now.
They use too much electricity. They use too much water. They strain the grid. They are bad for the planet. Sometimes the claims are exaggerated. Sometimes they are directionally right. Either way, the conversation often jumps from "this uses resources" to "we should stop it."
I think that skips the useful part.
AI data centers do use a lot of electricity. They can use water. They can strain local grids, trigger fights over utility rates, make noise, rely on diesel backup generators, and compete for land and infrastructure. Those are real issues.
They are also engineering, policy, and incentive problems. If AI is a major economic opportunity, and I think it is, then the useful response is not to wave the costs away or treat every cost as a reason to stop building. It is to ask better questions about location, power, cooling, water, grid upgrades, waste heat, public benefit, and who pays.
The debate gets better when it gets more specific.
The scale is big, but not in the way people usually imply
Globally, data centers are not yet eating the grid. The International Energy Agency estimates that data centers used about 1.5% of global electricity in 2024. By 2030, it projects that share could roughly double to around 3%.
The US is moving faster. An LBNL/DOE report estimated data centers used about 4.4% of US electricity in 2023, with scenarios reaching 6.7% to 12.0% by 2028.
Those are big numbers. But the global average still hides the more important problem: local concentration.
In Ireland, for example, the Commission for Regulation of Utilities says data centers went from 5% of electricity use in 2015 to 22% in 2024, with a forecast of 31% by 2034. That is a very different story from 1.5% globally.
Local context matters.
The water question is messier than it sounds
Cooling can use water, but not every data center uses water the same way.
Some use evaporative cooling, where water is consumed as part of the cooling process. Some use air cooling. Some use liquid cooling in closed loops. Some use reclaimed or non-potable water. Some use very little direct cooling water but still cause indirect water use through the power plants that generate their electricity.
So "how much water does a data center use?" has no single answer: the cooling design largely determines the water footprint.
Microsoft, for example, announced a next-generation data center design that uses chip-level liquid cooling and a closed-loop system. The company says the system consumes zero water for cooling once filled, though water is still used for normal building needs like restrooms and kitchens. Microsoft also says the design avoids more than 125 million litres of water per year per data center, based on its FY2024 global average water usage effectiveness of 0.30 L/kWh.
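The stated figures imply a rough scale for the facility itself. This is back-of-envelope arithmetic on the numbers above; the implied energy throughput is an inference, not a Microsoft figure:

```python
# Rough sanity check on the stated figures: ~125 million litres/year of
# avoided water at a water usage effectiveness (WUE) of 0.30 L/kWh.
# The implied energy use is an inference from those two numbers.

avoided_litres_per_year = 125_000_000   # L/year, stated by Microsoft
wue = 0.30                              # L/kWh, FY2024 global average WUE

implied_kwh_per_year = avoided_litres_per_year / wue
implied_avg_mw = implied_kwh_per_year / (365 * 24) / 1000  # kWh/year -> avg MW

print(f"Implied energy: {implied_kwh_per_year / 1e9:.2f} TWh/year")  # ~0.42
print(f"Implied average load: {implied_avg_mw:.0f} MW")              # ~48
```

In other words, the avoided-water claim is consistent with a facility drawing a few tens of megawatts on average, which is the right order of magnitude for a large AI data center.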
However, if a data center uses evaporative cooling in a water-stressed region, water may be a serious local issue. If it uses closed-loop cooling in a cool, water-rich region, direct cooling water may be much less important. If the electricity comes from thermoelectric power plants that use water, then indirect water still matters. If the facility uses reclaimed water, that is different again.
A litre in Quebec is not the same as a litre in Arizona. A withdrawal is not the same as consumption. Direct cooling water is not the same as water used somewhere else to generate electricity.
This is where a lot of online criticism goes wrong. It takes a real issue and strips away the local context that matters.
The economic upside is real
The tech industry sometimes talks about data centers as if they are pure abstraction: compute, cloud, AI capacity, digital infrastructure. Locally, they are very physical.
They require land, power, construction, electrical work, cooling systems, substations, fiber, security, maintenance, operations, and ongoing capital replacement. Many of the jobs are outside the usual AI story: construction crews, electricians, pipefitters, HVAC technicians, security staff, maintenance workers, project managers, and skilled trades. They can bring large construction projects, tax revenue, procurement, and a stronger base for local AI companies that need access to compute.
TD Economics notes that the US and China have captured most new global data center capacity and the economic gains that come with it. Its Canada-focused analysis points to Virginia as the mature example: the data center industry contributes US$9.1 billion to state GDP, supports about 74,000 jobs annually, and Loudoun County received US$895 million in data center tax revenue in FY2025, nearly covering its US$940 million operating budget.
Canada sees the opportunity too. The federal government has committed up to C$2 billion through its Sovereign AI Compute Strategy, including up to C$700 million to help mobilize private investment in new or expanded data centers. Ontario recently welcomed Microsoft's cloud and AI infrastructure expansion, part of a previously announced C$19 billion commitment to Canada, saying it would support 1,250 jobs and expand compute capacity. Alberta has gone further, publishing an AI Data Centres Strategy built around power capacity, sustainable cooling, and economic growth.
The upside is not imaginary.
But it is not automatic either.
A data center can be a tax base, a construction boom, a compute advantage, and a reason for clean-energy investment. It can also be a large private load that soaks up grid capacity, raises costs, annoys neighbours, and leaves fewer permanent jobs than people expected.
The hard part is deciding who pays, where the project connects, and what the community gets back.
How data centers can be less harmful, and maybe useful
Start with siting. A responsible data center should go where the grid can handle it, where new clean power can be built, where cooling is easier, where water stress is low, and where nearby users can take the waste heat. That means projects should be judged by more than land cost, tax incentives, and how quickly a developer can get connected. Cold places may have a waste-heat advantage. Hot, sunny places may have better solar potential. The right answer depends on the whole system, not one climate variable.
Cooling is the next big variable. In water-stressed areas, closed-loop or waterless cooling should become the default expectation for new AI facilities, even if it has some energy tradeoffs. In cooler regions, outside air and liquid cooling can reduce energy and water pressure. Where water is used, facilities should disclose the source, quality, and local scarcity context. Reclaimed water is different from drinking water. A closed-loop system is different from evaporative cooling. The public should not have to guess.
Some of the load can also be flexible. Training jobs, batch inference, model evaluation, indexing, rendering, and other non-real-time workloads could be shifted away from peak grid stress if the software and contracts are designed for it. A data center with batteries, flexible workloads, and grid coordination can be less of a dumb constant load and more of a participant in the energy system.
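The scheduling idea can be sketched in a few lines. This is a toy illustration, not any real scheduler; the `grid_stress` signal is a hypothetical hourly input, such as one derived from prices or a utility demand-response program:

```python
# Illustrative sketch: defer non-urgent batch jobs away from grid-stress hours.

def schedule(jobs, grid_stress, capacity_per_hour=2, stress_threshold=0.7):
    """Greedily place each job into the least-stressed hour before its deadline."""
    placement = {}
    load = [0] * len(grid_stress)
    for job in sorted(jobs, key=lambda j: j["deadline_hour"]):
        hours = range(job["deadline_hour"])
        # Prefer hours below the stress threshold with spare capacity;
        # fall back to any pre-deadline hour with capacity if none qualify.
        ok = [h for h in hours
              if grid_stress[h] < stress_threshold and load[h] < capacity_per_hour]
        candidates = ok or [h for h in hours if load[h] < capacity_per_hour]
        best = min(candidates, key=lambda h: grid_stress[h])
        placement[job["name"]] = best
        load[best] += 1
    return placement

# Hypothetical day: grid stress peaks in the evening (hours 17-21).
stress = [0.3] * 8 + [0.5] * 9 + [0.9] * 5 + [0.4] * 2
jobs = [
    {"name": "nightly-training", "deadline_hour": 24},
    {"name": "batch-inference", "deadline_hour": 20},
    {"name": "index-rebuild", "deadline_hour": 24},
]
plan = schedule(jobs, stress)
print(plan)  # all three jobs land outside the evening peak
```

The point is not the algorithm, which is deliberately naive, but that deadlines and a stress signal are enough information to move flexible load off the peak, if contracts and software expose them.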
The power standard should be stricter than annual renewable-energy matching. A company can buy enough credits over a year and still worsen local fossil generation during peak hours. A better standard is whether the load caused new clean generation or storage to be built, in the region where it is used, at the times it is needed.
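The gap between the two standards is easy to show with numbers. This toy example uses made-up figures and a simplified four-hour "day", but the pattern is the real one: a solar-heavy portfolio can match 100% of load annually while covering none of it at night:

```python
# Toy illustration of why annual renewable matching can overstate cleanliness.
# All numbers are invented for the example.

load  = [100, 100, 100, 100]   # MWh consumed each hour
clean = [250, 150,   0,   0]   # MWh of contracted clean generation each hour
                               # (e.g. solar: abundant midday, zero at night)

annual_match = min(1.0, sum(clean) / sum(load))
hourly_match = sum(min(l, c) for l, c in zip(load, clean)) / sum(load)

print(f"Annual matching: {annual_match:.0%}")  # 100% -> looks fully clean
print(f"Hourly matching: {hourly_match:.0%}")  # 50% -> half the load ran on the grid mix
```

Under annual accounting the surplus midday generation papers over the empty night hours; hourly accounting counts only clean energy delivered when the load actually ran.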
There are better deal structures available too. A data center company could pay for the local upgrades it triggers. It could build new power capacity and share some of the benefit with the grid. It could help fund transmission, storage, substations, or demand-response programs. In the best case, a large customer should make the local energy system stronger instead of reserving capacity and leaving residents with the bill.
Then there is heat. Data centers turn electricity into heat, and most facilities treat that heat as a problem to remove. In the right place, it can become useful.
This works best where there is a district heating network or nearby heat demand. In Finland, Fortum's heat recovery project with Microsoft's data centers in Espoo and Kirkkonummi is expected to supply roughly 40% of district heating capacity in those areas. The basic idea is simple: recover waste heat from server halls, upgrade it with heat pumps, and feed it into the local heating network.
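The energy balance behind heat recovery is worth seeing in numbers. These are illustrative figures, not Fortum or Microsoft data: nearly all IT electricity leaves the building as low-grade heat, and a heat pump spends some extra electricity to lift it to district-heating temperature:

```python
# Back-of-envelope heat-recovery arithmetic (assumed numbers, not project data).

it_load_mw = 50       # assumed IT load; ~all of it ends up as waste heat
heat_pump_cop = 3.0   # assumed coefficient of performance for the temperature lift

# COP = heat delivered / electricity input, and the heat delivered is the
# recovered waste heat plus the pump's own electricity, so for every
# COP - 1 units of waste heat moved, the pump adds 1 unit of electricity.
recovered_heat_mw = it_load_mw
pump_electricity_mw = recovered_heat_mw / (heat_pump_cop - 1)
delivered_heat_mw = recovered_heat_mw + pump_electricity_mw

print(f"Waste heat recovered: {recovered_heat_mw} MW")
print(f"Heat pump input:      {pump_electricity_mw:.0f} MW")   # 25 MW
print(f"Heat delivered:       {delivered_heat_mw:.0f} MW")     # 75 MW
```

Under these assumptions, a 50 MW facility plus 25 MW of heat-pump electricity yields 75 MW of district heat: the waste stream does most of the work, which is why the economics can pencil out next to a heating network.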
This is the part that is easiest to miss in the abstract. The useful object is not the data center by itself. It is the data center plus the heating network beside it.

Fortum and Microsoft also have a short project video that shows the same idea at the city-heating level.
That will not work everywhere. A data center in a warm region without district heating may have little use for low-grade heat. But in cold climates with dense communities, campuses, greenhouses, industrial users, or district heating systems, waste heat can turn a liability into part of the local energy system.
Canada should be unusually interested in this
Canada has a real AI data center opportunity, but it is easy to tell the wrong story about it.
Canada is cold, has lots of land, and has clean power in some provinces. That is a start, not a strategy.
Canada has advantages: cool climate, hydro in some provinces, nuclear in Ontario, land, political stability, proximity to the US, strong AI research, and a need for more domestic compute. Canada also has constraints: grid capacity, transmission timelines, provincial differences in electricity emissions, Indigenous and municipal consultation, water and land-use concerns, and the risk that ratepayers subsidize private infrastructure if policy is sloppy.
TD Economics is blunt on this point: Canada can compete if it figures out how to connect new data centers to power in a timely way, but public support depends on making sure industry pays the cost of integrating these loads rather than pushing it onto households.
We should not reject AI data centers because they use resources. Everything useful does. We should also not wave them through because AI is strategically important. The standard should be higher than that.
For Canada, the best version would look something like this:
- Build in provinces and regions where power is clean or getting cleaner.
- Make the data center pay for the grid upgrades it needs.
- Use cooling systems suited to local water conditions, preferring closed-loop or waterless designs in water-sensitive regions.
- Make workloads flexible where possible.
- Require transparent reporting on power, water, emissions, and backup generators.
- Reuse heat where there is real demand.
- Build new power where needed and share some of the benefit locally.
- Tie public incentives to local jobs, tax base, Indigenous partnerships, community benefit, and domestic compute access.
That is a lot harder than arguing on the internet about whether AI uses too much water.
It is also much more useful.
A simple scorecard
If a new AI data center is proposed, these are the questions I would want answered before having a strong opinion:
- How much power will it use, and when?
- What generation turns on because of this new load?
- Does it bring new clean power or just claim existing clean supply?
- What grid upgrades are needed, and who pays?
- What cooling system does it use?
- How much water is consumed, from what source, and in what watershed?
- Can non-urgent workloads shift during grid stress?
- What backup generation is used?
- Can waste heat be reused nearby?
- What does the community get besides construction traffic and press releases?
A data center should not be judged only by whether it hosts AI. It should be judged like infrastructure: what does it cost, what does it enable, who benefits, who pays, and what could have been designed better?
The debate gets more useful when we judge AI data centers as specific infrastructure projects, not as one giant category.
A responsible facility in a cold region with clean power, closed-loop cooling, flexible workloads, heat reuse, transparent reporting, and local benefits is a different thing from a rushed facility that strains a fossil-heavy grid, consumes scarce water, raises local rates, and hides behind vague sustainability claims. Same category. Very different reality.
That is the rational optimist version of this argument. Do not pretend the costs are fake. Do not treat every cost as a reason to quit. Put the problems on the table, design around them, and make companies prove the project works for the grid, the water system, and the community around it.
AI infrastructure is probably coming either way. Canada can work toward projects that bring economic benefit while improving power, cooling, heat reuse, and accountability, or watch that investment go somewhere else. The better response is not reflexive approval or reflexive opposition. It is to get serious about what good data centers should look like, then make companies meet that bar.
