Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the water cooling water stays in the computer, except in exceptional circumstances.
Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.
How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
The simple answer is that your A/C dumps heat outside using big metal fins. It’s not terribly efficient, but it works well at that scale.
Dissipating heat into the air doesn’t cut it for the amount some data centers need to get rid of, so they use evaporative coolers.
The phase change of water from liquid to gas absorbs approximately 7x more heat energy than it takes to bring room-temperature water all the way up to the boiling point in the first place.
Essentially they stick the large metal fins from the AC into a large pool of water and evaporate it off (it doesn’t literally boil, but same idea). This gets rid of the heat with a much smaller and cheaper system, but uses up water.
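The ~7x figure checks out against textbook constants, and the same numbers also land close to the liters-per-kWh figure from the top of the thread. A rough back-of-envelope sketch (the constants are standard physics values, not measurements from any particular data center):

```python
# Rough energy accounting for evaporative cooling, using textbook constants.
SPECIFIC_HEAT = 4.186e3  # J/(kg*K), liquid water
LATENT_HEAT = 2.26e6     # J/kg, heat of vaporization of water

# Heating water from ~20 C to 100 C vs. actually evaporating it:
sensible = SPECIFIC_HEAT * (100 - 20)   # ~0.33 MJ per kg
ratio = LATENT_HEAT / sensible
print(f"evaporating absorbs ~{ratio:.1f}x the heat of warming it to a boil")

# How much water would 1 kWh of waste heat evaporate?
KWH_IN_J = 3.6e6
liters_per_kwh = KWH_IN_J / LATENT_HEAT  # 1 kg of water is ~1 liter
print(f"~{liters_per_kwh:.1f} L evaporated per kWh of heat rejected")
```

That comes out around 1.6 L/kWh of pure evaporation, in the same ballpark as the 1.9 L/kWh figure; real cooling towers also dump some liquid “blowdown” water to flush out minerals, which pushes the total consumption higher.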
Edit: To clarify on the water in your home AC, the water is actually being collected INSIDE on the chilling unit as lowering the temperature decreases the ability for the air to hold water, it’s then being pumped outside. Data centers recirculate most of their air, letting in only a small amount of fresh air that they pre-chill, rather than letting fresh air in from the outside all the time like your home does.
Home hvac units don’t let in fresh air all of the time, they recycle air.
The HVAC system no, the home itself, yes.
Depending on how old your home is of course, newer homes tend to have lower exchange rates.
Also datacenters don’t have windows, or even doors constantly letting people in and out of cooled areas and outside.
My HVAC system has an inside and outside air source.
It’s used like sweating. We lose heat by having water evaporate off our skin. Right now, get warm water and put it on your arm, then blow on it. It stays cool until it is fully evaporated. For water to change from the liquid to the gaseous phase it needs energy. Think of water molecules as holding hands in a liquid. If one of them wants to come free and fly through the air, it needs to somehow get the energy to break free from the grip of the others first. When water evaporates from your arm it takes this energy in the form of heat: it absorbs heat and uses it to get to the gaseous phase. As long as there is water on your arm, it can be cooled that way.
That’s what data centers do as well. They take water to cool their processors and then let part of it evaporate into the air. That way the parts of the water that remain are like your arm: they get cool quickly.
It’s very effective. But if you live in a small town and next door there’s a massive datacenter that takes out all the groundwater and basically just boils it until it disappears, you might get angry after a while.
I once read about a data company that had their servers pretty far north, and it was cheaper to send people there to maintain them than it was to cool them. They basically used a ventilation system that took in the outside air.
What I don’t get is how the water is “consumed”, it’s not like it’s gone right? It evaporates and then just comes back down as rain surely?
Same with water consumption of a sweater or a steak.
There probably is some good reason for measuring it like that but conceptually I don’t get it.
Just cause you use water in one place doesn’t mean it’ll come back in the same place.
It does though doesn’t it
I mean eventually yeah, but not fast enough for you to keep using it that way.
Especially now that air holds more moisture since rising temperatures keep the atmosphere warmer and rain is less frequent.
To add to what others said, it’s a tradeoff.
Your gaming PC not only runs up your electric bill from the wall, but the AC as well. It has to work to get all that heat out.
This is the equivalent of water cooling your PC, and piping it to a hot tub outside. It would heat it and evaporate water faster, but it’s basically free and uses basically no electricity.
That’s the tradeoff. It’s water evaporation instead of heat pumps: spending water to save a lot of electricity, which in some cases is well worth it.
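To put rough numbers on that tradeoff, here’s a sketch comparing the two options for rejecting the same amount of heat. All the figures (chiller COP of 4, ~2% parasitic fan/pump power for the cooling tower) are ballpark assumptions for illustration, not specs from any real facility:

```python
# Ballpark comparison: mechanical chiller vs. evaporative cooling tower.
# All input figures are illustrative assumptions.
HEAT_KWH = 1000.0  # waste heat to reject, in kWh

# Option A: mechanical chiller (like a home AC), assumed COP ~ 4
chiller_cop = 4.0
chiller_electric_kwh = HEAT_KWH / chiller_cop

# Option B: evaporative cooling tower
LATENT_KWH_PER_L = 2.26e6 / 3.6e6        # ~0.63 kWh absorbed per liter evaporated
water_liters = HEAT_KWH / LATENT_KWH_PER_L
fan_pump_kwh = HEAT_KWH * 0.02           # assumed ~2% parasitic fan/pump power

print(f"chiller:     {chiller_electric_kwh:.0f} kWh of electricity")
print(f"evaporative: {fan_pump_kwh:.0f} kWh of electricity + {water_liters:.0f} L of water")
```

Under these assumptions the tower cuts the electricity bill by more than a factor of ten, at the cost of roughly 1,600 liters of water per megawatt-hour of heat, which is exactly the trade being described.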
And what if you live in a cold climate, you say? Well, evaporative cooling is most cost efficient in hot and (ironically) dry climates.
Server farms use water to cool computers. It’s like water cooling your PC, but on a bigger scale.
Aircons condense water from the atmosphere, the same way water forms on your shower mirror: the mirror is colder than the fog.
If you’re familiar with condensation and the rain cycle it should help you understand further
At this point, I wouldn’t be surprised if it turned out that they’re destroying the environment on purpose for some nefarious purpose, e.g. maybe they think it’s easier to rule the masses if natural resources are very scarce.
I think it’s more likely that yellow journalism is making an issue out of something that isn’t as big a deal as they make it out to be.
Since when does yellow journalism care about the environment?
When haven’t they?
Fear and anger sell.
This AI shit is the leftist version of “illegal immigrants are stealing yur jobs”
To be fair, the “infinite scaling” vision Altman and such are selling is quite a dystopia. And they are the ones pushing it.
It’s not reality at all. But it’s kinda reasonable for people to hate that specifically.
Isn’t everything infinitely scaling?
No.
The path I see forward for ML is small, task specific models running on your smartphone or PC, with some kind of bitnet architecture so it uses basically no power.
That’s the hope, anyway, but all the pieces already exist: bitnet works, extreme task-specific training works per a paper that just came out, and NPU frameworks are starting to come together.
If that sounds incompatible with corporate AI, that’s because it is.
No