First-of-its-kind US bill would address the environmental costs of the technology, but there’s a long way to go.
One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
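Those figures can be sanity-checked with some back-of-envelope arithmetic. The household consumption and per-search wattage below are assumptions (roughly the commonly cited ballpark numbers), not figures from the article:

```python
# Back-of-envelope check of the "33,000 homes" claim.
# Assumed: an average US household uses ~10,600 kWh of electricity per year.
HOME_KWH_PER_YEAR = 10_600
homes = 33_000

chatgpt_gwh_per_year = homes * HOME_KWH_PER_YEAR / 1e6  # GWh per year
print(f"~{chatgpt_gwh_per_year:.0f} GWh/year")  # ~350 GWh/year

# "Four to five times a conventional search": if a regular web search
# costs ~0.3 Wh (an assumed, widely cited ballpark), then:
conventional_wh = 0.3
ai_search_wh = (4 * conventional_wh, 5 * conventional_wh)
print(f"AI-driven search: {ai_search_wh[0]:.1f}-{ai_search_wh[1]:.1f} Wh each")
```

So the claim works out to a few hundred GWh a year, which is small next to a national grid today but grows fast if per-query cost stays 4-5x higher.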
And nobody seems to give a shit. Even people who would normally give a shit about this sort of thing. Even people who do things like denounce Bitcoin mining's waste of energy (and I agree) are not talking about the energy and water waste from AI systems.
That article says that OpenAI uses 6% of Des Moines' water.
Meanwhile:
According to Colorado State University research, nearly half of the 204 freshwater basins they studied in the United States may not be able to meet the monthly water demand by 2071.
I guess it depends on how you use chatbots. If you’re just too lazy to click on the first google result you get, it’s wasteful to bother ChatGPT with your question. On the other hand, for complex topics, a single answer may save you quite a lot of googling and following links.
Bitcoin was wasteful with little benefit, but AI has the potential to benefit humanity at large. Maybe ChatGPT itself isn't a great example of that, but their research has gone on to spur lots of advancements in AI, advancements that have allowed AI to make all sorts of breakthroughs in areas like medicine.
Yeah, but LLMs like ChatGPT aren't where that advancement is being made. LLMs are driving investment in the technology, but they're a mostly useless investor target that just happens to run on the same hardware that can be used for useful AI-powered research. Sure, it's pushing hardware advancement forward maybe 10-15 years faster than it might have otherwise happened, but it's coming with a lot of wasteful baggage as well, because LLMs are the golden boy investors want to throw money at.
True, the benefit actually exists here (how much is open for debate).
On the other hand, we should be sounding full alarm bells and running around in a panic, ramping down every use of energy possible, before we leave our 100 surviving progeny a lifeless rock to live on. But humans don't work that way. By the time we are all on board it will be 100 years too late, unfortunately.
Honest question, why is AI bad but TVs aren't? What's the environmental cost of millions of people watching Netflix? Using Instagram? Playing video games? Using search engines?
If you wanna get mad at people using computers for their environmental costs why are you starting with AI?
Bitcoin gave legitimate reason for environmental concern: the algorithm was literally based on proof of wasting energy, and that would scale up over time. AI is not like that.
It's no secret, people just don't care. Manufacturers publish power and cooling data on spec sheets, but because people are easily wowed by pure garbage masquerading as breakthroughs and "the future", they simply ignore the costs and push ahead. Add in the fact that most "AI" startups are actual scams, and you've got a corporate incentive to pretend this isn't doing permanent damage, too.
Within years, large AI systems are likely to need as much energy as entire nations.
That doesn't sound like they're taking future hardware optimizations into account; we won't be using GPUs for this purpose forever (as much as Nvidia would like that to be true lol)
Not to mention that increasing usage of AI means AI is producing more useful work in the process, too.
The people running these AIs are paying for the electricity they're using. If the AI isn't doing enough work to make it worth that expense they wouldn't be running them. If the general goal is "reduce electricity usage" then there's no need to target AI, or any other specific use for that matter. Just make electricity in general cost more, and usage will go down. It's basic market forces.
I suspect that most people raging about AIs wouldn't want their energy bill to shoot up, though. They want everyone else to pay for their preferences.
Not that you don't have a point, but there is this theory, paradox, or law or something, it escapes my memory at the moment, which says that when technology advances, so do requirements. So what's going to happen is that when hardware is 100x more efficient, the fucking corporations will use 100x more, and nothing gets solved on the pollution front.
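The idea being reached for here is usually called Jevons paradox (or the rebound effect): efficiency gains get swallowed by increased usage. A toy sketch with entirely made-up numbers:

```python
# Toy illustration of the rebound effect (Jevons paradox).
# All figures below are made up for illustration only.
def total_power(watts_per_unit: float, units: int) -> float:
    """Total draw in watts for a fleet of identical workloads."""
    return watts_per_unit * units

# Today: 1,000 accelerators at ~700 W each.
before = total_power(watts_per_unit=700, units=1_000)

# Hardware becomes 100x more efficient, but demand grows 100x:
after = total_power(watts_per_unit=7, units=100_000)

print(before, after)  # identical totals: efficiency gain fully absorbed
```

Whether demand really scales to exactly cancel the efficiency gain is an empirical question, but that's the mechanism the comment is gesturing at.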
I am betting on renewable energy as the best way to combat the environmental issues we're facing.
Any power saved by hardware design improvements will be consumed by adding more transistors. You will not be seeing a power consumption decrease. Manufacturers of this hardware have been giving talks for the past two years calling for literal power plants to be built co-resident with datacenters.
That was my thought too. I heard a take that we may shift away from GPUs to purpose-built PUs as a way to continue process progress now that we’re getting pretty small on the silicon scale. Neural nets may be one of these special “PU”s we see.
I'm not sure future optimization wouldn't bring more demand. At least, that's what my hardware and apps have shown over a couple of decades. If another startup had the ability to train with an additional billion or trillion parameters, I'm sure they would. It also leads to a wider window for poor optimization.
Tbf, talking about the environmental costs of generative AI is just framing.
The issue is the environmental cost of electricity, no matter what it is used for.
If we want this to be considered in consumption then it needs to be part of the electricity price. And of course all other power sources, like combustion motors, need to also price in external costs.
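As a sketch of what "pricing in external costs" could look like in practice, here is a minimal calculation. Both the grid carbon intensity and the carbon price are assumed round numbers, not policy figures:

```python
# Sketch: folding a carbon externality into the electricity price.
# Assumed: grid intensity ~0.4 kg CO2 per kWh, carbon price ~$100/tonne.
kg_co2_per_kwh = 0.4
usd_per_tonne_co2 = 100.0

# 1 tonne = 1,000 kg, so convert kg/kWh into tonnes/kWh before pricing.
surcharge_per_kwh = kg_co2_per_kwh / 1_000 * usd_per_tonne_co2
print(f"externality surcharge: ${surcharge_per_kwh:.2f}/kWh")  # $0.04/kWh
```

A few cents per kWh applied uniformly would hit AI datacenters, gaming rigs, and combustion-adjacent uses alike, which is exactly the neutrality the comment is arguing for.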
It should be considered. An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI: the AI server has higher power requirements (maybe not as wide a margin as in my first comparison, but there is one). Now multiply that AI server by hundreds more and you start seeing a considerable uptick in power usage.
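Those comparisons can be made concrete with rough numbers. Every wattage below is an assumed ballpark figure for illustration, not a measured spec:

```python
# Rough device power comparison; all wattages are ballpark assumptions.
devices_watts = {
    "raspberry_pi": 7,          # small board under load
    "ultralight_laptop": 30,
    "gaming_rig": 500,
    "ecommerce_server": 400,
    "ai_server_8gpu": 6_000,    # e.g. an 8-accelerator training node
}

# The "multiply by hundreds" step: one modest AI cluster.
cluster_watts = 500 * devices_watts["ai_server_8gpu"]
print(f"500-node AI cluster: {cluster_watts / 1e6:.1f} MW")  # 3.0 MW
```

A single 3 MW cluster already draws more than thousands of gaming rigs running flat out, which is the uptick the comment describes.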
An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI
And if external costs are priced into the cost of electricity then that will be reflected in the cost of operating these devices.
Also, there are far more data servers than servers running AI, which increases the total effect they have.
For so long human progress has been limited by population size.
A large part of the reason we leapt so far over the past century was the huge increase in population size, which allowed for greater subspecialization.
But that population growth is unsustainable if not already well past the practical limit.
If we can successfully get to a point where we have exponential gains in productivity unseen in human history, while also decoupling that progress from the massive resources more humans would require, we might be able to outpace the collective debts we've racked up as a species, without the obsessive focus on procreation (like Musk's) that burdens the next generation with our fuck ups.
33,000 homes is way less than I'd have thought given the degree to which it is being used. And the promise of future hardware revisions like photonics means we'll be looking at exponential decreases in energy consumption while also seeing increases in processing power.