I mean literally this. AI is a strong tool when used properly, but we should obviously not lose sight of long-term goals in favor of short-term opportunities.
This was never the fault of AI.
It's always the corporations: by nature, they're designed to chase whatever gets the most profit, at any cost.
Just look at what the meat industry does to living creatures, and to the living employees who have to deal with it.
I don't think people appreciate just how dangerously willing the current AI companies are to set fire to the next few decades of society, all for a little bit of glory right now.
The energy problem is definitely one thing I didn't realize was quite so dire, but we're also on the cusp of total loss of control over our own likenesses, and these companies really couldn't care less.
I tried this recently in hopes of finding an animation pilot. It was too willing to give me completely wrong answers (the most popular things, or even kids' shows), or it would just make a name up. Admittedly, I was using 13b.Q4 models, and they are not the newest ones.
I ended up finding what I was looking for by pure coincidence. I did a generic search for Adult Swim pilots (I had already combed the Wikipedia page and their site), and one of the higher results was a Reddit thread where someone was looking for the same show I was, and had made the same mistake I had: mistaking a Cartoon Hangover short for an Adult Swim pilot.
After that I tried finding an even older and dumber animation that I had gotten on the PSN during the PS3 era; those terms tripped the AI up because it would only give me video games.
(Certain things are probably better to ask than to search for. I'm not sure the computation is worth it, but then again search is pretty garbage these days unless it's an obvious query that won't be mixed up with other newer or more popular terms.)
de Vries, who now works for the Netherlands’ central bank, estimated that if Google were to integrate generative A.I. into every search, its electricity use would rise to something like twenty-nine billion kilowatt-hours per year. This is more than is consumed by many countries, including Kenya, Guatemala, and Croatia.
Why on earth would they do that? Just cache the common questions.
It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)
Ok, so the actual real-world estimate is somewhere on the order of a million kilowatt-hours per day, for the entire globe. Even if we assume that's all from the US, there are 125M households, so that's 4 watt-hours per household per day. An LED light bulb consumes 8 watts. Turn one of those off for half an hour and you've balanced out one household's worth of ChatGPT energy use.
This feels very much like the "turn off your lights to do your part for climate change" distraction from industry and air travel. They've mixed and matched units in their comparisons to make this seem like a massive amount of electricity, but it's basically irrelevant. Even the big AI-in-every-search number only works out to 0.6 kWh/day (again, assuming all search is done by Americans), which isn't great, but is still on the order of "don't spend hours watching a big-screen TV or playing on a gaming computer", and compares against the 29 kWh already being spent.
Math, because this result is so irrelevant it feels like I've done something wrong:
500,000 kWh/day / 125,000,000 US households = 0.004 kWh/household/day
29,000,000,000 kWh/yr / 365 days/yr / 125,000,000 households = 0.6 kWh/household/day, compared to the 29 kWh baseline
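For what it's worth, that arithmetic checks out. Here is the same back-of-envelope calculation as a quick script (all figures are the estimates quoted in this thread):

```python
# Back-of-envelope check of the per-household figures above,
# using the estimates quoted in this thread.
CHATGPT_KWH_PER_DAY = 500_000        # estimated global ChatGPT draw, kWh/day
AI_SEARCH_KWH_PER_YEAR = 29e9        # de Vries's AI-in-every-search estimate
US_HOUSEHOLDS = 125_000_000
BASELINE_KWH_PER_DAY = 29            # average U.S. household

chatgpt_per_household = CHATGPT_KWH_PER_DAY / US_HOUSEHOLDS          # kWh/day
search_per_household = AI_SEARCH_KWH_PER_YEAR / 365 / US_HOUSEHOLDS  # kWh/day

print(f"ChatGPT:   {chatgpt_per_household * 1000:.1f} Wh/household/day")  # 4.0
print(f"AI search: {search_per_household:.2f} kWh/household/day")         # 0.64
print(f"baseline:  {BASELINE_KWH_PER_DAY} kWh/household/day")
```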
It's a good thing, then, that Google has a massive pre-existing business built on caching and updating search responses. The naming-things side of their business could probably use some more work, though.
AI models work in a feedback loop. The fact that you're asking the question becomes part of the response next time. They could cache it, but the model is worse off for it.
Also, they are Google/Microsoft/OpenAI. They will do it because they can and nobody is stopping them.
This is AI for search, not AI as a chatbot. And in the search context many requests are functionally similar and can have the same response. You can extract a theme to create contextual breadcrumbs that will be effectively the same as other people doing similar things. People looking for Thai food in Los Angeles will generally follow similar patterns and need similar responses, even if it comes in the form of several successive searches framed as sentences with different word ordering and choices.
And none of this is updating the model (at least not in a real-time sense that would require re-running a cached search), it's all short-term context fed in as additional inputs.
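As a sketch of what "same response for functionally similar searches" could look like, here's a toy normalizer-plus-cache. The stopword list and key scheme are invented for illustration, not anything Google actually does:

```python
import re

# Hypothetical normalizer: strip filler words and ignore word order so that
# functionally similar searches hit the same cache entry.
STOPWORDS = {"the", "a", "in", "for", "best", "near", "me", "find", "good"}

def cache_key(query: str) -> str:
    words = re.findall(r"[a-z]+", query.lower())
    return " ".join(sorted(w for w in words if w not in STOPWORDS))

cache: dict[str, str] = {}

def answer(query: str) -> str:
    key = cache_key(query)
    if key not in cache:
        cache[key] = expensive_model_call(query)  # only runs on a cache miss
    return cache[key]

def expensive_model_call(query: str) -> str:
    # Stand-in for the actual generative call.
    return f"results for: {query}"

# Both of these normalize to the same key, so the model runs only once:
print(cache_key("best thai food in los angeles"))
print(cache_key("find good Thai food near me, Los Angeles"))
```

Real systems would need much smarter canonicalization (and personalization breaks exact reuse), but the point stands: the cacheable fraction of search-shaped traffic is large.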
The issue is how the electricity is generated, not that it is needed in the first place. This is such a great distraction from the real issue that it has got to be big oil spinning the story this way.
Let’s all hate on AI and crypto because they are ruining the entire environment and if we just stopped them, all would be fine with the planet again /s.
This site says a single ChatGPT query consumes 0.00396 kWh.
Assume an average LED light bulb is 10 watts, or 0.01 kWh per hour. So, if I did the math right (no guarantees there), a single ChatGPT query is roughly equivalent to leaving a light bulb on for about 24 minutes.
So if you make 10 ChatGPT queries per day, it's the equivalent of adding one more light bulb to your house that's on about four hours a day.
Which is definitely not nothing. But isn't the end of the world either.
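Redoing that arithmetic explicitly (the 10 W LED bulb and the quoted per-query figure are the assumptions):

```python
QUERY_KWH = 0.00396      # quoted per-query estimate
BULB_KW = 0.010          # assumed 10 W LED bulb

minutes_per_query = QUERY_KWH / BULB_KW * 60
print(f"one query ~= {minutes_per_query:.0f} minutes of bulb time")

# 10 queries/day vs. adding one more bulb to the house:
bulb_hours_equiv = 10 * QUERY_KWH / BULB_KW
print(f"10 queries/day ~= one extra bulb on {bulb_hours_equiv:.1f} h/day")
```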
It’s also the energy required to train the model. Inference is usually significantly more efficient than training (though not always), because there's no error backpropagation or other training-specific calculation.
Models probably take on the order of 1,000 MWh of energy to train (GPT-3 took 284 MWh by OpenAI's calculation). That's not including the web scraping, data cleaning, and other associated costs (such as cooling the server farms, which is non-trivial).
A coal plant takes roughly 364-500 kg of coal to generate 1 MWh. So for GPT-3 you'd be looking at 103,376 kg (~228 thousand pounds, or ~114 US tons) at minimum just to train it, before anybody has used it, and not counting the other associated energy costs. For comparison, a typical home may use 6 MWh per year, so just training GPT-3 could have powered 47 homes for an entire year.
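Working that through (the 284 MWh figure and the 364 kg/MWh coal rate are the ones quoted above):

```python
TRAIN_MWH = 284            # quoted GPT-3 training energy
COAL_KG_PER_MWH = 364      # low end of the quoted coal rate
HOME_MWH_PER_YEAR = 6      # typical home, per the comment above

coal_kg = TRAIN_MWH * COAL_KG_PER_MWH
coal_lb = coal_kg * 2.20462
print(f"{coal_kg:,} kg of coal ({coal_lb:,.0f} lb, {coal_lb / 2000:.0f} US tons)")
print(f"{TRAIN_MWH / HOME_MWH_PER_YEAR:.1f} home-years of electricity")
```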
Edit: also, it's not nearly as bad as crypto mining. And, as another commenter says, it's totally moot if we have clean sources of energy to fill the need and a grid that can handle it. Unfortunately, right now we have neither.
If you amortize training costs over all inference uses, I don't think ~1,000 MWh is too crazy. For a model like GPT-3 there are likely millions of inference calls to split that cost between.
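As a rough sketch of that amortization (the inference count is a placeholder assumption: one month at the 200M-requests/day rate quoted elsewhere in the thread):

```python
TRAIN_MWH = 284                 # quoted GPT-3 training estimate
CALLS = 200_000_000 * 30        # assumed: one month of ChatGPT-scale traffic

wh_per_call = TRAIN_MWH * 1_000_000 / CALLS   # MWh -> Wh, spread per call
print(f"{wh_per_call:.3f} Wh of training energy amortized per call")
```

At that scale the amortized training energy is small next to the ~4 Wh per-query inference figure discussed above.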
Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)
So ~500 MWh a day, across the globe? And this is all just data-center use? That's not even a tenth of the draw of the newest and largest data center... out of ~11,000 total data centers.
Existing markets are already struggling to meet demand, the report says. In Northern Virginia, the largest data center market in the world at 3,400MW, availability is running at just 0.2 percent.
So a drop in the bucket for a crazy useful tool using mostly existing infrastructure...
The finding that global data centers likely consumed around 205 terawatt-hours (TWh) in 2018, or 1 percent of global electricity use, lies in stark contrast to earlier extrapolation-based estimates that showed rapidly-rising data center energy use over the past decade (Figure 2).
The typical cost of building a solar power plant is between $0.89 and $1.01 per watt. A 1MW (megawatt) solar farm can cost you between $890,000 and $1.01 million... According to GTM Research, 1 MW solar farms require 6–8 acres to accommodate all the necessary infrastructure and space between panel rows.
$300 million and ~2 square miles (7 for reference) to power the entire world's AI use feels like a non-issue to me. A billionaire could literally fund the entire world's daily consumption and not dent their holdings...
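A rough reconstruction of that sizing, with the effective sun-hours per day made an explicit assumption (it varies heavily with siting; the $/W and acres/MW come from the quote above):

```python
DAILY_MWH = 500        # the thread's global AI-inference estimate
COST_PER_WATT = 1.0    # ~$1/W installed, per the quoted range
ACRES_PER_MW = 7       # middle of the quoted 6-8 acres/MW

for sun_hours in (2, 4):               # pessimistic vs. decent siting
    mw_needed = DAILY_MWH / sun_hours
    cost_millions = mw_needed * 1_000_000 * COST_PER_WATT / 1_000_000
    sq_miles = mw_needed * ACRES_PER_MW / 640   # 640 acres per square mile
    print(f"{sun_hours} sun-h/day: {mw_needed:.0f} MW, "
          f"${cost_millions:.0f}M, {sq_miles:.1f} sq mi")
```

Even at the pessimistic end this lands in the same ballpark as the comment's figures: hundreds of millions of dollars and a few square miles.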
This is concerning. Why don't they just stop the never-ending updates and stick with the latest things we have for a moment? Isn't all the tech stuff we have sufficient for the world to keep going?
The bigger companies focus on huge, ever-increasing model sizes instead. Lots of advances are being made with smaller, more affordable models that can run on consumer devices, but the big companies don't focus on that because it can't generate as much profit.