If we managed an optimistic pace of doubling every year, that'd only take... 40 years. The last few survivors on a desert world can ask it whether it was worth it.
Rather amusing prediction: despite the obscene amount of resources already being spent on AI compute, it's apparently reasonable to expect to spend 1,000,000x that in the "near future".
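For a quick sanity check on the doubling arithmetic being thrown around here, a minimal sketch: the 1,000,000x factor is the figure from this thread, and the doubling periods are just illustrative assumptions.

```python
import math

def years_to_scale(target_factor: float, years_per_doubling: float) -> float:
    """Years needed to grow by target_factor if capacity doubles every years_per_doubling years."""
    return math.log2(target_factor) * years_per_doubling

# 1,000,000x is about 20 doublings (2**20 ~= 1.05e6).
print(years_to_scale(1_000_000, 1.0))  # ~19.9 years at one doubling per year
print(years_to_scale(1_000_000, 2.0))  # ~39.9 years at one doubling every two years
```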
There might actually be nothing bad about the Torment Nexus, and the classic sci-fi novel “Don’t Create The Torment Nexus” was nonsense. We shouldn’t be making policy decisions based off of that.
Yes, we know (there are papers about it) that for every increase in LLM capabilities we need exponentially more data to train on. But don't worry, we've only consumed half the world's data training LLMs, so there's still plenty of room to go ;).
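For context, the published scaling-law results (e.g. the Chinchilla paper) express this as a power law in data, which is what makes the data appetite blow up. Here's a rough sketch of that relationship; the constants B and beta below are illustrative assumptions in the ballpark of reported fits, not exact values.

```python
# Sketch of a Chinchilla-style data term: reducible loss ~ B / D**beta.
# B and beta are illustrative assumptions, not fitted values.
B, beta = 410.0, 0.28

def reducible_loss(tokens: float) -> float:
    return B / tokens ** beta

def tokens_needed(target_loss: float) -> float:
    # Invert the power law: every constant-factor drop in loss costs a
    # multiplicative jump in training data.
    return (B / target_loss) ** (1.0 / beta)

base = tokens_needed(0.5)
print(tokens_needed(0.25) / base)  # halving the reducible loss again costs ~2**(1/0.28) ~ 12x the data
```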
Interesting. I recall a phenomenon by which inorganic matter was given a series of criteria and adapted to changes in its environment, eventually forming data that it then learned from over a period of millions of years.
It then used that information to build the world wide web in the lifetime of a single organism and cast doubt on others trying to emulate it.
We do optimize; it's just that when you cut the energy per computation in half, you do twice as many computations to iterate faster instead of using half the energy.
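That's the rebound-effect arithmetic in a nutshell; a minimal sketch with made-up numbers (all figures below are purely illustrative assumptions):

```python
# Rebound effect: total energy = ops * energy_per_op.
# All numbers here are illustrative assumptions.
energy_per_op = 1.0        # arbitrary units
ops = 1_000_000

baseline = ops * energy_per_op

# Efficiency doubles (half the energy per op)...
energy_per_op /= 2
# ...so the freed-up budget goes into twice the ops to iterate faster.
ops *= 2

rebound = ops * energy_per_op
print(baseline == rebound)  # True: same total energy, twice the work
```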