Unfortunately, that's the anti-scalper countermeasure. Crippling their crypto-mining potential didn't impact scalping much, so they raised prices with the RTX 40 series. The RTX 40s were much easier to find than the RTX 30s were, and here we are with the RTX 50s. They're already at the edge of what people will pay, which makes them less attractive to scalpers. We'll probably see an initial wave of scalped 5090s for $3500-$4000, then it will drop off after a few months and the market will mostly have un-scalped ones with fancy coolers for $2200-$2500 from Zotac, MSI, Gigabyte, etc.
Those genres aren't really known for brutal performance requirements. You have to play the bleeding-edge stuff that adds prototype graphics post-processing in its ultra or optional settings.
When you compare non-RT performance, the frame delta is tiny. When you compare RT, it's a lot bigger. I think most RT implementations today are deeply flawed and largely snake oil so far, but some people are obsessed.
I will say you can probably undervolt / underclock / power throttle that 4090 and get great frames per watt.
You're living in the past. Rendering 100% of the frames is called Brute Force Rendering, and that's for losers.
With only 2k trump coins our new graphics card can run Cyberpunk 2077, a four-year-old game, at 30 FPS with RTX ON, but with DLSS and all the other crap magic we can run at 280 FPS!!! Everything is blurry and ugly as fuck, but look at the numbers!!!
Scalpers were basically nonexistent in the 4xxx series. They're not some boogeyman that always raises prices. They work under certain market conditions, conditions which don't currently exist in the GPU space, and there's no particular reason to think this generation will be much different from the last.
Maybe on the initial release, but not for long after.
Maybe I'm stuck in the last decade, but these prices seem insane. I know we've yet to see what a 5050 (lol) or 5060 would be capable of, or its price point. However, launching at $549 as your lowest card means a significant chunk of the consumer base won't be able to buy any of these.
Sadly I think this is the new normal. You could buy a decent GPU, or you could buy an entire game console. Unless you have some other reason to need a strong PC, it just doesn't seem worth the investment.
At least Intel are trying to keep their prices low. Until they either catch on, in which case they'll raise prices to match, or they fade out and leave everyone with unsupported hardware.
Actually, AMD has said they're ditching their high-end options and will also focus on budget and midrange cards. AMD has also promised better ray-tracing performance (compared to their older cards), so I don't think this will be the new norm if AMD prices their cards competitively with Intel. The high-end cards will be overpriced, since the target audience apparently doesn't care that they're paying a shitton of money. But budget and midrange options might slip away from Nvidia and get cheaper, especially if the upscaler crutch breaks and devs have to start doing actual optimization on their games.
They'll sell out anyways due to lack of good competition. Intel is getting there but still has driver issues, and AMD hasn't announced their GPU prices yet, but their entire strategy is following Nvidia and undercutting by 10% or something.
Weird, completely unrelated question: do you have any idea why you write "anyway" as "anyways"?
It's not just you, it's a lot of people, but unlike most grammar/word modifications this one doesn't really make sense to me. Most of the time a modification shortens the word rather than lengthening it. I could be wrong, but I don't remember people writing or saying "anyway" with an added "s" other than ironically until 10-15 years ago, and I'm curious where it may be coming from.
So much of Nvidia's revenue is now datacenters that I wonder if they even care about consumer sales. Their consumer-level cards feel more like an advertising afterthought than actual products.
The performance-improvement claims are a bit shady, as they compare the old FG technique, which generates only one frame for every rendered frame, with the next-gen FG, which can generate up to three.
All the Nvidia performance plots I've seen mention this only in the fine print at the bottom, which supposedly makes the comparison very favorable to the 5000-series GPUs.
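The inflation is just arithmetic: generated frames multiply the displayed frame rate without the GPU rendering any faster. Here's a minimal sketch of the effect, with made-up numbers (the 60 FPS baseline is purely illustrative, not a benchmark result):

```python
def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Displayed frame rate when frame generation inserts N interpolated
    frames per fully rendered frame."""
    return rendered_fps * (1 + generated_per_rendered)

# Suppose both cards render the same 60 "real" frames per second.
old_fg = displayed_fps(60, 1)  # old FG: 1 generated frame per rendered frame
new_fg = displayed_fps(60, 3)  # new FG: up to 3 generated frames

print(old_fg, new_fg, new_fg / old_fg)  # 120.0 240.0 2.0
```

So a chart can show a "2x uplift" even if the underlying rendering throughput is identical; only the ratio of generated to rendered frames changed.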
On the site with the performance graphs, Far Cry and Plague Tale should be more representative if you want to ignore FG. That's still only two games, with first-party benchmarks, so wait for third-party numbers anyway.
Their whole gaming business model now is encouraging devs to cram in features that have no hope of rendering quickly, in order to sell this new frame-generation rubbish.
Nvidia claims the 5070 will give 4090-level performance. That's a huge generational uplift if it's true. Of course, we'll have to wait for independent benchmarks to confirm it.
The best ray tracing I've seen is in older games retrofitted with it, like Quake II or Minecraft.