Yep, it's the RAM, but also just a mismatched value proposition.
I think it's clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line's prices, thinking they could just milk the "new normal" without having to change their plans.
But when you move the x070 series out of the mid-tier price bracket ($250-450, let's say), you'd better meet a more premium standard. Instead, they're throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn't help that it's at a time when people generally just have less disposable income.
That is still overpriced, I think, although much less egregious than what Nvidia is doing. Launch MSRP for an HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was $250. A few years prior the 4850 started at $200. Even the RX 480 started at only $230. And those were all very decent cards in their time.
The new mining is AI... TSMC is at max capacity. They're not going to waste too many wafers making gaming GPUs when AI accelerators are selling for $30k each.
Nvidia is overpricing their cards and limiting stock, acting like there's still a GPU shortage from all the crypto bros sucking everything up.
Right now, their competitors are beating them like for like at hundreds of dollars below Nvidia's MSRP, with the only true advantages Nvidia has being ray tracing and arguably VR.
It's possible we're approaching another shortage with the AI bubble, though for the moment that seems to be pretty far off.
TL;DR: Nvidia is trying to sell a card at twice its value because of greed.
They're beating AMD at ray tracing, upsampling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup only for Nvidia for now, and soon AV1 encoding only for Nvidia (at first, anyway).
The raw performance is mostly there for AMD with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using the software ecosystem to entrench themselves despite the insane pricing.
Couldn't agree more! Abstracting to a general economic case -- those hundreds of dollars are a double-digit percentage of the overall cost! A double-digit % cost increase for a single-digit % performance gain doesn't quite add up @nvidia :)
Especially with Google going with TPUs for their AI monstrosities, it makes less and less sense at large scale, or for consumers, to pay the Nvidia tax just for CUDA compatibility. Especially with the arrival of things like SYCL that help programmers avoid vendor lock-in.
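Not SYCL itself, but here's the same portability idea one layer up, as a minimal, hedged Python sketch (assuming PyTorch is installed): the backend is picked at runtime, so the identical code runs on an Nvidia GPU, on an AMD GPU via a ROCm build (which PyTorch also exposes through the "cuda" device string), or on the CPU. The fallback order is just an assumption for the example.

```python
import torch

def pick_device() -> torch.device:
    # Prefer a GPU backend if one is available; fall back to CPU otherwise.
    # On ROCm builds of PyTorch, AMD GPUs are also reported via "cuda",
    # so the same call covers both vendors without code changes.
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(1024, 1024, device=device)
y = x @ x  # same matmul regardless of which vendor's GPU (or the CPU) runs it
print(f"ran on: {device}, hip build: {torch.version.hip is not None}")
```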
I have an all-AMD system, but they have become too expensive as well. They're just Nvidia with a 20% discount, save for the 7900 XTX, which is completely out of the question for me to begin with.
I have a 2060 Super with 8 GB. The VRAM is currently enough for FHD gaming - or at least isn't the bottleneck - so 12 GB might be fine for that use case, BUT I'm also toying around with AI models, and some of the current models already ask for 12 GB of VRAM to run the complete model. It's not that I would never get a 12 GB card as an upgrade, but you can be sure I'd research all the alternatives first, and then it wouldn't be my first choice but a compromise, since it wouldn't future-proof me in this regard.
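As a rough illustration of why 12 GB gets tight for that: weight memory alone is roughly parameter count times bytes per parameter, before activations and KV cache. A minimal sketch with an assumed 7B-parameter model; the numbers are illustrative, not a spec for any particular card or model.

```python
def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate VRAM needed just for the model weights, in GiB."""
    return params_billion * 1e9 * bytes_per_param / 2**30

# Illustrative 7B-parameter model at common precisions (weights only;
# activations, KV cache and framework overhead come on top of this).
for label, bytes_per_param in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"7B @ {label}: ~{weights_gib(7, bytes_per_param):.1f} GiB")
```

At fp16 a 7B model is already ~13 GiB of weights, which is exactly the territory where 12 GB stops being comfortable.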
My Nvidia 1070 with 8 GB of VRAM is still playing all of my games. Not everything gets Ultra, and my monitor isn't 4K. Forever I am the "value buyer". It's hard to put money into something that is only marginally better, though. I thought 16 GB would be a no-brainer.
Exactly, people get too caught up in the Digital Foundry-ification of ultra max settings running at a perfect ~120 unlocked frames. Relax, my dudes, and remember the best games of your life were Perfect Dark with your friends running at 9 FPS.
1080p is fine, medium settings are fine. If the game is good you won't sweat the details.
As someone who really doesn't care much for game graphics I feel that a comment I wrote a few months ago also fits here:
I’ve never really cared much about graphics in video games, and a game can still be great with even the simplest of graphics - see the Faith series, for example. Interesting story and still has some good scares despite the 8-bit graphics.
To me many of these games with retro aesthetics (either because they’re actually retro or the dev decided to go with a retro style) don’t really feel dated, but rather nostalgic and charming in their own special way.
And many other people also don’t seem to care much about graphics. Minecraft and Roblox are very popular despite having very simplistic graphics, and every now and then a new gameplay video about some horror game with a retro aesthetic will pop up on my recommended, and so far I’ve never seen anyone complain about the graphics, only compliments about them being interesting, nostalgic and charming.
Also I have a potato PC, and it can’t run these modern 8K FPS games anyway, so having these games with simpler graphics that I can actually run is nice. But maybe that’s just me.
12 GB of VRAM is not a bottleneck in any current game on reasonable settings. There is no playable game/settings combination where a 7800 XT's 16 GB offers any advantage. Or do you think a 15 fps average is more playable than a 5 fps average (because the 4070 Super is RAM-bottlenecked)? Is this indicative of future potential bottlenecks? Maybe, but I wouldn't be so sure.
The 4070 Super offers significantly superior ray tracing performance, much lower power consumption, superior upscaling (and frame generation) technology, better streaming/encoding stuff and even slightly superior rasterization performance to the 7800 XT. Are these things worth sacrificing for 100€ less and 4 GB more VRAM? For most people they aren't.
AMD's offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, don't even use AMD GPUs. Hell, LTT even made a series of videos about how they had to "suffer" using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.
I have an AMD RX 580 and have bought and recommended AMD GPUs to people since the 9500/9700 Pro series. But my next GPU will almost certainly be an Nvidia one. The only reason people are complaining is that Nvidia can make a better GPU (as shown by the 4090) but chooses not to, while AMD literally can't make better GPUs but chooses to only "competitively" price theirs instead of offering something better. Both companies suck.
D4 on Linux. Literally the only bottleneck is that it eats 11 GB of my 1080 Ti's VRAM for breakfast and then still wants lunch and dinner. Plays 4K on high with perfect fps otherwise. Starts glitching like crazy once VRAM is exhausted after 10-15 minutes.
Zero issues on a 20 GB card. I understand that shitty code in a single game is not exactly a universal example, but it is a valid reason to want more VRAM.
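If anyone wants to watch that VRAM creep happen in real time, here's a minimal sketch that polls NVML every few seconds while the game runs (assuming the pynvml bindings are installed; plain nvidia-smi in a terminal does the same job):

```python
import time

import pynvml  # NVML bindings; install via the nvidia-ml-py package

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust the index if needed
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used/.total are in bytes
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```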
Is this indicative of future potential bottlenecks? Maybe, but I wouldn't be so sure.
This is exactly what I expect. I have seen what happened to my friends with their GTX 970s when 3.5 GB of VRAM wasn't enough anymore. Even though the cards were still rasterizing quickly enough, they weren't useful for certain games anymore. These days I make sure I go for enough VRAM to extend the useful service life of my cards.
And I'm not just talking about buying AMD, I actually do buy them. I first had the HD 5850 with 1 GB, then got my friend's HD 5870, also with 1 GB (don't remember if I used it in CrossFire or just replaced the 5850), then two of my friends each sold me their HD 7850 with 2 GB for cheap and I ran CrossFire, then I bought a new R9 380 with 4 GB when a game that was important to me at the time couldn't deal with CrossFire well, then I bought a used RX 580 with 8 GB and finally the RX 6800 with 16 GB two years ago.
At some point I also bought a used GTX 960 because we were doing some CUDA stuff at University, but that was pretty late, when they weren't current anymore, and it was only used in my Linux server.
You all should check prices comparing dual-fan 3070s to 4070s; there's only about a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry and wound up getting blue-balled hard.
Aren’t they taking the 4080 completely off the market too?
I mean, yeah, when I'm searching for GPUs I specifically filter out anything with less than 16 GB of VRAM. I wouldn't even consider buying it for that reason alone.
And here I am, thinking upgrading from two 512 MB cards to a GTX 1660 SUPER with 6 GB of VRAM is going to be good for another 10 years. What the heck does someone need 16 gigs for?
AI. But you're right, my 4 GB 5500 XT is putting up a valiant fight so far, though I kinda dread trying out CP77 again after the big patch; it's under spec now. It was a mistake to buy that thing in the first place, should've gone with 8 GB, but I just had to be pigheaded with my old "workstation rule" -- don't spend more on the GPU than on the CPU.
I haven't paid attention to GPUs since I got my 3080 on release day back in Covid.
Why has the acceptable level of VRAM suddenly doubled vs. 4 years ago? I don't struggle to run a single game on max settings at high frame rates @ 1440p, so what's the benefit that justifies the cost of 20 GB of VRAM outside of AI workloads?
An actual technical answer: apparently it's because the PS5 and Xbox Series X, while technically regular x86-64 machines, are designed so the CPU and GPU share a single large pool of memory with no loss in performance. That makes it easy to allocate a huge chunk of RAM for the GPU to store textures very quickly, and it also means that as the games industry shifts from developing first for the PS4/Xbox One (which only had 8 GB in total to split between everything) to developing first for the PS5/XSX (which have 16 GB), VRAM requirements are spiking, because it's a lot easier to port to PC if you just keep the assumption that the GPU can handle storing 10-15 GB of texture data at once instead of refactoring your code to reduce VRAM usage.
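A back-of-the-envelope version of that argument, with assumed (not official) numbers for the OS reserve and for how much of the remaining unified pool a port might treat as "VRAM":

```python
# Rough console memory budgets in GiB. The OS-reserve and GPU-share values
# below are assumptions for illustration, not official platform figures.
def gpu_budget(total_gib: float, os_reserve_gib: float, gpu_share: float) -> float:
    """Memory a port might assume is available for GPU data such as textures."""
    return (total_gib - os_reserve_gib) * gpu_share

last_gen = gpu_budget(total_gib=8, os_reserve_gib=2.5, gpu_share=0.6)       # PS4-era
current_gen = gpu_budget(total_gib=16, os_reserve_gib=2.5, gpu_share=0.75)  # PS5-era

print(f"last-gen GPU budget:    ~{last_gen:.1f} GiB")     # ~3.3 GiB
print(f"current-gen GPU budget: ~{current_gen:.1f} GiB")  # ~10.1 GiB
```

Under those assumptions the texture budget a port can take for granted roughly triples, which is one way to see why 8 GB cards were comfortable for PS4-era games and 12 GB starts to feel tight now.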
Current gen consoles becoming the baseline is probably it.
As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it's almost Scrooge-like not to offer 16GB on a £579 GPU.
That said, I think the pricing is still much more of an issue than the RAM. People just don't want to pay these ludicrous prices for a GPU.
Perhaps not the biggest market, but consumer cards (especially Nvidia's) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They're the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.
I don't know about everyone else, but I still play at 1080p. It looks fine to me and I care more about frames than fidelity. More VRAM isn't going to help me here, so it is not a factor when looking at video cards. Ignoring the fact that I just bought a 4070, I wouldn't skip over a 4070 Super just because it has 12 GB of RAM.
This is a card that targets 1440p. It can pull weight at 4k, but I'm not sure if that is justification to slam it for not having the memory for 4k.
It can pull weight at 4k, but I'm not sure if that is justification to slam it for not having the memory for 4k.
There are many games that cut it awfully close with 12 GB at 1440p, and for some it's actually not enough. And when Nvidia pushes ray tracing as hard as they do, not giving us the little extra memory we need for that is just a dick move.
Whatever this card costs, 12 GB of VRAM is simply not appropriate.
I'm fine playing at 30 fps, I don't really notice much of a difference. For me, RAM is the biggest influence on a purchase due to the capabilities it opens up for local AI stuff.
If someone says they don't notice a difference between 60 FPS and 120+ FPS, I think... okay, it's diminishing returns, 60 is pretty good. But if someone says they don't notice a difference between 30 and 60... you need to get your eyes checked, mate.
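For what it's worth, the frame-time math backs that up: going from 30 to 60 fps saves twice as many milliseconds per frame as going from 60 to 120. A quick worked calculation:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# 30 -> 60 fps shaves ~16.7 ms off every frame; 60 -> 120 only shaves ~8.3 ms,
# which is one way to see why the first jump is so much more noticeable.
print(f"30->60 saves {frame_time_ms(30) - frame_time_ms(60):.1f} ms per frame, "
      f"60->120 saves {frame_time_ms(60) - frame_time_ms(120):.1f} ms")
```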
My monitor is only 1440p, so it's just what I need. I ordered the Founders Edition card from Best Buy on a whim after I stumbled across it at launch time by coincidence. I'd been mulling over the idea of getting a prebuilt PC to replace my laptop for a few weeks at that point and was on the lookout for sales on ones with a 4070. Guess I'll be building my own instead now.
I think the only reason you'd really need that kind of grunt is on a 4K TV anyway, and even then you can use DLSS or whatever the other one is to upscale.
So many options, with small differences between them, all overpriced to the high heavens. I'm sticking with my GTX 1070 since it serves my needs and I'll likely keep using it a few years beyond that out of spite. It cost $340 at the time I bought it (2016) and I thought that was somewhat overpriced. According to an inflation calculator, that's $430 in today's dollars.
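A quick sanity check of that inflation figure, using approximate US CPI averages (the index values here are rough assumptions from memory, so treat the result as ballpark):

```python
# Approximate US CPI-U annual averages; rough illustrative values, not official data.
cpi_2016 = 240.0
cpi_2023 = 305.0

price_2016 = 340.0
price_now = price_2016 * (cpi_2023 / cpi_2016)
print(f"${price_2016:.0f} in 2016 is roughly ${price_now:.0f} in today's dollars")  # ~$432
```

Which lands within a few dollars of the ~$430 quoted above.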
What's going on? It's overpriced and completely unnecessary for most people. There's also a cost of living crisis.
I play every game I want to on high graphics with my old 1070. Unless you're working on very graphically intensive apps or you're a PC master race moron, there's no need for new cards.
I still game 1080p and it looks fine. I'm not dropping 2500 bucks to get a 4k monitor and video card to run it when I won't even register the difference during actual gameplay.
I don't think they care. In fact, I think they're going to exit the consumer market eventually; it's just peanuts to them, and the only reason they're still catering to it is to use it as field testing (and you're paying them for the privilege, which is quite ironic).
This. Corporations are lining up in droves for GPUs to run AI applications. Nvidia doesn't care about regular consumers because we aren't even their primary market anymore, just a bonus to be squeezed.
Right? TPUs make more sense at scale (especially for LLMs & similar). The consumer market is more about hype and being a household name than it is about revenue.