Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?
Remember when EVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?
Really?
GPUs haven’t been reasonably priced since the 1000 series.
And now there’s no coin mining promising some money back.
You mean Nvidia GPUs? I got my 6750XT for 500€, and I think it’s a good price for the performance I get.
That is still overpriced, I think, although much less egregious than what Nvidia is doing. Launch MSRP for the HD 7850, which was in the same category as the 6700 XT today (upper mid tier), was $250. A few years prior, the 4850 started at $200. Even the RX 480 started at only $230. And those were all very decent cards in their time.
I bought a GTX 780 for $500 MSRP circa 2013. I considered that to be crazy expensive at the time, but I was going all out on that system. Currently I run a GTX 1080 Ti (bought used) with 11GB of VRAM, and they want me to spend $600 for 1 more GB of VRAM? The PS5 has 16GB of shared memory; 16GB should be the entry level of VRAM for a system that's expected to keep up with this generation of graphics. There's no reason for Nvidia to do this other than to force users to upgrade sooner.
Funny part is the market is so fucked that reviewers are lauding this as a decent deal. I think the 1080 Ti will last me until OLED matures and I finally upgrade from a 1080p monitor. According to the Steam survey, most gamers are in a similar boat.
Yeah right? I got my 6700 XT for just over $400USD. It was a great deal.
Just got my brand new 6800 XT for $350, upgrading from a 970. Screw Nvidia.
970 Ti doesn’t even exist
Whoops meant SC
That’s a shit deal when the 4070 is €550
deleted by creator
The new mining is AI… TSMC is at max capacity. They’re not going to waste too many wafers making gaming GPUs when AI accelerators are selling for $30k each.
ugh
Nvidia is overpricing their cards and limiting stock, acting like there is still a GPU shortage from all the crypto bros sucking everything up.
Right now, their competitors are beating them at hundreds of dollars below Nvidia’s MSRP like for like, with the only true advantages Nvidia has being ray tracing and arguably VR.
It’s possible we’re approaching another shortage with the AI bubble, though for the moment that seems to be pretty far off.
TL;DR Nvidia is trying to sell a card at twice its value because of greed.
They’re beating AMD at ray tracing, upsampling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup only for Nvidia for now, and soon AV1 encoding only for Nvidia (at first, anyway).
The raw performance is mostly there for AMD with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using the software ecosystem to entrench themselves despite the insane pricing.
And they beat AMD in efficiency! I’m (not) surprised that people ignore this important aspect which matters in noise, heat and power usage.
Streaming performance is really good on AMD cards, IME. Upscaling is honestly close and getting closer.
I don’t think better RT performance is worth the big premium or the annoyances Nvidia cards bring. Doubly so on Linux.
And AI. They’re beating the pants off AMD at AI.
True enough. I was thinking more of the gaming use case. But even beyond AI and just a general compute workload they’re beating the pants off AMD with CUDA as well.
Couldn’t agree more! Abstracting to a general economic case – those hundreds of dollars are a double digit percentage of the overall cost! Double digit % cost increase for single digit % performance doesn’t quite add up @nvidia :)
Especially with Google going with TPUs for their AI monstrosities, it makes less and less sense at large scale for consumers to pay the Nvidia tax just for CUDA compatibility. Especially with the entrance of things like SYCL that help programmers avoid vendor lock-in.
Why people no buy our GPU anymore?
Because I can get a whole fucking console for the price of a lower midrange GPU. My only hope is Intel’s Battlemage at this point.
Yeah, but then you have to play on a console, without mods or cheap games.
Try buying a used GPU and gaming on a 1080p monitor, and you’ll be able to have great graphics without a lot of money.
I will hold onto my $700 3080 until it spits fire, cannot believe I was lucky enough to get it at launch.
Because you are completely against AMD?
I have an all-AMD system, but they have become too expensive as well. Just Nvidia with a 20% discount, save for the 7900 XTX, which is completely out of the question for me to begin with.
Cheaper Nvidia ain’t bad. This is coming from someone that uses a 3080 Ti and refuses to use AMD GPUs because of shit from way in the past. I use their processors though, those are amazing. I just wish they had support for Thunderbolt.
You can get AMD with Thunderbolt. The motherboards with Thunderbolt headers are bloody expensive, and you’ll need a $200 add-in card (which needs to match the motherboard manufacturer, I think), so it’s not exactly cheap, but it is possible.
I understand you can shoehorn just about anything you want into a system but that’s not the same as supporting it IMO.
Agreed, and in my experience (Asus board) it’s functional but a bit buggy, so not an easy recommendation. Still, if you want or need team red it’s an option. Price premium sucked, but wasn’t actually noticeably more than if I’d gone team blue. Not sure I’d do it again in hindsight though. Fully functional but only 90% reliable (which is worse than it seems, in the same way a delay of “only” a second every time you do something adds up to a big annoyance) is perhaps not worth it for my use case.
Does Intel allow AMD to license Thunderbolt? USB might be better to support in the long term.
Yeah, I’m still running a GTX 970 since GPU prices went bonkers right after I bought it. Last generation with a decent performance/price balance.
Fuck the market. I’ll just stick with this one until it dies on me.
My RTX 4060 Ti has 16GB of VRAM. What on earth makes them think people would go for 12GB?
Not being a power of 2 gives me displeasure.
It is in base 6.
And base 3 and 12. But we don’t really use those numbering systems.
I’ve seen people say that card is absurd. I’m not sure who is right there.
My Nvidia 1070 with 8GB of VRAM is still playing all of my games. Not everything gets Ultra, nor is my monitor 4K. Forever I am the “value buyer”. It’s hard to put money into something that is marginally better though. I thought 16GB would be a no-brainer.
Exactly, people get too caught up in the Digital Foundry-ification of ultra max settings running at a perfect ~120 unlocked frames. Relax my dudes and remember the best games of your life were Perfect Dark with your friends running at 9 FPS.
1080p is fine, medium settings are fine. If the game is good you won’t sweat the details.
remember the best games of your life were Perfect Dark with your friends running at 9 FPS
The frame rate was shat on at the time, and with good reason; that was unplayable for me. Best times were Halo 4-16 local multiplayer.
As someone who really doesn’t care much for game graphics I feel that a comment I wrote a few months ago also fits here:
I’ve never really cared much about graphics in video games, and a game can still be great with even the simplest of graphics - see the Faith series, for example. Interesting story and still has some good scares despite the 8-bit graphics.
To me many of these games with retro aesthetics (either because they’re actually retro or the dev decided to go with a retro style) don’t really feel dated, but rather nostalgic and charming in their own special way.
And many other people also don’t seem to care much about graphics. Minecraft and Roblox are very popular despite having very simplistic graphics, and every now and then a new gameplay video about some horror game with a retro aesthetic will pop up on my recommended, and so far I’ve never seen anyone complain about the graphics, only compliments about them being interesting, nostalgic and charming.
Also I have a potato PC, and it can’t run these modern 8K FPS games anyway, so having these games with simpler graphics that I can actually run is nice. But maybe that’s just me.
I kind of feel the same way about TV resolution. I have a 1080p TV and a 720p TV and I’m honestly fine with them. Sure, there’s better quality out there, but I can always go to the movies if I want that. And I have the advantages of TVs without any ‘smart’ bullshit. They can’t even connect to the internet.
I’m not saying no one else should buy 8k TVs or whatever, if that’s what you want, fine, but there are plenty of people I’ve talked to who feel the same way as me, so I’m glad they haven’t done anything like make us all change to new TVs again like they did when they updated to HD.
I literally have a higher resolution computer monitor than I do TV. My computer monitor costs more than my TV did too!
30fps is fine too on most games…
A friend of mine makes do with a GTX 960 at 720p and is perfectly fine with it; the fun games run, even new ones.
Maybe an upgrade to Digital Foundry-perfect 120fps would be worth it if it weren’t so damn expensive nowadays outside the US.
Not to shill for them, but Alex makes it a point to run tests and to include optimized settings for non-flagship hardware in every review he does. I’m not sure where your Digital Foundry nomenclatures are coming from.
And no, 30fps is not fine…
I was referring to the OP I was responding to.
You lost me at 1080p. It’s a basic quality-of-life thing. Even 1440p is a HUGE upgrade, even for regular computer use, not just gaming.
I run 4K, but I use/need it more for workspace at work than for gaming.
You all should check prices comparing dual-fan 3070s to 4070s; there’s a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry and wound up getting blue-balled hard.
Aren’t they taking the 4080 completely off the market too?
Apparently they stopped production of it months ago. Whatever still exists on shelves is only there because nobody has been buying them.
Honestly this has been the worst 80-class Nvidia card ever. The GTX 480 was a complete joke but even that managed to sell ok.
The RAM is so lame. It really needed more.
Performance exceeding the 3090, but limited by 12 gigs of RAM.
I mean yeah, when I’m searching for GPUs I specifically filter out anything that’s less than 16GB of VRAM. I wouldn’t even consider buying it for that reason alone.
And here I’m thinking upgrading from two 512MB cards to a GTX 1660 SUPER with 6GB VRAM is going to be good for another 10 years. The heck does someone need 16 gigs for?
Gaming in 4K or AI (e.g. Stable Diffusion or language models).
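For the AI use case the VRAM math is pretty unforgiving. Here’s a rough back-of-envelope sketch (purely illustrative numbers, assuming the model weights dominate and there’s roughly 20% extra for activations and framework overhead):

```python
# Rough VRAM estimate for running a language model locally (illustrative only).
def vram_estimate_gb(params_billions, bytes_per_param, overhead=1.2):
    # weights in GB, plus a ~20% cushion for activations / framework overhead
    return params_billions * bytes_per_param * overhead

for name, params, bpp in [
    ("7B model, fp16", 7, 2.0),
    ("7B model, 4-bit quant", 7, 0.5),
    ("13B model, 4-bit quant", 13, 0.5),
]:
    print(f"{name}: ~{vram_estimate_gb(params, bpp):.1f} GB VRAM")
```

By that math a 7B model at fp16 already blows past 12GB, which is a big part of why the local-AI crowd treats 16GB as the floor.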
Future proofing. GPUs are expensive and I expect to be able to use it for at least the next 7 years, even better if it lasts longer than that.
VR uses a lot of RAM.
Unless you’re gaming, that’s fine.
But if you want to play any newer AAA games (even ones less than 5-8 years old) or use more than 1080p, you’ll need better.
AI. But you’re right, my 4GB 5500 XT so far is putting up a valiant fight, though I kinda dread trying out CP77 again after the big patch; it’s under spec now. Was a mistake to buy that thing in the first place, should’ve gone with 8GB, but I just had to be pigheaded with my old “workstation rule”: don’t spend more on the GPU than on the CPU.
deleted by creator
And I thought I had the lamest card on the block with my 2GB. …
I haven’t paid attention to GPUs since I got my 3080 on release day back in Covid.
Why has the acceptable level of VRAM suddenly doubled vs 4 years ago? I don’t struggle to run a single game on max settings at high frames @ 1440p, so what’s the benefit that justifies the cost of 20GB of VRAM outside of AI workloads?
Lmao
We have your comment: what am I doing with 20gb vram?
And one comment down: it’s actually criminal there is only 20gb vram
Lol
Current gen consoles becoming the baseline is probably it.
As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.
That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.
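To put rough numbers on the resolution point above: per-frame render targets scale linearly with pixel count (and textures and ray-tracing structures pile on top of that). A quick illustrative sketch, with a made-up assumption of ten full-resolution buffers at 8 bytes per pixel:

```python
# Rough per-frame render-target memory at different resolutions (illustrative).
BUFFERS = 10          # assumed: G-buffer, depth, post-processing intermediates
BYTES_PER_PIXEL = 8   # assumed: roughly RGBA16F per buffer

for label, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    mb = w * h * BYTES_PER_PIXEL * BUFFERS / (1024 ** 2)
    print(f"{label}: ~{mb:.0f} MB just for render targets")
```

That’s roughly 160 MB at 1080p vs 630 MB at 4K before a single texture is loaded, and the higher-resolution texture packs those games ship with scale the same way, so yesterday’s Recommended spec really does become today’s minimum.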
I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.
If only game developers optimized their games…
The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.
Perhaps not the biggest market, but consumer cards (especially Nvidia’s) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They’re the most logical investment for freelancers and small to mid-size studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.
GPU rendering and AI.
Personally I need it for video editing & 3D work but I get that’s a niche case compared to the gaming market.
laughs in 6800XT
I don’t know about everyone else, but I still play at 1080p. It looks fine to me and I care more about frames than fidelity. More VRAM isn’t going to help me here, so it is not a factor when looking at video cards. Ignoring the fact I just bought a 4070, I wouldn’t skip over a 4070 Super just because it has 12GB of RAM.
This is a card that targets 1440p. It can pull weight at 4k, but I’m not sure if that is justification to slam it for not having the memory for 4k.
It can pull weight at 4k, but I’m not sure if that is justification to slam it for not having the memory for 4k.
There are many games that cut it awfully close with 12GB at 1440p, for some it’s actually not enough. And when Nvidia pushes Raytracing as hard as they do, not giving us the little extra memory we need for that is just a dick move.
Whatever this card costs, 12GB of vram is simply not appropriate.
My monitor is only 1440p, so it’s just what i need. I ordered the Founders Edition card from Best Buy on a whim after I stumbled across it at launch time by coincidence. I’d been mulling over the idea of getting a prebuilt PC to replace my laptop for a few weeks at that point and was on the lookout for sales on ones with a 4070. Guess I’ll be building my own instead now.
I’m fine playing at 30fps, I don’t really notice much of a difference. For me, RAM is the biggest influence on a purchase due to the capabilities it opens up for local AI stuff.
If someone says they don’t notice a difference between 60 FPS and 120+ FPS, I think… okay, it is diminishing returns, 60 is pretty good. But if someone says they don’t notice a difference between 30 and 60… you need to get your eyes checked mate.
I notice a difference, it’s just not enough to make it a big deal for me. It’s like going from 1080 to 1440, you can see it but it’s not really an issue being on 1080.
It depends on the game. In quick, action-packed stuff you can see the jumping, and in something like a shooter it can be a disadvantage.
For something like Slay the Spire tho, totally fine.
I’m at the age where if games require such quick reactions that the difference in FPS matters, I’m going to get my ass handed to me by the younguns anyway…
Well maybe if you had a 240hz monitor… ;)
Totally fair, just worth pointing out that it can/does make a difference in those games, as it can literally mean the difference between firing where someone was rather than where they are, because of how long it takes for you to see the next frame.
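Quick sketch of the frame-time math behind that (it ignores input, network and display latency, which all stack on top):

```python
# How stale the frame you're aiming at can be at different frame rates (illustrative).
for fps in (30, 60, 120, 240):
    frame_ms = 1000 / fps
    print(f"{fps} fps: a new frame every {frame_ms:.1f} ms, "
          f"so the target's position can be up to ~{frame_ms:.1f} ms out of date")
```

Going from 30 to 120 fps cuts that worst case from ~33 ms to ~8 ms, which is exactly the “firing at where someone was” problem in a fast shooter.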
I think the only reason you’d really need that kind of grunt is on a 4K TV anyway, and even then you can use DLSS or whatever the other one is to upscale.
So many options, with small differences between them, all overpriced to the high heavens. I’m sticking with my GTX 1070 since it serves my needs and I’ll likely keep using it a few years beyond that out of spite. It cost $340 at the time I bought it (2016) and I thought that was somewhat overpriced. According to an inflation calculator, that’s $430 in today’s dollars.
1060 6gb here. Loving life atm.
Ditto. It’s a great card and I don’t feel I’m missing out over what newer cards offer.
It’ll do for the few PC games I play. FFXIV doesn’t need much to run. Even handles HL Alyx.
insert linus torvalds nvidia clip here
What’s going on? It’s overpriced and completely unnecessary for most people. There’s also a cost of living crisis.
I play every game I want to on high graphics with my old 1070. Unless you’re working on very graphically intensive apps or you’re a pc master race moron then there’s no need for new cards.
I still game 1080p and it looks fine. I’m not dropping 2500 bucks to get a 4k monitor and video card to run it when I won’t even register the difference during actual gameplay.
It was a night and day difference going from a 1060 6GB to a 6700 XT. The prices are still kinda shit, but that goes for everything.