Well I am shocked, SHOCKED I say! Well, not that shocked.
For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It’s crazy that anyone would even consider buying it unless they’re rich or actually need it for something important.
I bought a secondhand 3090 when the 40 series came out for £750. I really don’t need to upgrade. I can even run the bigger AI models locally as I have a huge amount of VRAM.
Games run great and look great. Why would I upgrade?
I’m waiting to see if Intel or AMD come out with something awesome over the next few years. I’m in no rush.
But then the Nvidia xx90 series has never been for the average consumer, and I don't know what gave you that idea.
deleted by creator
What’s wrong with 4k gaming? Just curious
Somehow 4k resolution got a bad rep in the computing world, with people opposing it for both play and productivity.
“You can’t see the difference at 50cm away!” or something like that. Must be bad eyesight I guess.
It's just kind of unnecessary. Gaming in 1440p on something the size of your average computer monitor, hell even just good ol' 1080 HD, is more than sufficient. I mean from 1080 to 4k, sure, there's a difference, but from 1440p it's a lot harder to tell. Nobody cares about your mud puddle reflections cranking along in a game at 120 fps. At least not the normies.
Putting on my dinosaur hat for a second, I spent the first decade of my life gaming in 8/16 bit and 4 color CGA, and I’ve probably spent the last thirty years and god only knows how much money trying to replicate those experiences.
I mean I play at 1440p and I think it's fine… Well it's 3440x1440, problem is I can still see the pixels, and my desk is quite deep. Do I NEED 4k? No. Would I prefer it if I had it? Hell yes, but not enough to spend huge amounts of money that are damaging to an already unrealistic market.
deleted by creator
Have you tried 4k? The difference is definitely noticeable unless you play on like a 20" screen
deleted by creator
Not arguing FPS here lol. Arguing 4k, which you can run at 144Hz in a lot of games even without a 5090. You failed to mention whether you had tried 4k, which I assume you haven't based on the switch to FPS instead of resolution.
I play in 1080p so can’t comment on 4k but I can confirm fps doesn’t seem to affect me after 30fps. I don’t perceive a noticeable difference between 30, 60, 120fps. Haven’t played higher than that. I suspect 4k would probably look better to me than a higher fps though. But I’m happy with 30-60fps and 1080p so…
unless they’re rich or actually need it for something important
Fucking youtubers and crypto miners.
Crypto mining with GPUs is dead; the only relevant mining uses ASICs now. So it would be more accurate to say:
Fucking youtubers and AI.
Fuck I’m old.
I’ve been waiting for a product that makes sense.
I’m still waiting. I can keep waiting
Is this a news story from 4 years ago?
The good games don’t need a high end GPU.
Problem is preordering has been normalized, as has releasing games in pre-alpha state.
Anyone who preorders a digital game is a dummy. Preorders were created to ensure you got some of the limited physical stock.
Clair Obscur runs like shit on my 3090 at 4k :(
Absolutely. True creative games are made by smaller dev teams that aren't forcing ray tracing and lifelike graphics. The new Indiana Jones game isn't a GPU-selling title, and is the only game that I've personally had poor performance on with my 3070 Ti at 1440p.
Terraria minimum specs: “don’t worry bro”
I think the Steam Deck can offer some perspective. If you look at the top games on SD it's like Baldur's Gate, Elden Ring, Cyberpunk, etc., all games that run REALLY poorly. Gamers don't care that much.
plus, i have a 3060. and it’s still amazing.
don’t feel the need to upgrade at all.
me neither. best is a 1070. don’t play newer ‘demanding’ games, nor do i have a system ‘worthy’ of a better card anyway.
deleted by creator
Yeah, my 2080ti can run everything sans ray traced stuff perfectly, though I also haven’t had any issues with Indiana Jones or Doom: The Dark Ages.
Akschually, Doom DA needs to have raytracing enabled at all times, and your card is in the first Nvidia gen that has it. While the 10xx and 20xx series haven't shown much of a difference, and both are still okay for average gaming, there's the planned divide the card producers wanted. The "RTX ON" ad visuals were fancy at best (imho) while consuming too many resources, and now there's the first game that doesn't function without it, pushing consumers to either upgrade their hardware or miss out on big hits. Not the first time it's happened, but it gives a sense of why there was so much media noise about that technology in the beginning.
It seems like gamers have finally realized that the newest GPUs by NVIDIA and AMD are getting out of reach, as a new survey shows that many of them are skipping upgrades this year.
Data on GPU shipments and/or POS sales showing a decline would be much more reliable than a survey.
Surveys can at times suffer from capturing what respondents want to say rather than what they actually do.
I mean, as written the headline statement is always true.
I am horrified by some of the other takeaways, though:
Nearly 3 in 4 gamers (73%) would choose NVIDIA if all GPU brands performed equally.
57% of gamers have been blocked from buying a GPU due to price hikes or scalping, and 43% have delayed or canceled purchases due to other life expenses like rent and bills.
Over 1 in 4 gamers (25%) say $500 is their maximum budget for a GPU today.
Nearly 2 in 3 gamers (62%) would switch to cloud gaming full-time if latency were eliminated, and 42% would skip future GPU upgrades entirely if AI upscaling or cloud services met their performance needs.
That last one is especially horrifying. You don't own games when you cloud game, you simply lease them. We all know what that's done for the preservation of games. Not to mention encouraging the massive amounts of shovelware we get flooded with.
You don’t own games when you cloud game, you simply lease them.
That’s also how it is with a game you purchased to play on your own PC, though. Unless you have it on physical media, your access could be revoked at any time.
I don’t know that cloud gaming moves shovelware in either direction, but it really sucks to see the percentage of people that don’t factor ownership into the process at all, at least on paper.
if latency were eliminated
I’m sure we’d all switch to room temperature fusion for power if we could, too, or use superconductors in our electronics.
That's the problem with surveys, isn't it? What's "latency being eliminated"? In principle it means your streamed game responds as quickly as a local game, which is entirely achievable if your target is running a 30fps client on a handheld device versus streaming 60 fps gameplay from a much more powerful server. We can do that now.
But is that "latency free" if you're comparing it to running something at 240Hz on your gaming PC? With or without frame generation and upscaling? 120 Hz raw? 60Hz on console?
The question isn't can you get latency free, the question is at what point in that chain does the average survey-answering gamer start believing the hype about "latency free streaming"?
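To put rough numbers on that chain (every component latency below is an assumption for illustration, not a measurement of any real service or hardware):

```python
# Rough input-to-photon latency sketch (all component latencies are assumed numbers).

def local_latency_ms(fps, input_sample_ms=8, display_ms=10):
    """Ballpark end-to-end latency for a locally rendered game."""
    frame_ms = 1000 / fps
    return input_sample_ms + 1.5 * frame_ms + display_ms  # ~1.5 frames of render/queue time

def streamed_latency_ms(fps, encode_ms=5, network_rtt_ms=20, decode_ms=5,
                        input_sample_ms=8, display_ms=10):
    """Same idea, plus encode, network round trip, and decode."""
    frame_ms = 1000 / fps
    return (input_sample_ms + 1.5 * frame_ms
            + encode_ms + network_rtt_ms + decode_ms + display_ms)

print(f"local 30 fps handheld : ~{local_latency_ms(30):.0f} ms")    # ~68 ms
print(f"streamed 60 fps       : ~{streamed_latency_ms(60):.0f} ms")  # ~73 ms
print(f"local 240 Hz desktop  : ~{local_latency_ms(240):.0f} ms")    # ~24 ms
```

With those made-up numbers, streamed 60 fps lands in the same ballpark as a local 30 fps handheld, but nowhere near a 240 Hz desktop, which is exactly where the "latency eliminated" wording gets slippery.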
Which is irrelevant to me, because the real problem with cloud gaming has zero to do with latency.
Pretty sure the scalper situation makes a lot of that data meaningless too. Also if they keep supply significantly lower than demand then it’s still going to sell out. The survey mentions one of the reasons people are skipping is supply issues.
That’s why it’s best to focus on absolute unit shipment numbers/POS.
If total units increased compared to the previous generation launch, then people are still buying GPUs.
Absolute numbers/POS is just going to tell you if supply is matching demand (it isn't). The survey is telling us that demand is dropping off too. They're losing from both sides. With the number of AI cards they're selling by the pallet to data centers I don't think they really care that much though.
If I keep playing the same games my current CPU and GPU will do me well for a long time
When did it just become expected that everybody would upgrade GPUs every year and that's supposed to be normal? I never understood people upgrading phones every year either. Both of those things are high cost for minimal gains between years. You really need 3+ years for any meaningful gains. Especially over the last few years.
Sticking with 1440p on desktop has gone very well for me. 2160p isn’t worth the costs in money or perf.
Still rocking a GTX 1070 and I plan on using my Graphene OS Pixel 8 Pro till 2030 (only bought it (used ofc) bc my Huawei Mate 20 Pro died on me in October last year 😔)
"When did it just become expected that everybody would upgrade GPUs every year and that's supposed to be normal?" - that's a really good question, because I don't think normal PC gamers have ever been like that, and still aren't. It's basically part of the culture to stretch your GPU as long as it'll go, so idk who you're complaining about. Yeah, GPU prices are bullshit rn but let's not make up stuff
Nah, there was a time when you’d get a new card every two years and it’d be twice as fast for the same price.
Nowadays the new cards are 10% faster for 15% more money.
I bought a new card last year after running a Vega 64 for ages and I honestly think it might last me ten years because things are only getting worse.
It doesn’t help that the gains have been smaller, and the prices higher.
I've got an RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I'm sure as fuck not paying that kind of money for a video card.
Not to mention the cards have gotten huge and you just about need a nuclear reactor to power them. Melting cables and all.
I just picked up a used RX 6800 XT after doing some research and comparing prices.
The fact that a GPU this old can outperform or match most newer cards at a fraction of the price is insane, but I'm very happy with my purchase. Solid upgrade from my 1070 Ti.
I’m in the same boat.
In general, there's just no way I could ever justify buying an Nvidia card in terms of bang for the buck, it's absolutely ridiculous.
I'll fork over 4 digits for a gfx card when salaries go up by a digit as well.
I have a 6700 XT and a 5700X, and my PC can do VR and play Star Citizen; those are the most demanding things I do on it. Why should I spend almost £1000 to get a 5070 or 9070 plus an AM5 board and processor?
I don’t think they’re actually expecting anyone to upgrade annually. But there’s always someone due for an upgrade, however long it’s been for them. If GPU sales are slowing, that means people are slowing down how often they upgrade.
I just finally upgraded from a 1080 Ti to a 5070 Ti. At high refresh-rate 1440p the 1080 Ti was definitely showing its age and certain games would crash (even with no GPU overclock). Fortunately I was able to get a PNY 5070 Ti for only ~$60 over MSRP at the local Microcenter.
5000 series is a pretty shitty value across the board, but I got a new job (and pay increase) and so it was the right time for me to upgrade after 8 years.
When did it just become expected that everybody would upgrade GPUs every year and that's supposed to be normal?
Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.
Those cards were like what though, $199?
That’s still not cheap when you account for inflation. Of course there’s a world of difference between “not cheap” and what they charge these days.
It’s never been normal to upgrade every year, and it still isn’t. Every three years is probably still more frequent than normal. The issue is there haven’t been reasonable prices for cards for like 8 years, and it’s worse more recently. People who are “due” for an upgrade aren’t because it’s unaffordable.
If consoles can last 6-8 years per gen so can my PC.
Your PC can run 796 of the top 1000 most popular games listed on PCGameBenchmark - at a recommended system level.
That’s more than good enough for me.
I don’t remember exactly when I built this PC but I want to say right before covid, and I haven’t felt any need for an upgrade yet.
Increasingly across many markets, companies are not targeting average or median consumers. They're only chasing whales, the people who can pay the premium. They've decided that mid-tier customers aren't worth it – just chase the top. It also means a lower need for customer support.
Colour me surprised
Resumes gaming with a 1000-series card
Still rocking an EVGA 980 here.
Back when building my PC, I actually considered getting a 980 Ti. Luckily I did go with the GTX 1070
(they were both similarly priced)
I just paid $400 for a refurbished MSI Gaming Z Trio Radeon RX 6800. The most I’ve ever spent. I never want to spend that much again.
Fuck Nvidia anyways. #teamred
I’ve been on Linux since 2018 (my PC is from 2016) and my next GPUs will always be AMD, unless Intel somehow manages to produce an on par GPU
#teamred
Temu Nvidia is so much better, true. Please support the “underdog” billion dollar company.
no don’t buy hardware on Temu
I support the lesser evil option, yes. It's not like I have many other choices now, do I? Thanks to fucking Nvidia.
I wish AMD had something like CUDA that my video rendering software used so I could stop using nvidia.
Fuck those guys too honestly. AMD is fueling this bullshit just as much as Nvidia.
Nvidia is one of the most evil companies out there, responsible for killing nearly all other GPU producers and destroying the market.
So is AMD with their availability of literally three video cards in stock for all of North America at launch. Which in turn just fuels the scalpers. Downvote this all you want guys, AMD is just as complicit in all of this, they’ve fuelled this bullshit just as much.
Nvidia is singlehandedly responsible for killing all competition but AMD. They destroyed all other GPU companies with the nastiest tactics to dominate the market; only AMD has been able to survive. You can't blame AMD for chip shortages, it's the aftershock of the covid pandemic. Never has there been higher demand for chips, especially thanks to the rising EV market.
You can’t say AMD is as bad as Nvidia, as Nvidia is the sole reason the market got ruined in the first place. They are the worst of the worst.
And don’t forget diaper Donny, who destroyed international trade with his fucking tariff wars.
In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.
(Lowest price I can find)
… That is a GPU, with roughly the cost and power usage of an entire, quite high end, gaming PC from 5 years ago… or even just a reasonably high end PC from right now.
…
The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize any lighting, nor textures… which has necessitated the invention of intelligent temporal frame upscaling, and frame generation… the whole, originally advertised point of this all was to make hi fidelity 4k gaming an affordable reality.
This reality is a farce.
…
Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a NewEgg wishlist right now.
RX 9070 + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X) … so far my pretax total for the whole build is under $1500, and, while I need to double and triple check this, I think the math on the power draw works out to a 650 watt power supply being all you'd need… potentially with enough room to also add in some extra internal HDD storage drives, i.e. you've got leftover wattage headroom.
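As a sanity check on that 650 W figure, here's a rough sketch; every wattage in it is an assumed ballpark number pulled from typical spec sheets, not a measurement of this exact build:

```python
# Rough PSU sizing sketch (all wattages are assumed ballpark figures, not measurements).
components_w = {
    "RX 9070 (board power)": 220,
    "mobile Ryzen CPU on the BD795i SE": 120,
    "RAM, SSD, fans, motherboard": 60,
}

total_w = sum(components_w.values())      # ~400 W estimated steady-state draw
headroom = 1.3                            # ~30% margin for transient spikes
recommended_psu_w = total_w * headroom    # ~520 W

print(f"estimated draw: ~{total_w} W, suggested PSU: ~{recommended_psu_w:.0f} W")
```

Under those assumptions a 650 W unit leaves comfortable room for a few extra HDDs.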
If you want to go a bit over the $1500 mark, you could fit this all in a console sized ITX case.
That is almost half the cost of the RTX 5090 alone, and will get you over 90fps in almost all modern games, with ultra settings at 1440p, though you will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing as well with similar framerates, and realistically, probably wait another quarter or two for AMD driver support and FSR 4 to become a bit more mature and properly implemented in said games.
Or you could swap out for an Nvidia card, but seeing as I'm making a Linux gaming PC, you know, for the performance boost from not running Windows, AMD Mesa drivers are where you wanna be.
The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize any lighting, nor textures
You clearly don’t know what you’re talking about here. Ray tracing has nothing to do with textures and very few games force you to use RT. What is “allowing” devs to skimp on optimization (which is also questionable, older games weren’t perfect either) is DLSS and other dynamic resolution + upscaling tech
Doom: The Dark Ages is possibly what they're referring to. id skipped traditional lighting in favour of ray tracing doing it.
Bethesda Studios also has a tendency to use hd textures on features like grass and terrain which can safely be low res.
There is a fair bit of inefficient code floating around because optimisation is considered more expensive than throwing more hardware at a problem, and not just in games. (Bonus points if you outsource the optimisation to someone else's hardware or the modding community.)
That is a prominent example of forced RT… basically, as I described with the TAA example in my other reply…
idTech 8 seems to be the first engine that just literally requires RT for its entire render pipeline to work.
They could theoretically build another version of it off of the Vulkan base, to let you turn RT off… but that would likely be a massive amount of work.
On the bright side… at least the idTech engines are actually well coded, and they put a lot of time into making the engine genuinely good.
I didn’t follow the marketing ecosystem for Doom Dark Ages, but it would have been really shitty if they did not include ‘you need a GPU with RT cores’.
…
On the other end of the engine spectrum:
Bethesda… yeah, they have entirely lost control of their engine, it is a mangled mess of nonsense, the latest Oblivion remaster just uses UE to render things slapped on top of Gamebryo, because no one at Bethesda can actually code worth a damn.
Compare that to oh I dunno, the Source engine.
Go play Titanfall 2. It's a 10 year old game now, built on a modified version of the Portal 2 Source engine.
Still looks great, runs very efficiently, can scale down to older hardware.
Ok, now go play HL Alyx. If you don’t have VR, there are mods that do a decent job of converting it into M+K.
Looks great, runs efficiently.
Neither of them uses RT.
Because you don’t need to, if you take the time to actually optimize both your engine and game design.
I meant they also just don’t bother to optimize texture sizes, didn’t mean to imply they are directly related to ray tracing issues.
Also… more and more games are clearly being designed, and marketed, with ray tracing in mind.
Sure, it's not absolutely forced on in too many games… but TAA often is forced on, because no one can run raytracing without temporal intelligent upscaling and frame gen…
…and a lot of games just feed the pixel motion vectors from their older TAA implementations into the DLSS / FSR implementations, and don’t bother to recode the TAA into just giving the motion vectors as an optional API that doesn’t actually do AA…
… and they often don't do that because they designed their entire render pipeline to only work with TAA on, and half the games' post-processing effects would have to be recoded to work without TAA.
So if you summarize all that: the ‘design for raytracing support’ standard is why many games do not let you turn off TAA.
…
That being said: ray tracing really only makes a significant visual difference in many (not all, but many) situations… if you have very high res textures.
If you don’t, older light rendering methods work almost as well, and run much, much faster.
Ray tracing involves… you know, light rays, bouncing off of models, with textures on them.
Like… if you have a car with a glossy finish, reflecting the entire scene around it in its paint… well, if that reflect map being added to the base car texture is very low res, if it's generated from a world of low res textures… you might as well just use the old cube map method, or other methods, and not bother turning every reflective surface into a ray traced mirror.
Or, if you're doing accumulated lighting in a scene with different colors of lights… that effect is going to be more dramatic, more detailed, more noticeable in a scene with higher res textures on everything being lit.
…
I could write a 60 page report on this topic, but no one is paying me to, so I’m not going to bother.
Saved up for a couple of years and built the best (consumer grade) non nvidia PC I could, 9070XT, 9950X3D, 64gig of RAM. Pretty much top end everything that isn’t Nvidia or just spamming redundant RAM for no reason. The whole thing still costs less than a single RTX 5090 and on average draws less power too.
And then to stick it to the man further you’re running Linux of course, right?
I tried Mint and Ubuntu but Linux dies a horrific death trying to run newly released hardware so I ended up on Ghost Spectre.
(I also assume you're being sarcastic but I'm still salty about wasting a week trying various pieces of advice to make Linux goddamn work)
Try Bazzite. Easy, beginner friendly, but very good hardware support and up to date.
Level1Techs had relevant guidance.
Kernel 6.14 or greater
Mesa 25.1 or greater
Ubuntu and Mint don't have those yet, I don't think, hence your difficult time.
Yep, that's gonna be significantly more powerful than my planned build… and likely somewhere between $500 and $1000 more expensive… but yep, that is how absurd this is, that all of that is still less expensive than an RTX 5090.
I’m guessing you could get all of that to work with a 750 W PSU, 850 W if you also want to have a bunch of storage drives or a lot of cooling, but yeah, you’d only need that full wattage for running raytracing in 4k.
Does that sound about right?
Either way… yeah… imagine an alternate timeline where marketing and industry direction isn't bullshit, where people actually admit things like:
Consoles cannot really do what they claim to do at 4K… at actual 4K.
They use checkerboard upscaling, so basically they're actually running at 2K and scaling up, and it's actually less than 2K in demanding raytraced games, because they're also using FSR or DLSS, oh and the base graphics settings are a mix of what PC gamers would call medium and high, but they don't show console gamers real graphics settings menus, so they don't know that.
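A quick pixel-count comparison shows why "basically 2K" is a fair description (checkerboarding details vary per game; this just assumes the common scheme of shading half the 4K grid each frame):

```python
# Pixels shaded per frame (assuming checkerboard renders half the 4K grid each frame).
native_4k       = 3840 * 2160       # ~8.29 million pixels
checkerboard_4k = native_4k // 2    # ~4.15 million pixels actually shaded
native_1440p    = 2560 * 1440       # ~3.69 million pixels

print(f"native 4K       : {native_4k:,}")
print(f"checkerboard 4K : {checkerboard_4k:,}")
print(f"native 1440p    : {native_1440p:,}")
# Checkerboarded "4K" shades only ~12% more pixels per frame than native 1440p.
```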
Maybe, maybe we could have tried to focus on just perfecting frame per watt and frame per $ efficiency at 2K instead of baffling us with marketing bs and claiming we can just leapfrog to 4K, and more recently, telling people 8K displays make any goddamned sense at all, when in 95% of home setup situations, of any kind, they have no physically possible perceptible gains.
1000W PSU for theoretical maximum draw of all components at once with a good safety margin. But even when running a render I’ve never seen it break 500W.
Still on a 1060 over here.
Sure, I may have to limit FFXIV to 30fps in summer to stop it crashing, but it still runs.
They are talking about skipping 1 or 2 generations, not taking 10 years off.
Hey, it’s not 2026 just yet!
Hey, I’m also on a 1060 still! Admittedly I hardly game anymore, although I am considering another Skyrim playthrough.
I’m running Linux for everything and my GTX 1070 is still chugging along trying to power my 1440p 144hz monitor ^^’
Well, I mostly just play strategy games and CS2 (which I do have to run on almost the lowest possible settings without FSR. I basically turn everything to lowest, except for the lowest AA setting that's still on and dynamic shadows so I don't have a disadvantage, and get 110-180 fps depending on the situation)
But I’m planning on buying a used Radeon 9070 XT and just inserting it into my current build (i7 6800k based lololol) and on eventually buying a new build around it
(A 750W 80 Plus Platinum PSU should be able to handle a new 9070 XT)