Well I am shocked, SHOCKED I say! Well, not that shocked.

  • ChickenLadyLovesLife@lemmy.world · 20 days ago

    I’m enjoying the fact that I haven’t played video games in about 20 years now. The last one I played regularly was Quake II for Super Nintendo (and that was a few years old even then). If I ever get back into gaming, I can just pick up where I left off and play a bunch of new-to-me games on ancient, cheap technology. Like, were gamers less happy in 2005 than they are today? I don’t think so.

    Shit, I still play Civilization III from time to time and it’s fine.

    • GameGod@lemmy.ca · 20 days ago

      That you misremembered the generation of Nintendo console that Quake 2 was on makes this the perfect chef's-kiss millennial boomer comment, lol.

  • chunes@lemmy.world · 20 days ago

    I stopped maintaining a AAA-capable rig in 2016. I’ve been playing indies since and haven’t felt left out whatsoever.

    • MotoAsh@lemmy.world · 20 days ago

      Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…

      • Honytawk@feddit.nl · 19 days ago

        The majority, sure, but there are some gems.

        Baldur's Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, … for example

        You can always wait for a couple of years before playing them, but saying they didn’t miss anything is a gross understatement.

      • JustEnoughDucks@feddit.nl · 19 days ago

        It’s funny, because often they aren’t prettier. Well-optimized, well-made games from 5 or even 10 years ago often look on par with, or better than, the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and a few others), and the disk size is still 10x what it was. They’re just unrefined and unoptimized, and they lean on computationally expensive filters, lighting, sharpening, and antialiasing to make up for the mediocre quality.

    • tea@lemmy.today · 20 days ago

      Indies are great. I can play AAA titles but don’t really ever… It seems like that is where the folks with the most creativity are focusing their energy anyways.

  • gravitas_deficiency@sh.itjust.works · 19 days ago

    Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers, which is an order of magnitude more lucrative than serving the consumer and enthusiast market.

    So my next card is probably gonna be an RX 9070XT.
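
    The "order of magnitude" claim above roughly checks out against public segment figures. A minimal sanity check, using rounded ballpark numbers (assumptions for illustration, not exact filing data):

    ```python
    # Ballpark Nvidia FY2025 revenue by segment, in billions of USD.
    # Rounded illustrative figures, not exact reported numbers.
    data_center_usd_b = 115.0  # ML / data-center hardware
    gaming_usd_b = 11.0        # consumer GPUs

    ratio = data_center_usd_b / gaming_usd_b
    print(f"data center is roughly {ratio:.0f}x gaming revenue")
    ```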

    • ameancow@lemmy.world · 19 days ago

      Even the RX 9070 is running around $900 USD; I cannot fathom affording even state-of-the-art gaming from years ago at this point. I’m still using a GTX 1660, playing games from years ago that I never got around to, and having a grand time. Most adults I know are in the same boat: either not even considering upgrading their PC, or playing their kids’ console games.

      Every year we say “gonna look into upgrading,” but every year prices go up and wages stay the same (or disappear entirely as private equity ravages the business world, digesting every company that isn’t also a private-equity predator), and the prices of just living and eating are insane. At this rate, a lot of us might start reading again.

      • jacksilver@lemmy.world · 19 days ago

        It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than a GPU alone, it’ll be more tempting.

  • Deflated0ne@lemmy.world · 19 days ago

    Ah capitalism…

    Endless, infinite growth forever on a fragile and very much finite planet, where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people already so wealthy that their descendants five generations down the line will still be some of the richest people on the planet.

  • GrindingGears@lemmy.ca · 20 days ago

    The PC industry has turned into a scuzzy hellscape for average Joes who just want decent options at realistic prices. They don’t even care about gaming anymore; it’s about YouTube and BitcoinBruhzz now.

    I’ve still got a pretty decent setup (5800X3D, 4070 Ti), but it’s the last stand for this guy, I’m afraid. Looking over the past decade or so, I’ve honestly had better gaming experiences on consoles for a mere fraction of the price of a PC build, mods and PC-master-race nonsense aside. Sure, you don’t need a subscription for online play on PC (I rarely play online), but you can barely get a processor for what a PS5 costs anymore. Let alone a video card, which is upwards of a month’s take-home pay for a lot of people, the way things are going.

  • finitebanjo@lemmy.world · 19 days ago

    I’m ngl, finances had no impact on my decision to stay on my 3080; performance and support did. Everything I want to play runs at 60 to 180 fps with my current loadout. I’m also afraid that once Windows 10 LTSC dies, I won’t be able to use a high-end GPU with Linux anyway.

    • MisterCD@lemmy.world · 19 days ago

      You can always side-grade to AMD. I was using a 3070 and ditched Windows for Kubuntu and while it was very usable, I would get the slightest input lag and had to make sure the compositor (desktop effects) was turned off when playing a game.

      After some research I decided to side-grade to the 6800 and it’s a night and day difference. Buttery smooth gaming. It performs better with compositor on than Nvidia did with it off. I know 6800 isn’t high end but it’s no slouch either. AMD is king on Linux.

      • Bakkoda@sh.itjust.works · 20 days ago

        I had lost all interest in games for a while. Desktop just ended up with me tinkering in the homelab. Steam deck has been so great to fall in love with gaming again.

  • tiredofsametab@fedia.io · 20 days ago

    I’m on a 2080 or 2090 (I forget which). I thought I’d upgrade to the 40xx now that 5090s are out; I looked at the prices and absolutely not. The 5090s are around 500k JPY, and ordering from the US would work out to about the same after exchange, tax, and whatever tariff exists this week. Salaries here are also much lower on average than in the West, even for those of us in software.

    4070s are still around 100k, which is cheaper than the 250k-ish I saw last time I looked.

    Price aggregator site in Japan if you want to play around: https://kakaku.com/pc/videocard/itemlist.aspx?pdf_Spec103=500 On the left, you’ll see the cards to select and the prices are obvious on the screen.
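
    The "works out to about the same" math above can be sketched roughly. A minimal landed-cost estimate, where the exchange rate, tax, and tariff values are all assumptions to be checked against current figures:

    ```python
    # Rough landed cost of importing a US-priced GPU into Japan.
    # Rate and tax values are illustrative assumptions, not current data.
    JPY_PER_USD = 150.0        # assumed exchange rate
    JP_CONSUMPTION_TAX = 0.10  # 10% consumption tax applied on import

    def usd_to_landed_jpy(price_usd: float, tariff: float = 0.0) -> float:
        """Approximate landed cost in JPY for a US sticker price."""
        return price_usd * JPY_PER_USD * (1 + JP_CONSUMPTION_TAX + tariff)

    # A ~$3,000 US-market 5090 lands near the ~500k JPY local price
    # even before any tariff is added.
    print(round(usd_to_landed_jpy(3000)))
    ```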

  • monogram@feddit.nl · 20 days ago

    I downgraded from a GTX 1060 to a Ryzen 5000G iGPU. Terraria & Factorio don’t need much.

  • Dammam No. 7@lemmy.world · 19 days ago

    I just looked up the price and I was like “Yikes!”. You can get a PS5 Pro (+ optional Blu-ray drive), a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.

  • Demognomicon@lemmy.world · 19 days ago

    I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few percent better performance. Maybe post-Trump, next gen, and/or if prices become reasonable and cables stop melting.

    • boonhet@lemm.ee · 19 days ago

      fake frames

      And that’s my main problem with what the industry has become. Nvidia always had sizable generation-to-generation jumps in raw performance. They STILL get better raw performance, but now it’s nowhere near impressive enough, so they have to pad their graphs with their fake-frame technologies. Don’t get me wrong, they always had questionable marketing tactics, but now it’s getting even worse.

      No idea when I’m replacing my 3060 Ti, but it won’t be Nvidia.
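
      The “fake frames” objection in this subthread boils down to a simple asymmetry: frame generation multiplies the frames displayed, but input latency still tracks the frames actually rendered. A minimal sketch (the fps and multiplier values are illustrative, not benchmarks):

      ```python
      def displayed_fps(rendered_fps: float, gen_factor: int) -> float:
          """Frames shown per second when each rendered frame
          is padded out with generated frames."""
          return rendered_fps * gen_factor

      def input_latency_ms(rendered_fps: float) -> float:
          """Responsiveness follows the rendered frame interval,
          not the inflated displayed count."""
          return 1000.0 / rendered_fps

      # With 4x generation the counter reads 240 fps,
      # but the game still *feels* like 60 fps.
      print(displayed_fps(60, 4))
      print(input_latency_ms(60))
      ```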

    • Critical_Thinker@lemm.ee · 19 days ago

      I don’t think the 5090’s average sale price has been $4k in months; $4k was basically March. $3k is a pretty common listed scalp price now, and completed sales on fleabay commonly run $2600–2800.

      The problem is that $2k was too much to begin with. It should be cheaper, but they’re selling ML cards at such a markup, with truly endless demand, that there’s zero reason to put any focus on the gaming segment beyond a token offering that raises the margin for them. Business-wise they’re doing great, I guess?

      As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia’s total stranglehold via its artificial monopoly on code compatibility makes the hardware almost irrelevant.

      • brucethemoose@lemmy.world · 19 days ago

        One issue is that everyone is supply-constrained by TSMC. Even Arc Battlemage is out of stock at MSRP.

        I bet Intel is kicking themselves for using TSMC. It kinda made sense when they decided years ago, but holy heck, they’d be swimming in market share if they’d used their own fabs instead (and kept the bigger die).

        I feel like another is… marketing?

        Like, many buyers just impulse buy, or go with what some shill recommended in a feed. Doesn’t matter how competitive anything is anymore.

      • finitebanjo@lemmy.world · 19 days ago

        Technically Intel is also releasing some cheapo GPUs of similar capability to Nvidia’s, but they all have the same manufacturer anyways.

        • Critical_Thinker@lemm.ee · 19 days ago

          There are major issues with those GPUs in some commonplace use cases, plus major scalping issues. Sure, in some use cases there are zero issues, but this ain’t the early 2000s, when there were many brands that all basically worked.

          Now you’re either Nvidia with every feature, AMD with most features (kinda like a store brand), or Intel with major compatibility flaws in specific games because it’s only technically a GPU.