This is getting out of hand. The other day I saw the requirements for the game “Hell is Us” and they’re ridiculous. My RX6600 can’t play anything anymore. I’ve downloaded several PS3 ROMs and now I’m playing old games. So much better than this insanity. This is probably what I’ll be doing from now on: playing old games.

Edit: I wanted to edit the post for more context and to vent/rant a little.

I don’t want to say I made a mistake, but I buy everything used, and I scored a good deal on two 27" 4k monitors from Facebook Marketplace. Got both monitors for $120.

They go for $800 used on Amazon. Great monitors, and I love 4k. I also bought an AMD RX6600 GPU for $100 from Facebook. It was almost new; the owner upgraded and wanted to get rid of it. My whole build was very cheap compared to what I see some folks spend (genuinely happy for those who can afford it. Life is too short. Enjoy it while you can).

I can’t afford these high-end GPUs, but now very few games work even on low settings, and I get something like 20 FPS max. My friend gave me access to his Steam library and I wanted to play Indiana Jones the other day, and it was an “omfg, wtf is this horrible shit” moment.
I’m so sick of this shit!

I don’t regret buying any of these, but man it sucks that the games I want to play barely even work.
So, now I’m emulating and it’s actually pretty awesome. I missed out on so many games in my youth, so now I’m just going to catch up. Everything runs in 4k, I’m getting my full 60 FPS, and I’m having so much fun.

  • lorty@lemmy.ml · 5 days ago

    You want to play at 4k on an old low end card. I’m sorry but this one is on you.

  • hedgehog@ttrpg.network · 6 days ago

    Have you tried just setting the resolution to 1920x1080 or are you literally trying to run AAA games at 4K on a card that was targeting 1080p when it was released, 4 and a half years ago?

    • DonutsRMeh@lemmy.world (OP) · 6 days ago

      I didn’t mention it in the OP, but Indiana Jones was running like shit at 1080p on low settings. The fucking game forces DLSS. This is where gaming is heading: forced DLSS and forced garbage, so we’re forced to buy expensive shit.

      • Mesophar@pawb.social · 5 days ago

        That’s just Indiana Jones and bad optimization. There are still plenty of other, newer games that should run perfectly fine. Of course the big, chunky games marketed as “Look at how graphically intense this is! Look at the ray tracing!” are going to run poorly.

        Though I will absolutely agree that a lot of studios are throwing optimization out the window when developing new games, just relying on the latest hardware to power through it.

  • JohnnyWanker@sh.itjust.works · 6 days ago

    It’s not just insane power requirements. It’s putting graphics bling ahead of playability and fun.

    I recently bought Doom 2016 and Mosa Lina. I’ve had more fun with the latter than the former, even though I’ve been a Doom and Doom II player all my life.

  • JoYo@lemmy.ml · 6 days ago

    4k gaming is a scam. I’ve been using 1440p for 20 years and have never had issues running any game that comes out at max settings.

      • Sylvartas@lemmy.dbzer0.com · 6 days ago

        It is a scam if you’re buying 27" monitors like OP. You can only cram so much DPI into a monitor before you hit diminishing returns. I’ve been playing at 1440p on a 27" screen for a while, and I can barely see the pixels even with my eyes 10 cm from the screen (and I’ve been playing Arma Reforger, so there’s been a lot of squinting at bushes through a high-powered scope lately).
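        For reference, a rough back-of-the-envelope sketch of the pixel densities involved, using the standard PPI formula (the sizes and resolutions are just the examples mentioned in this thread, nothing authoritative):

            # PPI = diagonal resolution in pixels / diagonal size in inches
            import math

            def ppi(width_px, height_px, diagonal_in):
                # pixels per inch for a display of the given resolution and diagonal size
                return math.hypot(width_px, height_px) / diagonal_in

            print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
            print(f'27" 4k:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
            print(f'32" 4k:    {ppi(3840, 2160, 32):.0f} PPI')  # ~138 PPI

        Whether that counts as diminishing returns is subjective, but the numbers at least show 4k on a 27" panel is roughly a 50% jump in density over 1440p.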

        I’ve also used a 4k, 32" screen for a long time at work (in gamedev, so I wasn’t just looking at Excel files either… well, actually I also was, but that’s beside the point) and couldn’t really tell the difference from my home setup on that front (though I admit 32" at 1440p doesn’t always look great; I tried that for a while too). Really, the most noticeable things were the HDR and the ludicrous FPS I could get from having a top-of-the-line CPU and GPU (and 128 GB of RAM also helped a bit, I guess).

        • dogs0n@sh.itjust.works · 3 days ago

          Good point. I didn’t keep OP’s display size in mind when I read the “4k is a scam” comment, which may have been their point.

          In my reply, I’m saying 4k isn’t a scam in general.

  • Scrubbles@poptalk.scrubbles.tech · 6 days ago

    Okay, I’m going to go against the grain and will probably get downvoted to hell, but this is not new. This is PC gaming. This has always been PC gaming. Hot take: you don’t need 4k@60fps to have fun playing games.

    New games require top-of-the-line hardware (or hardware that doesn’t even exist yet) for high-to-ultra settings. Always have, always will. (Hell, we had an entire meme about “Can it run Crysis?”, a game that for years could only run at low-to-medium settings even on the highest-end machines.) Game makers don’t just want their games to work now; they want them to still look great in 5 years. Unless you’ve shelled out over a grand this year for the absolute latest GPU, you should not expect any new game to run on great settings.

    In fact, I do keep my PC fairly bleeding edge and I can’t drive more than High settings on most games - and that’s okay. Eventually I’ll play them on Ultra, when hardware catches up. It’s fine.

    And as for low- to mid-level hardware, I was there too - and that’s just PC gaming, friend. I played Borderlands and Left 4 Dead the year they came out on a very old Radeon card at 640x480 in windowed mode, on medium settings, at about 40 FPS.

    Again, this is just what PC gaming is. If you want crisp ultra graphics, you’re going to have to shell out the ultra payments. Otherwise, fine-tuning low to medium settings and making peace with sub-60 FPS is all fairly normal.

    Personally, when I upgrade I find great joy in going back and “rediscovering” some of the older games and playing them on ultra for the first time.

    • WereCat@lemmy.world · 6 days ago

      I disagree, and I’m too lazy to explain, but in short: it’s not about High/Ultra settings, those are just names for the presets. It’s about how games look, play, etc. versus how they perform. And I don’t remember PC gaming ever being this bad before, even back when we got shitty console ports.

      • dogs0n@sh.itjust.works · 6 days ago

        You were so lazy that it sounds like you agree with the og comment, not disagree. In essence, I think your comment aligns with the one you’re replying to.

        (Calling you so lazy is a light-hearted joke that ties into my point.)

        • WereCat@lemmy.world · 6 days ago

          4:30AM 2.6.2025 BC (before coffee) morning me wrote that comment.

          I honestly just glanced at the comment I was replying to and dreamt the rest.