• kurcatovium@lemm.ee · 3 months ago

      Well, I’m on the other side and am happy AMD doesn’t waste resources on high end cards. The vast majority of people use low to mid range cards anyway, so why spend millions just to compete with Nvidia, when most enthusiasts will buy NV anyway (because DLSS, RTX, whatever hurr durr)? It’s just a waste of money for negligible returns. I don’t know a single person in my social circle who even has a 4K/120Hz display. Most (basically everyone) have either 1080p/144 or 4K/60, and that’s mid range happy rodeo. I do have 1440p/144 and I feel like a weirdo among my friends (why do you need this?).

        • kurcatovium@lemm.ee · 3 months ago

          Well, it is a net waste for the company at some point. AMD is supreme at CPUs now; their lineup has been awesome for years. But look at the last few graphics card generations. When was the last time AMD really competed in the enthusiast market? It’s been falling short for quite some time, so now they’ve admitted it, cut R&D costs, and just skipped the high end while focusing on the best selling segment. And from the reviews it looks like it paid off, as the 9070 looks like a hell of an upgrade!

          Don’t get me wrong. Nv having no opponent is not a win, because they can and will charge even more for their top of the range. But I’m just saying this was a good decision for AMD.

    • SkyeStarfall@lemmy.blahaj.zone · 3 months ago

      If they were in a position to produce it, they would have. I highly doubt they chose not to make high end cards “just because”, especially since that’s where the profit margins would be the highest.

      I think you’re simply expecting a bit too much, and reaching a little too far ahead in terms of your requirements. 4K RT at 120fps is only just starting to become achievable at the absolute highest end, and even then not without compromises.

    • Poopfeast420@discuss.tchncs.de · 3 months ago

      What’s your issue with Linux compatibility and NVIDIA?

      I know the drivers are proprietary and not as good as AMD’s, but my only issue a year or two ago with a 3080 was VRR with multiple monitors, which is supposed to work now.

        • 9488fcea02a9@sh.itjust.works · 3 months ago

          When you say “unrecoverable”, do you mean the graphical desktop didn’t load? Or that you couldn’t even log in to a terminal?

          A lot of newbies assume that not getting to a graphical session means the OS is dead, so they nuke it and start over, when in a lot of cases you can just switch to the command line and troubleshoot or roll back the broken drivers.

          Just for future reference if you try Linux again.
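
          As a rough sketch, assuming a systemd-based distro with the driver installed through the package manager, the usual recovery moves look something like this:

              # Ctrl+Alt+F3 (through F6) drops you to a text console on most distros.
              # After logging in there:
              journalctl -b -1 -p err      # read error logs from the previous (broken) boot
              sudo dnf history list        # Fedora-family: list recent package transactions
              sudo dnf history undo last   # Fedora-family: roll back the last transaction

          Apt has no one-shot undo, so on Debian/Ubuntu you’d reinstall the previous version of the nvidia-driver package instead.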

            • swab148@startrek.website · 3 months ago

              Bazzite being atomic and hard to fix is actually what pushed me to try Arch. I’d been a Debian guy for ~15 years, but I just built my first new desktop in about as long, and I certainly didn’t want it to feel old! So I tried Bazzite, which worked until some update broke a driver. Like you, rolling back didn’t work, and I couldn’t find anything online to help me fix it, so I just said “fuck it” and went straight to Arch. It’s been pretty good so far!
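
              (For reference, in case anyone hits the same wall: on Bazzite and other Fedora Atomic spins the standard escape hatch, as far as I know, is booting the previous deployment, either picked straight from the boot menu or via:

                  rpm-ostree status          # list current and previous deployments
                  sudo rpm-ostree rollback   # make the previous deployment the default
                  systemctl reboot           # boot back into the older image

              though evidently that doesn’t cover every failure mode, as the story above shows.)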