• DickFiasco@sh.itjust.works · 6 days ago

    I’ve had so many problems with Nvidia GPUs on Linux over the years that I now refuse to buy anything Nvidia. AMD cards work flawlessly and get very long-term support.

    • Barbecue Cowboy@lemmy.dbzer0.com · 6 days ago

      I’m with you, I know we’ve had a lot of recent Linux converts, but I don’t get why so many who’ve used Linux for years still buy Nvidia.

      Like yeah, there’s going to be some cool stuff, but it’s going to be clunky and temporary.

      • bleistift2@sopuli.xyz · 6 days ago

        When people switch to Linux they don’t do a lot of research beforehand. I, for one, didn’t know that Nvidia doesn’t work well with it until I had been using it for years.

        • devfuuu@lemmy.world · 6 days ago

          It’s a good way for people to learn which companies are fully hostile to the Linux ecosystem.

        • DickFiasco@sh.itjust.works · 5 days ago

          To be fair, Nvidia supports their newer GPUs well enough, so you may not have any problems for a while. But once they decide to end support for a product line, it’s basically a death sentence for that hardware. That’s what happened to me recently with the 470 driver. Older GPU worked fine until a kernel update broke the driver. There’s nobody fixing it anymore, and they won’t open-source even obsolete drivers.

          • ChogChog@lemmy.world · 5 days ago

            I JUST ran into this issue myself. I’m running Proxmox on an old laptop and wanted to use its 750M… which is one of those legacy cards now, which I guess means I’d need to downgrade the kernel to use it?

            I’m not knowledgeable enough to know the risks or work I’d be looking at to get it working so for now, it’s on hiatus.

            • DickFiasco@sh.itjust.works · 5 days ago

              You might be able to use the Nouveau driver with the 750M. Performance won’t be great, but might be sufficient if it’s just for server admin.
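
              If the proprietary packages are still installed, a minimal sketch of steering the system to Nouveau is blacklisting the Nvidia modules (these are the usual file path and module names; rebuild the initramfs afterwards):

```
# /etc/modprobe.d/blacklist-nvidia.conf
blacklist nvidia
blacklist nvidia_drm
blacklist nvidia_modeset
blacklist nvidia_uvm
```

              Cleanly uninstalling the Nvidia packages usually makes this unnecessary, since Nouveau binds the card by default when the blob is absent.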

      • notfromhere@lemmy.ml · 6 days ago

        Even now, CUDA is the gold standard for data science / ML / AI research and development. AMD is slowly bringing around their ROCm platform, and Vulkan is gaining steam in that area. I’d love to ditch my Nvidia cards and go exclusively AMD, but Nvidia supporting CUDA on consumer cards was a seriously smart move that AMD needs to catch up with.

          • AnyOldName3@lemmy.world · 5 days ago

            CUDA is an Nvidia technology and they’ve gone out of their way to make it difficult for a competitor to come up with a compatible implementation. With cross-vendor alternatives like OpenCL and compute shaders, they’ve not put resources into achieving performance parity, so if you write something in both CUDA and OpenCL, and run them both on an Nvidia card, the CUDA-based implementation will go way faster. Most projects prioritise the need to go fast above the need to work on hardware from more than one vendor. Fifteen years ago, an OpenCL-based compute application would run faster on an AMD card than a CUDA-based one would run on an Nvidia card, even if the Nvidia card was a chunk faster in gaming, so it’s not that CUDA’s inherently loads faster. That didn’t give AMD a huge advantage in market share as not very much was going on that cared significantly about GPU compute.

            Also, Nvidia have put a lot of resources over the last fifteen years into adding CUDA support to other people’s projects, so when things did start springing up that needed GPU compute, a lot of them already worked on Nvidia cards.

      • moody@lemmings.world · 6 days ago

        People buy Nvidia for different reasons, but not everyone faces any issues with it in Linux, and so they see no reason to change what they’re already familiar with.

    • ashughes@feddit.uk · 6 days ago

      Yeah, I stopped using Nvidia like 20 years ago. I think my last Nvidia card may have been a GeForce MX, then I switched to a Matrox card for a time before landing on ATI/AMD.

      Back then AMD was only just starting their open source driver efforts so the “good” driver was still proprietary, but I stuck with them to support their efforts with my wallet. I’m glad I did because it’s been well over a decade since I had any GPU issues, and I no longer stress about whether the hardware I buy is going to work or not (so long as the kernel is up to date).

    • DaddleDew@lemmy.world · 6 days ago

      I had an old Nvidia GTX 970 in my previous machine when I switched to Linux, and it was the source of 95% of my problems.

      It died earlier this year, so I finally upgraded to a new machine and put an Intel Arc B580 in it as a stopgap, in hopes that video card prices would regain some sanity in a year or two. No problems whatsoever with it since then.

      Now that AI is about to ruin the GPU market again, I decided to bite the bullet and get myself an AMD RX 9070 XT before prices go through the roof. I ain’t touching Nvidia’s cards with a 10-foot pole. I might be able to sell my B580 for the same price I originally bought it for in a few months.

  • kopasz7@sh.itjust.works · 5 days ago

    According to the Steam HW survey around 6% of users are still using Pascal (10xx) GPUs. That’s about 8.4 million GPUs losing proprietary driver support. What a waste.

    GPU        %
    1060       1.86
    1050 Ti    1.43
    1070       0.78
    1050       0.67
    1080       0.50
    1080 Ti    0.38
    1070 Ti    0.24

    Fixed: 1050 was noted as 1050ti

    • rollerbang@lemmy.world · 5 days ago

      Interesting, I’m about to move one more machine to Linux (the one that’s been off for a while), and I’ve got exactly a 10xx GPU inside lol.

          • Korhaka@sopuli.xyz · 5 days ago

            You don’t have to update your drivers though, isn’t this normal with older hardware?

            • Victor@lemmy.world · 4 days ago

              You don’t have to update your drivers though.

              Not sure if you’re on Windows or Linux, but on Linux we have to take explicit action not to upgrade something when upgrading the rest of the system. It takes more or less significant effort to keep a specific package from upgrading, especially when a change sneaks in like this, in a way that’s hard to judge from the version number alone.

              On Windows you’d be in a situation like “oh, I forgot to update the drivers for three years, well that was lucky.”
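
              For what it’s worth, the blunt instrument on Arch is IgnorePkg in pacman.conf. A minimal sketch, assuming legacy packages named like the AUR’s 470xx series (check your actual package names); note you’d have to pin the kernel too, since a new kernel is what breaks the old module:

```
# /etc/pacman.conf (fragment) -- hold these back during pacman -Syu
[options]
IgnorePkg = nvidia-470xx-dkms nvidia-470xx-utils linux linux-headers
```

              Pinning the kernel has its own security cost, which is exactly why this is effort rather than the default.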

          • kopasz7@sh.itjust.works · 5 days ago

            I believe the same SW version is packaged. Nvidia said they’d drop support in the 580 release, but they’ve shifted it to 590 now.

            The Arch issues are another layer of headache: the maintainers changed the package names, and people broke their systems on update when an incompatible version was pulled in, replacing the one that still has Pascal support.

            • Victor@lemmy.world · 4 days ago

              Not really a problem of Arch, but of the driver release model, then, IMO. You’d have this issue on Windows too if you just upgraded blindly, right? It’s Nvidia’s fault for not versioning/naming their drivers in a way that indicates which set of architectures they support, instead of just incrementing a number willy-nilly.

              • kopasz7@sh.itjust.works · 4 days ago

                It’s 2025, can we not display a warning message in pacman? Or let it switch from nvidia-590 to nvidia-legacy?

                I’m not an Arch user, I admit; I don’t like footguns.

                • Victor@lemmy.world · 4 days ago

                  TIL Arch is a footgun. 🤡 cope. 😉

                  But yeah, I agree: if the package maintainers had been astute there, a warning would probably have been good somehow. Not sure pacman supports pre-install warnings. Maybe? It does support warning about installing a renamed/moved package, but the naming would’ve had to be really weird for everyone involved for that warning to be clear in this case.

      • sbird@sopuli.xyz · 4 days ago

        That’s Linus Torvalds, the guy who made the Linux kernel. I think this was some interview he did, but I’m not sure.

  • kbal@fedia.io · 6 days ago

    Those are the GPUs they were selling — and a whole lot of people were buying — until about five years ago. Not something you’d expect to suddenly be unsupported. I guess Nvidia must be going broke or something, they can’t even afford to maintain their driver software any more.

    • sbird@sopuli.xyz · 4 days ago

      Nvidia isn’t exactly broke… I thought they were the most valuable company in the world? Or the second; sometimes they trade places with Apple.

      • kbal@fedia.io · 4 days ago

        Poor Nvidia… the AI bubble is going to burst, the gamer market has all kinds of reasons to hate them now, and all they’ll have to console themselves with is several trillion dollars.

    • bleistift2@sopuli.xyz · 6 days ago

      I don’t get what needs support, exactly. Maybe I’m not yet fully awake, which tends to make me stupid. But the graphics card doesn’t change. The driver translates OS commands to GPU commands, so if the target is not moving, changes can only be forced by changes to the OS, which puts the responsibility on the kernel devs. What am I missing?

      • kbal@fedia.io · 6 days ago

        The driver needs to interface with the OS kernel which does change, so the driver needs updates. The old Nvidia driver is not open source or free software, so nobody other than Nvidia themselves can practically or legally do it. Nvidia could of course change that if they don’t want to do even the bare minimum of maintenance.
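
        One concrete way to see the coupling: every kernel module records the exact kernel it was built against (its “vermagic”), and by default the kernel won’t load a module built for a different release. A rough sketch of the check, with a made-up version string standing in for the driver build:

```shell
# A module's "vermagic" must match the running kernel, so an out-of-tree
# driver has to be rebuilt (or re-released) for every kernel update.
built_for="6.11.0-arch1-1"   # hypothetical: kernel the driver blob was built against
running="$(uname -r)"        # whatever kernel the system update installed
if [ "$built_for" != "$running" ]; then
    echo "driver module needs a rebuild for kernel $running"
fi
```

        DKMS automates exactly this rebuild, but only while the vendor’s kernel-module source still compiles against the new kernel, which is what Nvidia stops guaranteeing once a branch goes legacy.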

        • bleistift2@sopuli.xyz · 6 days ago

          The driver needs to interface with the OS kernel which does change, so the driver needs updates.

          That’s a false implication. The OS just needs to keep the interface to the kernel stable, just like it has to with every other piece of hardware or software. You don’t just double the current you send over USB and expect cable manufacturers to adapt. As the consumer of the API (which the driver is from the kernel’s point of view) you deal with what you get and don’t make demands to the API provider.

          • kbal@fedia.io · 6 days ago

            Device drivers are not like other software in at least one important way: They have access to and depend on kernel internals which are not visible to applications, and they need to be rebuilt when those change. Something as huge and complicated as a GPU driver depends on quite a lot of them. The kernel does not provide a stable binary interface for drivers so they will frequently need to be recompiled to work with new versions of linux, and then less frequently the source code also needs modification as things are changed, added to, and improved.

            This is not unique to Linux, it’s pretty normal. But it is a deliberate choice that its developers made, and people generally seem to think it was a good one.

          • balsoft@lemmy.ml · 6 days ago

            I don’t generally disagree, but

            You don’t just double the current you send over USB and expect cable manufacturers to adapt

            That’s pretty much how we got to the point where USB is the universal charging standard: by progressively pushing the allowed current from the initially standardized 100 mA all the way to the 5 A of today. A few of those pushes were just manufacturers winging it and pushing/pulling significantly more current than what was standardized, assuming the other side would adapt.

          • Korhaka@sopuli.xyz · 5 days ago

            People love to say Linux is great for old hardware. But not 10 series Nvidia cards apparently?

      • Hirom@beehaw.org · 6 days ago

        Using 10-year-old hardware with 10-year-old drivers on a 10-year-old OS requires no further work.

        The hardware doesn’t change, but the OS does.

        • Korhaka@sopuli.xyz · 5 days ago

          Well, it still worked until this update, so a weeks-old OS and driver were also fine. It’s Arch, so expect it to break. It will probably be fixable; we are Linux users.

      • kbal@fedia.io · 6 days ago

        They started 9 years ago, but they remained popular into 2020, and according to Wikipedia the last new Pascal model was released in 2022. The 1080 and the 1060 are both still pretty high up on the Steam list of the most common GPUs.

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 6 days ago

          What model came out in 2022? The newest I could find was the GT 1010 from 2021 (which is more of a video adapter than an actual graphics card), but that’s the exception. The bulk of them came out in 2016 and 2017: https://www.techpowerup.com/gpu-specs/?f=architecture_Pascal

          Hate to break it to ya, but 2020 was 5 years ago, more than half of these GPUs’ lifespan. Nvidia is a for-profit company, not your friend. You can’t expect them to support every single product they’ve ever released forever. And they’re still doing better than AMD in that regard.

          • kopasz7@sh.itjust.works · 6 days ago

            You can’t expect them to support every single product they’ve ever released forever. And they’re still doing better than AMD in that regard.

            If Nvidia had open-sourced the pre-GSP cards’ drivers, at least there would be a chance of maintaining support. But Nvidia pulled the plug.

            Intel’s and AMD’s drivers in the Mesa project will continue to receive support.

            For example, just this week: Phoronix: “Linux 6.19’s Significant ~30% Performance Boost For Old AMD Radeon GPUs”. These are GCN1 GPUs from 13 years ago.

              • kopasz7@sh.itjust.works · 6 days ago

                Making them open to contributions was the first step, but ok, I won’t engage in this petty tribalism.

                The topic was about Nvidia’s closed-source drivers.

                Valve couldn’t do the same for Pascal GPUs. Nobody but Nvidia has the reclocking firmware, so even the reverse-engineered Nouveau/NVK drivers are stuck at boot clock speeds.

      • ashughes@feddit.uk · 6 days ago

        If they’re going to release things under a proprietary license and send lawyers after individuals just trying to get their hardware to work, then yes, yes I can.

        Don’t want to support it anymore? Fine. Open source it and let the community take over.

  • jaxxed@lemmy.world · 6 days ago

    Here is old man me trying to figure out what PASCAL code there is in the Linux codebase, and how NVIDIA gets to drop it.

  • Korhaka@sopuli.xyz · 5 days ago

    Getting dumped to the CLI is just a standard Arch experience when updating anything, isn’t it? You asked for it, you got it.

  • fodor@lemmy.zip · 5 days ago

    I wasted days of my life getting Nvidia to work on Linux. Too much stress. Screw that. There are better ways to spend time. If I can’t game, that’s OK too.

      • Horsey@lemmy.world · 5 days ago

        AMD is plug and play on Linux. With my 7800XT there isn’t a driver to install. Only issue is that AMD doesn’t make anything that competes with the 5080/5090.

        • Victor@lemmy.world · 5 days ago

          Only “issue” is that AMD doesn’t make anything that competes with the 5080/5090.

          And do you really need the performance of a 5080? Certainly not that of a 5090.

          My 9070 XT runs everything I need at perfectly acceptable rates on maximum settings. AAA games among them.

          • Horsey@lemmy.world · 5 days ago

            That’s such a bad way to look at it. I would’ve bought a 5090 if I could afford it because I want to hold onto the 5090 for almost a decade like I did with my 1080. Depending on prices, it doesn’t make sense to upgrade twice in 10 years because you bought a budget option, and then be stuck trying to sell a budget card. 5090s will hold their value for years to come. Good luck playing AAA titles maxed out in 5 years on a 7800XT.

            • AnyOldName3@lemmy.world · 5 days ago

              Generally, you’ll get better results by spending half as much on GPUs twice as often. Games generally aren’t made expecting all their players to have a current-gen top-of-the-line card, so you don’t benefit much from having a top-of-the-line card at first, and then a couple of generations later, usually there’s a card that outperforms the previous top-of-the-line card that costs half as much as it did, so you end up with a better card in the long run.

              • Korhaka@sopuli.xyz · 5 days ago

                Yeah, I am looking at spending less than I did before though. But when will an under £200 card give like double the performance of a 2070? I don’t want to spend that much for +20%. Unless my current card dies there is little reason to upgrade.

              • Horsey@lemmy.world · 4 days ago

                My 7800XT can’t play Hogwarts Legacy without stuttering (on Linux). I’m really regretting not getting a 5080 at this point.

            • Victor@lemmy.world · 5 days ago

              Good luck playing AAA titles maxed out in 5 years on a 5080 too… 5090 isn’t even considered a consumer card anyway, it’s more like an enthusiast, collector’s item. It’s so expensive compared to its performance value.

              You have to look at performance-to-price ratio. That’s the only metric that matters, and should determine how much you can sell it for when upgrading, and how often you upgrade.

            • Korhaka@sopuli.xyz · 5 days ago

              I don’t want to play AAA games now, why would I want to with 5 more years of further enshittification?

      • Baggie@lemmy.zip · 5 days ago

        Open source drivers are a major plus. I’ve had a much easier time than my partner on NVIDIA. I mean, I make both machines work, but the NVIDIA one has been a real pig at times.

    • IEatDaFeesh@lemmy.world · 5 days ago

      I switched from a 3080 to a 7900 XT. It’s one of the better decisions I’ve made, even though on paper the performance isn’t too far apart.

  • someacnt@sh.itjust.works · 5 days ago

    Fuck, what do I do when they inevitably discontinue support for 20xx? Just cry and accept that I no longer have a computer, as every component costs as much as a house? D:

    • Korhaka@sopuli.xyz · 5 days ago

      Keep using it, you don’t need them to support it to keep using it. All old driver versions still exist.
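
      On Arch specifically, every old build stays downloadable from the Arch Linux Archive, so you can pin back even after the local package cache is cleaned. A sketch (the version in this filename is made up; browse archive.archlinux.org for the one you actually need):

```shell
# Reinstall an archived driver build (archive.archlinux.org keeps old packages).
# The version string below is illustrative only, not a real recommendation.
url="https://archive.archlinux.org/packages/n/nvidia-utils/nvidia-utils-580.82.09-1-x86_64.pkg.tar.zst"
echo "would run: pacman -U $url"
```

      You’d then want to hold the package (and a matching kernel) back with IgnorePkg so the next full upgrade doesn’t undo it.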

    • FrederikNJS@sopuli.xyz · 5 days ago

      Start watching the second hand market. Most of my PC components are bought second hand, and at much cheaper than buying any of those components new.

      None of these components are of course bleeding edge, but still sufficient to play any game I want.

      I bought an AMD Radeon RX 5700 XT this summer for 1000 DKK (~€133 or ~$157).

  • commander@lemmy.world · 4 days ago

    That newer open-source driver is still far behind, but it is progressing. Those graphics cards will have a great new life with modern kernels someday.

  • Don_alForno@feddit.org · 5 days ago

    The last time I updated my driver, BG3 didn’t start anymore. So I really could not care less about driver updates for my 8-year-old card.

    But still, fuck nvidia.

  • zod000@lemmy.dbzer0.com · 6 days ago

    I can’t believe they would do this to poor Borland. I guess I’ll just need to use an AMD GPU for my Turbo Pascal fun.

  • thingsiplay@beehaw.org · 6 days ago

    “Brodie” mentioned. To be fair on the Arch side, they are clear that the system could break with an update and that you should always read the Arch news in case manual intervention is needed. You can’t fault Arch Linux for users not following the instructions. This is pretty much what Arch stands for.

    • Scoopta@programming.dev · 6 days ago

      And IMO, if anything, this is Nvidia’s doing; Arch is just being Arch. Like, it sucks, but I also don’t see a problem with Arch in this instance.

    • solrize@lemmy.ml · 5 days ago

      Brodie

      Thinking Forth was a great book! I’m surprised it came up here though.