• TheGrandNagus@lemmy.world · 1 month ago

      People buy Nvidia no matter what, even when it isn’t the best choice. Then those same people complain about the anticompetitive things Nvidia does.

      The best is when people cheer for AMD making something great, only so they can buy an Nvidia card more cheaply, as if the only reason AMD exists is to subsidise their Nvidia purchases!

      Nvidia’s greatest asset is the mindshare they have.

      • 9point6@lemmy.world · 1 month ago

        Well, that, and CUDA still means a load of professionals in various fields are stuck using Nvidia whether they like it or not. That in turn incentivises data centers to go with Nvidia if they want those customers, which ultimately means that if you’re going to work on code/tools that run in those data centers, you want the same architecture on your local machine for development and testing.
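
        To make the lock-in concrete, here’s a minimal sketch of what CUDA code looks like (a toy example of my own, not from any particular codebase). The `__global__` kernel syntax and the `cuda*` runtime calls are Nvidia-specific, which is exactly the point: anything written this way needs Nvidia hardware, or a compatibility layer like SCALE or HIP, to run at all.

        ```cpp
        // vecadd.cu — toy CUDA vector addition. Built with Nvidia's toolchain: nvcc vecadd.cu
        #include <cuda_runtime.h>
        #include <cstdio>

        // __global__ marks a function that runs on the GPU; this syntax only
        // compiles with a CUDA(-compatible) compiler.
        __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) c[i] = a[i] + b[i];
        }

        int main() {
            const int n = 1 << 20;
            float *a, *b, *c;
            // Unified memory keeps the example short; real code often uses
            // explicit cudaMalloc + cudaMemcpy instead.
            cudaMallocManaged(&a, n * sizeof(float));
            cudaMallocManaged(&b, n * sizeof(float));
            cudaMallocManaged(&c, n * sizeof(float));
            for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

            // Launch enough 256-thread blocks to cover all n elements.
            vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
            cudaDeviceSynchronize();

            printf("c[0] = %f\n", c[0]); // expect 3.0
            cudaFree(a); cudaFree(b); cudaFree(c);
            return 0;
        }
        ```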

        It’s getting better, but the gap is still real. Hopefully the people working on SCALE can actually get it working on the CDNA GPUs one day, since data centers are where a lot of that CUDA code actually runs. Or perhaps the UDNA stuff AMD just announced will enable this.

        The fact that this all hinges on a third party, the developers of SCALE, should highlight that AMD still doesn’t seem to be playing the same game as Nvidia, which is why we’re still in this position.

        • TheGrandNagus@lemmy.world · 1 month ago

          Definitely. CUDA has had a long head start, and Nvidia were very clever in getting it entrenched early on, particularly in universities and such. It also just… generally does the job.

          My above comment was purely about the gaming side.

      • atro_city@fedia.io · 1 month ago

        100%

        “I want change!”

        *Doesn’t do anything to change*

        “Why hasn’t anything changed?”

      • Lasherz12@lemmy.world · 1 month ago

        I would have much preferred giving AMD my money instead, but even with AMD at their best, the lack of DLSS was a meaningful drawback in the 6000/3000 series era, back when everyone thought Cyberpunk was the new standard of graphical fidelity.

    • NeilBrü@lemmy.world · 25 days ago

      The linear-algebra computations performed on their GPUs’ tensor cores (present since the Turing era), combined with the CUDA and cuDNN software stack, currently give them the fastest performance for training deep neural networks.
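
      For anyone curious what “tensor cores” means at the code level, here’s a minimal sketch using CUDA’s warp-level WMMA API, which is roughly the primitive that libraries like cuDNN and cuBLAS build their kernels on. It’s a toy, not a real GEMM: one warp computing a single 16×16 tile in FP16 with FP32 accumulation (the mixed-precision mode DNN training typically uses); a production kernel tiles this across the whole matrix.

      ```cpp
      // wmma_tile.cu — toy tensor-core matrix multiply (requires sm_70+, e.g. nvcc -arch=sm_70)
      #include <mma.h>
      #include <cuda_fp16.h>
      using namespace nvcuda;

      // One warp computes a single 16x16 tile of C = A*B on the tensor cores:
      // half-precision inputs, float accumulation.
      __global__ void wmmaTile(const half* A, const half* B, float* C) {
          wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> aFrag;
          wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> bFrag;
          wmma::fragment<wmma::accumulator, 16, 16, 16, float> cFrag;

          wmma::fill_fragment(cFrag, 0.0f);
          wmma::load_matrix_sync(aFrag, A, 16);  // 16 = leading dimension
          wmma::load_matrix_sync(bFrag, B, 16);
          wmma::mma_sync(cFrag, aFrag, bFrag, cFrag);  // the actual tensor-core op
          wmma::store_matrix_sync(C, cFrag, 16, wmma::mem_row_major);
      }
      // Launch with a single warp: wmmaTile<<<1, 32>>>(dA, dB, dC);
      ```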

      That may not last forever, but it’s the best in terms of dollars per TOPS that an average DNN developer like me has access to right now.