• BombOmOm@lemmy.world · 10 months ago

    It’s also fairly cheap to buy 32+ GB of RAM; there are lots of choices for under $80. Meanwhile, I’m not even sure how you’d find a video card with 32GB of VRAM (not that you really need that much; 12GB or 16GB is pretty solid for a video card nowadays).

    • Lucy :3@feddit.org · 10 months ago

      Afaik, for consumers only the RTX 5090 has 32GB of VRAM. So you’re correct, it’s practically impossible to find. And even if you do find one, it’s prone to spontaneous combustion.

      For servers, it currently tops out at 288GB with the AMD Instinct MI355X.

      • Anivia@feddit.org · 10 months ago

        Afaik, for consumers only the RTX 5090 has 32GB of VRAM

        Only if you don’t count Apple Silicon with its shared RAM/VRAM. Ironically, a Mac Mini / Studio is currently the cheapest way to get a GPU with lots of VRAM for AI.
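
        For anyone wanting to sanity-check that claim themselves: here’s a minimal sketch, assuming a recent PyTorch build on macOS (PyTorch and the tensor sizes are my own illustration, not something from the comments above). It confirms the Apple GPU can allocate and compute on tensors in that shared memory pool via the MPS backend:

        ```python
        # Sketch: verify the Apple GPU can use unified memory as "VRAM"
        # through PyTorch's Metal Performance Shaders (MPS) backend.
        import torch

        if torch.backends.mps.is_available():
            device = torch.device("mps")
            # This tensor is allocated in unified memory, shared by CPU and GPU.
            x = torch.randn(4096, 4096, device=device)
            y = x @ x  # the matrix multiply runs on the Apple GPU
            print(y.shape)
        else:
            print("MPS backend not available on this machine")
        ```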

      • A7thStone@lemmy.world · 10 months ago

        And they cost more than a high-end PC. I’m not spending $3k on a card that can go up in smoke. Not to mention, all of the honest reviewers I’ve seen say its performance improvements are all smoke and mirrors.