• ben@lemmy.zip · 52 points · 23 days ago

      You’re gonna need to sit down for me to tell you about NAND prices

  • Ulrich@feddit.org · +6/−1 · 23 days ago

        Same as everyone else? Because it’s a more cost-effective way of storing data?

    • jim3692@discuss.online · 16 points · 23 days ago

      No. AI workloads benefit from SSDs’ high random read/write performance. Also, I guess, more people started using SSDs for paging/swap as RAM prices skyrocketed.

      This resulted in an SSD shortage immediately after RAM started getting expensive, which in turn caused an HDD shortage, because people still need somewhere to store their data.

      • brucethemoose@lemmy.world · 3 points · edited · 22 days ago

            I don’t even know what they’re using the SSDs for.

            Most businesses are too stupid to train their own models from scratch, and won’t use “foreign” ones so they won’t finetune them either.

        On the inference side… SSDs aren’t used for much. Just storing Docker images/dependencies and model weights for the initial load, and that’s it. Maybe some data for bulk processing, but that’s no different from existing software. The one niche may be KV cache swapping for re-using prompt prefixes, but this is limited and being obsoleted by newer attention mechanisms.
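        For context, KV cache prefix reuse just means caching the attention key/value state computed for a shared prompt prefix (e.g. a fixed system prompt) so repeated requests skip recomputing it. A toy sketch of the idea (purely illustrative; `compute_kv` is a made-up stand-in, and real engines like vLLM manage this in GPU/CPU memory with paged blocks, not a Python dict):

        ```python
        # Toy sketch of KV-cache prefix reuse. Not a real inference engine:
        # compute_kv() stands in for the expensive per-token attention work.
        kv_cache = {}  # prompt prefix -> precomputed "KV state"

        def compute_kv(text):
            # Hypothetical placeholder for building K/V tensors per token.
            return [hash(tok) for tok in text.split()]

        def run_prompt(prefix, suffix):
            if prefix not in kv_cache:
                # Cold path: compute the prefix KV state once and keep it.
                kv_cache[prefix] = compute_kv(prefix)
            # Warm path: reuse the cached prefix state, only compute the suffix.
            return kv_cache[prefix] + compute_kv(suffix)

        # Two requests sharing a system-prompt prefix: the second reuses the cache.
        a = run_prompt("You are a helpful assistant.", "What is NAND?")
        b = run_prompt("You are a helpful assistant.", "What is DRAM?")
        ```

        The point being: this state can be swapped to SSD when it overflows RAM, which is the one inference-side use for fast storage, and it only pays off when many requests actually share long prefixes.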

            So WTF do they even need SSDs and HDDs for? Honestly it feels like FOMO purchasing.