You’re not productive if you don’t use a lot of AI, says guy who makes all of his money selling AI hardware

  • REDACTED@infosec.pub · 2 months ago

    Something I don’t understand: AI coding is mostly useful for common code, snippets, the easy stuff. What Nvidia does (drivers, optimization, chip design, etc.) is work I imagine has close to zero AI training data, so what can they realistically even use it for so much?

    • givesomefucks@lemmy.world · 2 months ago

      “so what can they realistically even use it for so much?”

      Burn money on AI tokens so it looks like AI could be profitable some day, so people keep investing in AI companies that can then buy Nvidia chips…

      You’re thinking of it like “how can AI make a better product”

      They’re looking at it as “how can we sell more chips”

      Two very different questions with very different answers.

      It’s a house of cards, and Nvidia can’t afford to acknowledge that no one wants AI or knows how to make it profitable.

    • frongt@lemmy.zip · 2 months ago

      There’s plenty of driver code available. All of Linux and BSD, plus whatever internal stuff they have. Optimization is pretty generic.

      Chip design, maybe not, but I imagine you could train an AI on the principles, generate a bunch of candidate designs, and then benchmark them in simulation.
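
      The generate-candidates-then-benchmark idea above is just a search loop. A minimal sketch, with everything hypothetical: the “designs” are toy parameter dicts and `simulate` is a stand-in scoring function, not any real chip-design or EDA tool.

      ```python
      import random

      def generate_candidates(n, rng):
          """Generate n random candidate configurations (toy parameters)."""
          return [
              {"cache_kb": rng.choice([256, 512, 1024]),
               "pipeline_depth": rng.randint(8, 20)}
              for _ in range(n)
          ]

      def simulate(design):
          """Stand-in benchmark score. A real flow would run an RTL or
          performance simulator here and return measured throughput."""
          return design["cache_kb"] ** 0.5 + design["pipeline_depth"] * 0.3

      rng = random.Random(0)
      candidates = generate_candidates(100, rng)
      best = max(candidates, key=simulate)
      print(best)
      ```

      In practice the expensive part is the simulation, so the point of the AI would be generating fewer, better candidates rather than random ones.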