• jj4211@lemmy.world · 2 days ago

    More to the point, the casual consumer isn’t going to dig into the nitty-gritty of running models locally, and not a single major player is eager to help them do it (they all want to lock users into their datacenters and subscription opportunities).

    As for Dell keeping NPUs in their laptops, they don’t really have much of a choice if they want modern processors; Intel and AMD are still all-in on them.

    • Goodeye8@piefed.social · 2 days ago

      Setting up a local model was specifically about people who take privacy and security seriously, because that often means sacrificing convenience: in this case, building a suitable server and learning the know-how needed to set up your own local model. Casual consumers don’t really think about privacy, so they’re going to go with the most convenient option, which is whatever service the major players provide.

      As for Dell keeping the NPUs, I forgot they’re going to be bundled with the processors.

      • jj4211@lemmy.world · 2 days ago

        My general point is that discussing the intricacies of potential local AI model usage is way over the head of the people that would even in theory care about the facile “AI PC” marketing message. Since no one is making it trivial for the casual user to actually do anything with those NPUs, then it’s all a moot point for this sort of marketing. Even if there were an enthusiast market that would use those embedded NPUs without a distinct more capable infrastructure, they wouldn’t be swayed/satisfied with just ‘AI PC’ or ‘Copilot+’, they’d want to know specs rather than a boolean yes/no for ‘AI’.