

The issue is that, to the extent that might even make sense, no major player is actually doing anything to make it happen. Every big player is focused exclusively on pulling AI use cases into their datacenters, because that’s how they maintain control and justify subscriptions.
And if you did do it, users would complain that the ‘AI feature’ running on their puny NPU is painfully slow compared to the online alternative.
So that scenario is hypothetical, and vendors are trying to make sales now. ‘AI PC’ doesn’t make any sense as a label because people imagine what you describe, but in reality they can’t tell a difference, since nothing about their ‘AI experience’ works any differently. In practice they get a handful of niche Windows features that most people don’t even know about or want.




My general point is that discussing the intricacies of potential local AI model usage is way over the head of the people who would even in theory care about the facile “AI PC” marketing message. Since no one is making it trivial for the casual user to actually do anything with those NPUs, it’s all a moot point for this sort of marketing. And even if there were an enthusiast market willing to use those embedded NPUs without more capable dedicated hardware, they wouldn’t be swayed or satisfied by ‘AI PC’ or ‘Copilot+’ branding; they’d want to know the specs, not a boolean yes/no for ‘AI’.