The US dictionary Merriam-Webster’s word of the year for 2025 was “slop”, which it defines as “digital content of low quality that is produced, usually in quantity, by means of artificial intelligence”. The choice underlined the fact that while AI is being widely embraced, not least by corporate bosses keen to cut payroll costs, its downsides are also becoming obvious. In 2026, a reckoning with reality for AI represents a growing economic risk.

Ed Zitron, the foul-mouthed figurehead of AI scepticism, argues pretty convincingly that, as things stand, the “unit economics” of the entire industry – the cost of servicing a single customer’s requests set against the price companies are able to charge that customer – just don’t add up. In typically colourful language, he calls them “dogshit”.
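To make the shape of that argument concrete, here is a minimal back-of-the-envelope sketch. Every figure in it is hypothetical and chosen purely for illustration; none of the numbers comes from Zitron or from the article.

```python
# Purely hypothetical illustration of the "unit economics" argument:
# if the compute cost of serving one customer exceeds what that customer
# pays, adding more customers only deepens the loss.
# All figures below are made up for illustration.

monthly_price = 20.00       # hypothetical subscription price per customer ($)
cost_per_request = 0.05     # hypothetical inference cost per request ($)
requests_per_month = 600    # hypothetical usage by one customer

cost_to_serve = cost_per_request * requests_per_month
margin = monthly_price - cost_to_serve

print(f"Revenue per customer: ${monthly_price:.2f}")
print(f"Cost to serve them:   ${cost_to_serve:.2f}")
print(f"Margin per customer:  ${margin:.2f}")  # negative => losses grow with scale
```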

Revenues from AI are rising rapidly as more paying clients sign up, but so far not by enough to cover the wild levels of investment under way: $400bn (£297bn) in 2025, with much more forecast in the next 12 months.

Another vehement sceptic, Cory Doctorow, argues: “These companies are not profitable. They can’t be profitable. They keep the lights on by soaking up hundreds of billions of dollars in other people’s money and then lighting it on fire.”

  • Kaiserschmarrn@feddit.org · 4 days ago

    I am a software engineer, and nearly every time I’ve used a model for something, it made shit up that didn’t work that way or didn’t even exist. I always ended up reading the documentation and fixing the problem myself.

    The only area where AI is somewhat decent in the context of software development is code completion. JetBrains’ models do an OK job in that regard.

    • Tar_Alcaran@sh.itjust.works · 3 days ago

      Oh, you wanted code that actually works with your existing codebase? Sorry, here’s some additional sludge sprinkled on top of the previous crappy answer.