A new survey conducted by the U.S. Census Bureau and reported on by Apollo seems to show that large companies may be tapping the brakes on AI. Large companies (defined as having more than 250 employees) have reduced their AI usage, according to the data. The slowdown started in June, when adoption among these companies stood at roughly 13.5%; by the end of August it had slipped to about 12%. Most of the other lines in the chart, representing companies with fewer employees, are also declining, though some are still rising.

  • rumba@lemmy.zip · 8 points · 1 day ago

    It’ll right itself when the CEOs stop investing in it and forcing it on their own companies.

    When they’re not getting their returns, they’ll sell their stocks and stop paying for it.

    It’ll eventually go back from slop generation to correction and light-editing tools once venture capital stops paying for the hardware to run tokens and they have to pay to replace the cards.

  • jubilationtcornpone@sh.itjust.works · 50 points · 2 days ago

    Personal Anecdote

    Last week I used the AI coding assistant within JetBrains DataGrip to build a fairly complex PostgreSQL function.

    It put together a very well organized, easily readable function, complete with explanatory comments, that failed to execute because it was absolutely littered with errors.

    I don’t think it saved me any time but it did help remove my brain block by reorganizing my logic and forcing me to think through it from a different perspective. Then again, I could have accomplished the same thing by knocking off work for the day and going to the driving range.

    • August27th@lemmy.ca · 42 points · 2 days ago

      Then again, I could have accomplished the same thing by knocking off work for the day and going to the driving range.

      Hey, look at the bright side, as long as you were chained to your desk instead, that’s all that matters.

    • Cethin@lemmy.zip · 16 points · 2 days ago

      At one point I tried to use a local model to generate something for me. It was full of errors, but after some searching online for a library or existing examples, I found a GitHub repo that was almost an exact copy of what it generated. The comments were the same, and the code was mostly the same, except this version wasn’t fucked up.

      It turns out text prediction isn’t that great at understanding the logic of code. It’s only good at copying existing code, but it doesn’t understand why it works, so the predictive model fucks things up when it takes a less likely result. Maybe if you turned the temperature down so it only ever gives the highest-probability prediction it wouldn’t be horrible, but at that point you might as well just search online and copy the code it’s going to generate anyway.
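
      A minimal sketch of that temperature point, with hypothetical logits and pure Python, showing how lowering the sampling temperature collapses the choice toward the single most likely token (greedy decoding); it is an illustration of the general technique, not anyone’s specific setup:

        # Minimal sketch with hypothetical logits: how sampling temperature
        # changes next-token selection. At temperature ~0 the model always
        # returns its single most likely prediction (greedy decoding); at
        # higher temperatures less likely tokens get sampled more often.
        import math
        import random

        def sample_next_token(logits, temperature=1.0):
            if temperature <= 0:
                # Effectively greedy: always pick the top prediction.
                return max(range(len(logits)), key=lambda i: logits[i])
            scaled = [x / temperature for x in logits]
            m = max(scaled)
            exps = [math.exp(x - m) for x in scaled]   # numerically stable softmax
            total = sum(exps)
            weights = [e / total for e in exps]
            return random.choices(range(len(logits)), weights=weights)[0]

        # Toy example: index 2 is the strongest prediction.
        logits = [1.0, 0.5, 3.0, 0.2]
        print(sample_next_token(logits, temperature=0))    # always 2
        print(sample_next_token(logits, temperature=1.5))  # occasionally something else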

    • UncleMagpie@lemmy.world · +13/−1 · 2 days ago

      The bigger problem is that your skills are weakened a bit every time you use an assistant to write code.

      • KneeTitts@lemmy.world · +6/−1 · 2 days ago

        The bigger problem is that your skills are weakened a bit every time you use an assistant to write code

        Not when you factor in that you are now doing code review for it and fixing all its mistakes…

      • floofloof@lemmy.ca · +4/−1 · 2 days ago (edited)

        It depends how you’re using it. I use it for boilerplate code, for stubbing out classes and functions where I can tell it clearly what I want, for finding inconsistencies I might have missed, to advise me on possible tools and approaches for small things, and as a supplement to the documentation when I can’t find what I’m looking for. I don’t use it for architecting new things, writing complex and specialized code, or as a replacement for documentation. I feel like I have it fairly well contained to what it does well, so I don’t waste my time on what it does badly, and it isn’t really eating away at my coding brain because I still do the tricky bits myself.

      • Honytawk@lemmy.zip · +2/−7 · 2 days ago

        That is just dumb.

        Your skills are weakened even more by copying code from someone else, because you have to use even less of your brain to complete your task.

        Yet you people don’t complain about that part at all and do it yourself all the time. For some it is even the preferred method of work.

        “Using your skills less means they get weaker, who would have thought!”

        By your logic, you shouldn’t use any form of help to code. Programmers should just lock themselves in a big black box until their project is finished; that will make sure their skills aren’t “weakened” by using outside help.

  • eronth@lemmy.dbzer0.com · 17 points · 2 days ago

    Kind of a weird title. Of course adoption would slow: the people who want it have adopted it, and the people who don’t haven’t.

    • KneeTitts@lemmy.world · 10 points · 2 days ago

      We were initially excited by AI at my company, but after we used it a bit we didn’t find any really meaningful use cases for it in our business model. And in most cases we spent a lot of time correcting its many errors, which would actually slow down our processes…

    • UnderpantsWeevil@lemmy.world · 6 points · 2 days ago

      Marx tapping the big sign marked “Tendency of the rate of profit to fall”, but then looking at the already unprofitable AI spin-offs and just throwing his hands up in disgust.

      I think there’s an argument to be made that the AI hype got a bunch of early adopters, but failed to entice more traditional mainstream clients. But the idea that we just ran out of new AI users in… barely two years? No. Nobody is really paying for this shit in a meaningful way. Not at the Enterprise Application scale of subscriptions. That’s why Microsoft is consistently losing money (on the scale of billions) on its OpenAI investment.

      If people were adopting AI like they’d adopted the latest Windows OS, these firms would be seeing a steady growth in the pool of users that would signal profitability soon (if not already). But the estimates they’re throwing out (one billion AI adoptions in barely a year) are entirely predicated on how many people just kinda popped in, looked at the web interface, and lost interest.

    • _haha_oh_wow_@sh.itjust.works · 7 points · 2 days ago (edited)

      It would also slow if companies were told insane lies about the capabilities of “AI” (“it’s like having a team of PhD-level experts at your disposal!”) and then realized that many of those promises were total bullshit.

  • Pat_Riot@lemmy.today · 18 points · 2 days ago

    They dressed up a parrot and called it the golden goose and now they’re chasing a wild goose.

  • RedGreenBlue@lemmy.zip · 18 points · 2 days ago

    For the things AI is good at, like reading documentation, one should just get a local model and be done.

    I think pouring in as much money as the big companies in the US have been doing is unwise. But when you have deep pockets, I guess you can afford to gamble.

      • FauxLiving@lemmy.world · 5 points · 2 days ago

        I’m using Deepseek R1 (8B) and Gemma 3 (12B), installed using LM Studio (which pulls directly from Hugging Face).
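
        For anyone who wants to script against models served that way: LM Studio can expose a local OpenAI-compatible server, so a sketch like the one below should work. The port, placeholder API key, and model name here are assumptions; check what your own LM Studio instance actually reports.

          # Minimal sketch: querying a local LM Studio server through its
          # OpenAI-compatible endpoint. The port, api_key placeholder, and
          # model name are assumptions; adjust to what your server shows.
          from openai import OpenAI

          client = OpenAI(
              base_url="http://localhost:1234/v1",  # assumed default local port
              api_key="lm-studio",                  # local servers typically accept any placeholder
          )

          response = client.chat.completions.create(
              model="gemma-3-12b",  # hypothetical local model identifier
              messages=[{"role": "user", "content": "Summarize what a SQL window function does."}],
              temperature=0.2,
          )
          print(response.choices[0].message.content)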

      • null_dot@lemmy.dbzer0.com · 1 point · 1 day ago

        I don’t have the hardware, so I’m using Open WebUI to run queries on models accessible via the Hugging Face API.

        It works really well. I haven’t invested the time to understand how to use workspaces, which let you tune models, but apparently it’s doable.
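
        For the curious, the same kind of query can also be sent straight to the Hugging Face Inference API without a frontend; a minimal sketch follows, where the model ID and token are placeholders rather than recommendations:

          # Minimal sketch: calling a hosted model via the Hugging Face
          # Inference API, roughly what a frontend like Open WebUI does
          # behind the scenes. Model ID and token are placeholders.
          from huggingface_hub import InferenceClient

          client = InferenceClient(
              model="mistralai/Mistral-7B-Instruct-v0.3",  # example hosted model (assumed available)
              token="hf_your_token_here",                  # personal access token placeholder
          )

          reply = client.text_generation(
              "Explain what a PostgreSQL function is in two sentences.",
              max_new_tokens=120,
          )
          print(reply)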

      • Cethin@lemmy.zip · 2 points · 2 days ago

        As the other comment says, LM Studio is probably the easiest tool. Once you’ve got it installed it’s trivial to add new models. Try some out and see what works best for you. Your hardware will be a limit on what you can run though, so keep that in mind.
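
        On the hardware limit, a rough back-of-envelope sketch of why model size and quantization matter; the 20% overhead factor is an assumption, and real memory use also depends on context length and the runtime:

          # Back-of-envelope sketch: weight memory is roughly parameter count
          # times bytes per weight, plus some overhead for KV cache and runtime.
          def approx_model_gb(params_billion: float, bits_per_weight: int,
                              overhead: float = 1.2) -> float:
              bytes_total = params_billion * 1e9 * (bits_per_weight / 8)
              return bytes_total * overhead / 1e9  # rough GB incl. ~20% overhead (assumed)

          # An 8B model: full 16-bit weights vs a common 4-bit quantization.
          print(f"8B @ 16-bit: ~{approx_model_gb(8, 16):.0f} GB")  # ~19 GB
          print(f"8B @  4-bit: ~{approx_model_gb(8, 4):.0f} GB")   # ~5 GB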

    • KneeTitts@lemmy.world · +1/−8 · 2 days ago

      Finally. Maybe the hype wave has crested.

      Well, one thing I can tell you is that art is gone, forever. They took that from us, from our kids, and from all generations to come.

      • kazerniel@lemmy.world · 1 point · 52 minutes ago

        I don’t think that’s the case; anyone can still make art. Though it’s true that it’s even harder to make a living from art now than it already was.

  • jaykrown@lemmy.world · +16/−4 · 2 days ago

    It is absolutely a bubble, but the applications that AI can be used for still remain while the models continue to get better and cheaper. Here’s the actual graph:

  • underline960@sh.itjust.works · +27/−4 · 2 days ago
    2 days ago

    13.5%, slipping to about 12%

    I know that 1.5 percentage points could mean hundreds of businesses, but this still seems like such a nothingburger.

    • sexy_peach@feddit.org · 31 points · 2 days ago

      The AI companies haven’t even found a viable business model yet; they’re bleeding money while the user base is shrinking.

      • shalafi@lemmy.world · 9 points · 2 days ago

        The lack of business model is what’s freaking me out.

        Around 2003 I was talking to a customer about Google going public and saying he should go all in.

        “Meh, they’re a great search engine, but I can’t see how they’ll make any money.”

        Still remember that conversation, standing in his attic, wiring his new satellite dish. Wonder if he remembers that conversation as well.

        • setsubyou@lemmy.world · 1 point · 1 day ago

          What gets me is that even the traditional business models for LLMs are not great. Like translation, grammar checking, etc. Those existed before the boom really started. DeepL has been around for almost a decade and their services are working reasonably well and they’re still not profitable.

      • underline960@sh.itjust.works · 3 points · 2 days ago

        Isn’t that the case with a lot of modern tech?

        I vaguely recall Spotify and Uber being criticized for relying on the “get big first and figure out how to monetize later” model.

        (Not defending them, just wondering what’s different about AI.)

        • khornechips@sh.itjust.works · 5 points · 2 days ago

          Spotify is a music streaming service with subscription fees generating recurring revenue; it would be fine in a world without an investor class obsessed with infinite growth. Uber is to taxis what crypto is to banks, essentially exploiting a gap in regulations to undercut an existing market.

          “AI” is a solution desperately looking for a problem to justify all the money and resources being wasted on it.

          • underline960@sh.itjust.works · +1/−3 · 2 days ago (edited)

            What are you talking about? ChatGPT, Claude, Gemini, etc. all have “subscription fees generating recurring revenue” and are famously “exploiting a gap in regulations to undercut an existing market.”

            Uber took 15 years to become profitable, and Spotify took 18 years.

            Again, I’m not defending any of them (they all exploit the people who make their service work), but so far AI seems to be going down the same road.

            • khornechips@sh.itjust.works · 4 points · 2 days ago

              Spotify provides a real, tangible service: I pay for access to music, I get access to music.

              What service does an LLM actually provide? They can’t be relied on for accurate information, they can’t reason, the only thing they seem to be able to do is psychologically manipulate their users. That makes money now, but in six months? A year? We’re already seeing usage fall despite some of the wealthiest companies on the planet burning unfathomable amounts of money.

              • underline960@sh.itjust.works · +1/−3 · 2 days ago (edited)

                “I pay for access to music, I get access to music.” And with ChatGPT, you pay for access to an LLM, and you get access to an LLM.

                Just because you personally don’t value that as a service doesn’t inherently invalidate it as a business model, now or in the future.

                Netflix lost subscribers in 2011 and 2022; that didn’t kill the company. Uber stock tumbled during the pandemic and again in 2022. In 2023, Wired was writing about how “despite its popularity… [Spotify] has long struggled to turn consistent profits.”

                This is a whole wave of companies where the survivors seem financially stable now, but had a long history of being propped up by venture capital and having an unclear path to profitability.

                The only difference you’ve actually shown so far is that you don’t think it’s a real service.

                I generally agree, but I still don’t see anything that differentiates its trajectory from the Spotifys, Ubers, and Netflixes of the world.

    • Saleh@feddit.org · 18 points · 2 days ago

      That is more than a 10% relative loss of that customer base in two months: a drop from 13.5% to 12% is roughly an 11% decline.

      For any industry that is huge.

    • CommanderCloon@lemmy.ml · 12 points · 2 days ago

      But they’re already not making money; losing customers during the supposed growth phase is absolutely devastating. It’s occurring all while AI is being subsidized by massive investments from the likes of Microsoft and Google, and many more nameless VCs, through OpenAI, Anthropic, etc.

  • SunSunFuego@lemmy.ml · 13 points · 2 days ago

    Let’s not forget the US is pumping EVERYTHING into AI; 3-4% of GDP is just the AI economy. Here’s hoping it comes crashing down on them.

  • mechoman444@lemmy.world · 6 points · 2 days ago

    Of course. Although AI, or more accurately LLMs, do have useful functions, they are not the Star Trek computer.

    I use ChatGPT as a Grammer check all the time. It’s great for stuff like that. But it’s definitely not an end-all, be-all solution to productivity.

    I think corporations got excited that LLMs could replace human labor… but they can’t.

    • Typhoon@lemmy.ca · 17 points · 2 days ago

      Grammer

      Grammar.

      There’s nothing AI can do that an internet pedant can’t.

      • mechoman444@lemmy.world · +1/−1 · 2 days ago

        No. It’s grammer. No one says grammAR; everyone says it with -er. It’s spelled grammar due to tradition and nothing else. Same reason the ph is still prevalent in the English language.

        Ehhhhhh the English language is terrible!

        • Giblet2708@lemmy.sdf.org · 2 points · 1 day ago

          Sure, English is terrible. Don’t forget dollar, pillar, cougar, burglar, doctor, actor, or aviator. Yet, oddly enough, somehow most people deal with them, and life goes on.

          Go read about The Great Vowel Shift; it’s pretty informative.