• haungack@lemmy.dbzer0.com · +2 · 5 days ago

      And what happens when a bubble bursts? Did the internet die when the dotcom bubble burst, or is that just when it really started to get going?

      I share most of your sentiments against AI, but a bubble popping won’t make it go away, and it won’t even reshape it to be more to people’s liking (I doubt it). It takes more than just waiting around to accomplish that.

    • Zexks@lemmy.world · +7 / -4 · 5 days ago

      I’m so tired of this stupid fucking refrain. Cause we all know how housing got so much better after ’08, and how we don’t have any more dot-coms, and how the internet got so much better since that bubble. You people have no idea what you’re even asking for.

      • pulsey@feddit.org · +3 · 5 days ago (edited)

        The bubble pops and then everything comes back in an “improved” version. Imagine: ChatGPT with ads and sponsored answers.

        Just like that one Black Mirror episode.

      • BarneyPiccolo@lemmy.today · +2 · 5 days ago

        The problem is that when a bubble pops and exposes what went wrong, government leaders should take the opportunity to fix it so it doesn’t happen again.

        Instead they bail them all out, so not only are there no consequences for their actions, they are literally rewarded with unimaginable wealth. What about this strategy would induce them to change their ways, rather than doing it all over again and getting rewarded again?

    • HugeNerd@lemmy.ca · +8 / -5 · 6 days ago

      Me too. I can then go back to 3D printing quantum blockchains out of room temperature superconductors in my private space station with Katy Perry.

  • Prox@lemmy.world · +142 / -1 · 7 days ago

    Isn’t this true of like everything AI right now?

    We’re in the “grow a locked-in user base” part of their rollout. We’ll hit the “make money” part in a year or two, and then the enshittification machine will kick into high gear.

    • Ghostalmedia@lemmy.world · +5 · 7 days ago

      Yeah, it’s basically like early days of cable, Uber, Instacart, streaming, etc. They have a lot of capital and are running at a loss to capture the market. Once companies have secured a customer base, they start jacking up the prices.

        • Ghostalmedia@lemmy.world · +2 · 6 days ago

          There is a lot of top-down shit, but there are definitely a bunch of non-C-suite enterprise customers out there. A lot of product managers are curious about this shit.

        • zerozaku@lemmy.world · +1 · 6 days ago

          There are billions of free users available. All they need to do is strip off a few excellent features from their free tier and hide them behind a paywall, annnnd voilà, these free users have now become paying customers!

    • woelkchen@lemmy.world · +5 · 7 days ago

      We’re in the “grow a locked-in user base” part of their rollout.

      An attempt at that. It will be partially successful, but with AI accelerators coming to more and more consumer hardware, the hurdles to self-hosting keep getting lower.

      I have no clue how to set up an LLM server but installing https://github.com/Acly/krita-ai-tools is easily done with a few mouse clicks. The Krita plugin handles all the background tasks.
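
      For anyone wondering what self-hosting actually looks like once a server is running, here’s a minimal sketch, assuming an Ollama-style backend listening on localhost:11434 (the endpoint, model name, and response field here are assumptions; check the docs of whatever backend you actually run):

        import requests

        # Hypothetical local setup: an Ollama-style server on localhost:11434.
        # Endpoint and field names are assumptions; adjust for your backend.
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={
                "model": "llama3",  # whatever model you pulled locally
                "prompt": "Explain inpainting in one sentence.",
                "stream": False,    # return one JSON object instead of a stream
            },
            timeout=120,
        )
        resp.raise_for_status()
        print(resp.json().get("response", ""))  # the generated text

      Nothing about that needs a subscription or someone else’s datacenter, which is the whole point.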

    • jaykrown@lemmy.world · +4 / -4 · 7 days ago

      I doubt it; LLMs have already become significantly more efficient and powerful in just the last couple of months.

      In a year or two we will be able to run something like Gemini 2.5 Pro on a gaming PC, which right now requires a server farm.

  • mojofrododojo@lemmy.world · +100 / -1 · 7 days ago

    when this bubble pops it’s gonna be horrific.

    google, meta, ms, and so many more have leveraged themselves into huge investments in datacenters. nvidia is propping up whole segments of the fucking economy.

    https://www.wheresyoured.at/ai-is-a-money-trap/

    it’d be fun to watch if I could isolate myself from the chaos that will ensue, but we’re all gonna get fucked by the aibros, it’s only a question of which segment of the economy blows up first.

    • Thorry84@feddit.nl · +42 · 7 days ago (edited)

      There is another factor in this which often gets overlooked. A LOT of the money invested right now is for the Nvidia chips and products built around them. As many gamers are painfully aware, these chips devalue very quickly. With the progress of technology moving so fast, what was once a top-of-the-line unit gets outclassed by mid-tier hardware within a couple of years. After 5 years its usefulness is severely diminished, and after 10 years it is hardly worth the energy to run it.

      This means the window for return on investment is a lot shorter than usual in tech. For example, when creating a software service, there would be an upfront investment to buy the startup that created the software, then some scaling investment in infrastructure and such. But after that it turns into a steady state where the input of money is a lot lower than the revenue from the customer base that was grown. That allows the investment to keep paying off for many years after the initial investment and growth phase.

      With this AI shit it works a bit differently. If you want to train and run the latest models in order to remain competitive in the market, you need to continually buy the latest hardware from Nvidia. As soon as you start running on older hardware, your product gets left behind, and with all the competition out there users would be lost very quickly. It’s very hard to see how the trillions of dollars invested now are ever going to be recovered within the span of five years, especially at a time when so many companies are dumping their products for very low prices and sometimes even for free.
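
      To make that ROI-window point concrete, here’s a back-of-envelope sketch; every number in it is made up purely for illustration:

        # Capital sunk into accelerators has to be earned back before the
        # hardware is outclassed. All figures below are hypothetical.
        capex = 10_000_000_000           # spend on GPUs and buildout (USD)
        useful_life_years = 5            # assumed window before the chips are obsolete
        annual_revenue = 1_500_000_000   # revenue attributable to that hardware (USD/yr)
        annual_opex = 600_000_000        # power, cooling, staff (USD/yr)

        recovered = (annual_revenue - annual_opex) * useful_life_years
        print(f"Recovered over {useful_life_years} years: ${recovered:,}")
        print(f"Shortfall vs. capex: ${capex - recovered:,}")
        # With these illustrative numbers the hardware never pays for itself
        # before it has to be replaced, which is exactly the problem above.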

      This bubble has to burst and it is going to be bad. For the people who were around when the dotcom bubble burst, this is going to be much worse than that ever was.

      • mojofrododojo@lemmy.world · +28 · 7 days ago (edited)

        yeah, datacenters never really aged well, and making them gpu-dependent is going to mean they age like hot piss. and since they’re ai-dedicated gpus, they can’t even resell them lol.

        all this investment, for what? so some chud can have a picture of taylor swift with 4 tits?

        fucking idiots

        • howrar@lemmy.ca · +4 · 7 days ago

          I don’t see why they can’t be resold. As long as there’s a market for new AI hardware, there will continue to be a market for the older stuff. You don’t need the latest and greatest for development purposes, or things that scale horizontally.

          • mojofrododojo@lemmy.world · +9 · 7 days ago

            I didn’t say they couldn’t be resold, they simply won’t have as wide a potential user market as a generic GPU would. But think about it for a sec: you’ve got thousands of AI-dedicated GPUs going stale whenever a datacenter gets overhauled or goes bust.

            that’s gonna put a lot more product on the market that other datacenters aren’t going to touch - no one puts used hardware in their racks - so who’s gonna gobble up all this stuff?

            not the gamers. who else needs this kind of stuff?

            • jkercher@programming.dev · +7 · 7 days ago

              Also depends how hard the AI runs them. A good chunk of the graphics cards that were used as miners came out on life support if not completely toasted. Games generally don’t run the piss out of them like that 24/7, and many games are still CPU bound.

            • Passerby6497@lemmy.world · +5 · 6 days ago (edited)

              no one important puts used hardware in their racks

              FTFY. Just about every MSP I’ve worked for has cut corners and gone with 2nd-hand (or possibly grey market) hardware to save a buck, including the ones who colo in “real” data centers. I would not be surprised to find that we’re onboarding these kinds of cards to build a bespoke AI platform for our software customers here in a few years.

            • addie@feddit.uk · +4 · 6 days ago

              I’m not sure that they’re even going to be useful for gamers. Datacenter GPUs require a substantial external cooling solution to stop them from just melting. Believe NVidia’s new stuff is liquid-only, so even if you’ve got an HVAC next to your l33t gaming PC, that won’t be sufficient.

              • mojofrododojo@lemmy.world · +8 · 6 days ago

                not just those constraints, good luck getting a fucking video signal out of 'em when they literally don’t have hdmi/dp or any other connectors.

      • panda_abyss@lemmy.ca · +11 · 7 days ago

        They’ll write this off as a loss and offset their corporate taxes

        Also, China is a great example that you do not need all the latest hardware, though it does help.

    • skisnow@lemmy.ca · +25 · 7 days ago

      A lot of startups whose entire business model relies on OpenAI’s small-model API calls costing under $1/Mtok are going to go bust when OpenAI finally runs out of money and ramps the cost up tenfold.
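
      As a toy illustration of how thin that margin is (all numbers here are hypothetical):

        # A startup reselling model access: what happens if the upstream
        # API price jumps tenfold. Every figure here is made up.
        tokens_per_user_per_month = 2_000_000   # 2M tokens of usage per subscriber
        subscription_price = 20.00              # what the startup charges (USD/month)

        def monthly_margin(api_price_per_mtok: float) -> float:
            api_cost = tokens_per_user_per_month / 1_000_000 * api_price_per_mtok
            return subscription_price - api_cost

        print(monthly_margin(1.00))    # $18.00 per user at today's assumed price
        print(monthly_margin(10.00))   # $0.00 per user after a 10x price hike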

        • pfizer_dose@lemmy.world · +14 · 6 days ago

          Yep it’s blitzscaling. Run it at a loss until it’s a necessity, then charge whatever the hell you want. They’re blitzscaling our right to intellectual property and our right to work.

      • Tollana1234567@lemmy.today · +3 · 6 days ago

        I’ve seen a ton of billboards for AI startups on the West Coast; I assume that for every new one that appears on these billboards, the old ones go under.

      • Mika@sopuli.xyz · +2 · 6 days ago

        Wouldn’t it just be cheaper to self-host something for the whole company then? Open-source models are out there and they are very much competitive with proprietary solutions.

        • skisnow@lemmy.ca · +4 · 6 days ago

          If you want OpenAI level response times you might be surprised how expensive self-hosting gets.
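
          Rough sketch of why, with purely hypothetical numbers:

            # Cost per million tokens when you rent your own accelerator.
            # Both inputs are assumptions; plug in your own quotes.
            gpu_hourly_cost = 2.50     # assumed rental rate for one high-end GPU (USD/hr)
            tokens_per_second = 50     # assumed sustained throughput for a large open model

            tokens_per_hour = tokens_per_second * 3600
            cost_per_mtok = gpu_hourly_cost / (tokens_per_hour / 1_000_000)
            print(f"~${cost_per_mtok:.2f} per million tokens")   # ~$13.89 here

            # And that assumes the GPU is busy 100% of the time; idle hours and
            # latency headroom for snappy response times only push the number up.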

    • vector42@programming.dev · +7 · 7 days ago

      Came here to see if someone had mentioned Ed Zitron’s blog. His last two pieces on the AI bubble are fantastic reads.

      • mojofrododojo@lemmy.world · +7 · 7 days ago

        yeah, secondary knock-on effects - once nvidia realizes it’s not going to actually sell 5 gpus per human being, the datacenters for them evaporate, then the power production to feed those datacenters becomes pointless…

        an effective administration would mandate all renewable energy for this purpose, so when it implodes they could at least derive a benefit from the expanded production… but no, trump will have them build coal plants for it all. or like grok, methane powered generators fml

          • mojofrododojo@lemmy.world · +5 · 6 days ago (edited)

            bringing old reactors back online may end up an overall positive (say, if the ai bubble pops soon but the reactors still come online and displace fossil sources), but I’m still dubious about SMRs. it just seems like more chances for radionuclides to get smeared everywhere if they become ubiquitous.

    • BetaBlake@lemmy.world · +6 · 7 days ago

      Hopefully sooner rather than later, and maybe Elon can stop poisoning a neighborhood in Memphis with Grok

      • mojofrododojo@lemmy.world · +6 · 7 days ago

        yeah, that’s one of the more egregious examples: basically a methane factory that eats prodigious amounts of water and power, all in the process of giving us MECHAHITLER.

        what’s not to love?

  • nutsack@lemmy.dbzer0.com · +11 · 5 days ago

    the day that these guys need to turn a profit will be the day that a lot of people lose access to this sort of thing

  • buddascrayon@lemmy.world · +31 / -1 · 6 days ago

    Yes, this is part of the business model. The goal is to get everyone addicted to their service, then jack the price up to profitable margins. It’s the same model Netflix and Amazon used. Both services lost money for over 10 years before becoming profitable.

      • buddascrayon@lemmy.world · +7 · 6 days ago

        It is not venture capitalism. Though it is fueled by venture capitalism. I am describing the type of car and you are calling it gasoline. They’re most distinctly not the same thing.

        However, it should be noted that they’re both part of the same corrosion of our society, just as automobiles that run on gasoline are a corrosion of our atmosphere.

      • Saledovil@sh.itjust.works · +6 · 6 days ago

        Venture capitalism is when you give somebody money to start a business in hopes that they make it big, giving you really valuable equity for relatively little money. What you’re thinking of is blitzscaling. Scale up in an unsustainable way in order to gain market dominance, so that you can use that to become profitable.

  • TheObviousSolution@lemmy.ca · +20 · 6 days ago

    In other words, they want to hook users and companies, make them dependent, and then raise prices severely while finding ways to process and incorporate all of the data they’ve gathered, in ways that will probably involve automating the jobs of the users themselves.

  • Chaotic Entropy@feddit.uk · +18 · 6 days ago

    Basically, the only reason some of these vaguely functional AI tools actually work okay is because they haven’t been ruined with inevitable monetisation yet.

    • tempest@lemmy.ca · +8 · 5 days ago (edited)

      Already the cost is quite high. A prolific dev can easily burn $100 a day in tokens, and they have not even started to enshittify.
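
      For a sense of how fast that adds up (token prices and usage below are assumptions, not any specific vendor’s rates):

        # How a heavy agentic-coding day can hit ~$100 in API spend.
        input_price_per_mtok = 3.00     # assumed price for prompt tokens (USD)
        output_price_per_mtok = 15.00   # assumed price for completion tokens (USD)
        daily_input_tokens = 20_000_000    # large contexts re-sent on every iteration
        daily_output_tokens = 2_500_000    # generated code, diffs, explanations

        cost = (daily_input_tokens / 1e6) * input_price_per_mtok \
             + (daily_output_tokens / 1e6) * output_price_per_mtok
        print(f"${cost:.2f} per day")   # $97.50 with these assumptions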

      Some of the cost to run these models will come down a bit if Nvidia gets some actual competition, which I’m sure will happen in the medium to long term, because the hyperscalers definitely don’t like paying Nvidia’s AI ransom and the Chinese don’t want to be beholden to a company the US can influence.

      We will see which happens first.

  • OctopusNemeses@lemmy.world · +28 / -1 · 6 days ago (edited)

    Isn’t this just the tech industry? Run at a loss. Eat VC money. Wait. Wait.

    Somehow you become normalized and suddenly important. Next thing you know you’re raking in profit.

    Like the guy that has no friends who nobody really likes. He won’t go away. He just sticks around. Nobody ever told him to fuck off. So he’s just part of the group.

      • rozodru@lemmy.world · +6 / -1 · 6 days ago

        and then you go on linkedin and all the middle manager tech bros will hail it as the second coming.

    • hark@lemmy.world · +1 · 5 days ago

      Yep, and they were helped a lot after the 2008 financial crisis when interest rates were dropped super low and loans were cheap. That’s a major reason why the market has been screaming for the fed to cut the interest rate as much as possible.

  • elgordino@fedia.io · +32 · 7 days ago

    This is the thing I don’t understand about businesses like Cursor. They take two other companies’ products (Claude and VS Code), smash them together, and sell the result at a loss. How is that much of a business when basically what you’ve got is something that could have been a VS Code plugin?

    • Balder@lemmy.world · +7 · 7 days ago

      Yeah, this has been reported in multiple analyses over time. Until something in the hardware space changes, it’s gonna be an unprofitable business.

        • bridgeenjoyer@sh.itjust.works · +6 / -1 · 7 days ago

          I have an idea guys what about like a big cloud we can put all our data in ?? Guys CLOUD. ITS SO GENIUS. CLOUD.

          This ai shit is the exact same thing (also it isn’t ai. And I hate that we keep calling it that. )

          • takeda@lemmy.dbzer0.com · +7 / -1 · 7 days ago

            It is a chatbot. It is good compared to the chatbots we had in the past, but still a chatbot.

            The reason it can be used for programming is that a programming language is still a language.

            They only seem to forget that programming languages were already invented to be a bridge between humans and machine language, and being able to do software engineering is much more than just knowing programming languages.

            This makes me wonder: does anyone know how good those tools are at creating assembly code? I don’t program in assembly, but I know the “language” itself is very simple; you just have to be constantly aware of the state of the system.

            • HakFoo@lemmy.sdf.org · +4 · 7 days ago

              I’d suspect the low “density” of context makes it prone to hallucinations. You need to load in 3000 lines to express what Python does in 3, so there are a lot of chances to guess the next token wrong.

              • Balder@lemmy.world · +2 · 6 days ago

                I was gonna say that: probably the higher the abstraction level, the better it is for LLMs to reason about the code, because once learned it’s fewer tokens.

  • yarr@feddit.nl · +16 · 6 days ago

    So much of the AI stuff we see today is boards reacting and worrying about being “left behind” on AI. In many cases, the goal is not to deliver value. The goal is to be able to attach a little sticker that says “AI” to their products to excite the shareholders.

    Unfortunately in this case, some of the largest companies in the world haven’t been able to figure out how to run AI services at a profit.

    This could change any day if some more efficient hardware arrives, but until then, most of the software world is just crossing their fingers it becomes profitable one day while they light dollar bills on fire in their datacenters.

    If this isn’t “bubbleish” behavior I don’t know what is.

    • null_dot@lemmy.dbzer0.com · +6 · 6 days ago

      I was in a local bike store looking at red tail lights yesterday.

      One brand, Lezyne, had several versions. There was an “AI Alert” one. I looked it up, and it just has a sensor that detects when you brake and switches to a different flashing mode.

      That’s barely even “smart”, let alone “AI”.

      The stupid thing is, because of this dumb claim they needed to confirm that it doesn’t collect and transmit any data about your riding habits. It’s a light with no connectivity other than a charging port.

      The dumbfuckery is astonishing.

  • tal@lemmy.today · +23 / -4 · 7 days ago

    I’m skeptical of AI coding as it exists today, and while I’m bullish on the long-term prospects for AI writing software, I’m very dubious that simply using LLMs is going to be the answer.

    However.

    Startups typically do lose money. They’ll burn money as they acquire a userbase — their growth phase — and transition to profitability later. I don’t think “startups in area X tend to be losing money” is terribly surprising.

    • rozodru@lemmy.world · +12 · 6 days ago

      you’re VERY justified in feeling skeptical. I’m seeing it first-hand; you’re correct.

      I’m a consultant/freelancer and I’m booked for the rest of the year and well into the new year with jobs that pretty much consist of me reviewing and cleaning up AI slop.

      Most of my clients are startups and small companies that went all in on AI and vibe coding. Now they’re discovering that their attempts to save a few bucks by leveraging AI, cutting devs, etc. are costing them more than they envisioned saving. The stuff they’ve built with AI doesn’t scale, is full of exploits, and breaks quickly. With the recent Tea App thing, many of my clients are now in a panic because they essentially did the exact same thing. They don’t want their startup to be next in the news because some rando wandered into their house through the front door AI left open.

      The tech debt is massive. It’s costing many of these places more to fix their vibe coders’ and AI’s mistakes than it would have originally cost to just use a solid dev team. Make no mistake, I’m charging them a good amount also.

      All of it could have been avoided, though. They could have kept using their LLMs if they had just kept a leash on them and dismissed the concept of vibe coding. A good chunk of it could have been avoided if the person feeding the prompts had simply REVIEWED the code before hitting enter. I’m not kidding: IF they had just LOOKED at what was being spat out, things would be different. None of them did. They just trusted the AI to be smarter, because they were led to believe it was.

    • Dogiedog64@lemmy.world · +7 · 6 days ago

      Ok, but it isn’t just startups burning money here like there’s no tomorrow - it’s also major industry leaders (Microsoft, Facebook, Apple, Google, Nvidia, etc.) dumping hundreds of billions into infrastructure and development of a tech that has, so far, shown 0 positive returns for anyone and everyone. Everyone involved is pouring in money like it’s going out of style, largely because they see this as a potential pathway to infinite profits down the line, just as long as THEY are the ones to get there first; consequences be damned. WHEN this bubble pops, not IF, it’ll be messy. Extremely messy.

  • Bogasse@lemmy.ml · +4 · 5 days ago

    I was gonna make a sarcastic comment about how surprised I was that a $5 subscription is not enough for something so heavy that it requires building new nuclear plants.

    But holy shit, ChatGPT Plus is 23€/month.