• Prox@lemmy.world · 7 days ago

    Isn’t this true of like everything AI right now?

    We’re in the “grow a locked-in user base” part of their rollout. We’ll hit the “make money” part in a year or two, and then the enshittification machine will kick into high gear.

    • Ghostalmedia@lemmy.world · 7 days ago

      Yeah, it’s basically like early days of cable, Uber, Instacart, streaming, etc. They have a lot of capital and are running at a loss to capture the market. Once companies have secured a customer base, they start jacking up the prices.

        • Ghostalmedia@lemmy.world · 6 days ago

          There is a lot of top-down shit, but there are definitely a bunch of non-C-suite enterprise customers out there. A lot of product managers are curious about this stuff.

        • zerozaku@lemmy.world · 6 days ago

          There are billions of free users available. All they need to do is strip off a few excellent features of the free model and hide them behind a paywall, annnnd voila, those free users have now become paying customers!

    • woelkchen@lemmy.world · 7 days ago

      > We’re in the “grow a locked-in user base” part of their rollout.

      An attempt at that. It will be partially successful, but with AI accelerators coming to more and more consumer hardware, the hurdles to self-hosting get lower and lower.

      I have no clue how to set up an LLM server, but installing https://github.com/Acly/krita-ai-tools is easily done with a few mouse clicks. The Krita plugin handles all the background tasks.

    • jaykrown@lemmy.world · 7 days ago

      I doubt it; LLMs have already become significantly more efficient and more capable in just the last couple of months.

      In a year or two we will be able to run something like Gemini 2.5 Pro, which right now requires a server farm, on a gaming PC.
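      Whether that pans out comes down mostly to memory. A back-of-envelope sketch (my own illustrative numbers, not from the thread): weight memory is roughly parameter count times bytes per parameter, which is why quantization decides what fits on a consumer GPU.

      ```python
      # Rough VRAM needed just to hold model weights.
      # Ignores KV cache and activations, so real usage is higher.
      def weight_vram_gib(params_billion: float, bits_per_param: int) -> float:
          bytes_total = params_billion * 1e9 * bits_per_param / 8
          return bytes_total / 2**30

      # Example: a hypothetical 70B-parameter model,
      # full 16-bit weights vs. 4-bit quantized.
      print(round(weight_vram_gib(70, 16), 1))  # ~130.4 GiB: server territory
      print(round(weight_vram_gib(70, 4), 1))   # ~32.6 GiB: within reach of high-end consumer cards
      ```

      The gap between those two numbers is the whole self-hosting story: quantization plus cheaper accelerators is what moves a model from "server farm" to "gaming PC."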