• kbal@fedia.io

    Recording and analyzing all the real-time video and audio feeds of their surroundings that everyone is required to provide while using the Internet, to ensure that no children are present when they use social media.

  • ElectricFire@sh.itjust.works

    I think hardware-as-a-service will be their next thing: raise the cost of parts so people buy a cheap subscription, then increase the subscription price year by year.

  • jordanlund@lemmy.world

    The last boom/bust cycle resulted in a lot of high tech gear getting sold off at bargain basement prices.

    Good for new businesses but bad for companies like Cisco trying to sell new gear in a market flooded with cheap used gear.

  • SoftestSapphic@lemmy.world

    Lol, when the AI bubble pops, the hardware will most likely be destroyed to keep hardware prices artificially high.

    That’s at least part of why China ending crypto mining didn’t drastically reduce the price of graphics cards.

  • Perspectivist@feddit.uk

    ChatGPT alone has 800 million weekly users, the vast majority of whom are normal people, not companies. The demand is there, even if it hasn’t increased company profit margins the way people expected. I don’t see this computing infrastructure needing to run idle anytime soon.

    • Varyk@sh.itjust.worksOP

      ChatGPT is constantly losing money; surface-level public interest won’t matter much when the capital runs out and they’re still accruing significant debt with revenue nowhere near covering costs.

      • FaceDeer@fedia.io

        A major problem faced by first-mover companies like OpenAI is that they spend an enormous amount of money on basic research and initial marketing and hardware purchases to set up in the first place. Those expenses become debts and have to be paid off by the business later. If they were to go bankrupt and sell off ChatGPT to some other company for pennies on the dollar that new owner would be in a much better position to be profitable.

        There is clearly an enormous demand for AI services, despite all the “nobody wants this” griping you may hear in social media bubbles. That demand’s not going to disappear, and the AIs themselves won’t disappear. It’s just a matter of finding the right price to balance things out.

        • Varyk@sh.itjust.worksOP

          Nope, you’ll certainly need a source to back that speculation up.

          Half a billion people are “using” AI and the total LLM market cap is a few billion. On average, users may be willing to pay up to 50 cents a month for inaccurate word association.

          That’s not even a drop in the bucket these companies need to fill, given what they’re spending just on advertising, not to mention infrastructure, utility, and upgrade costs.

          People are statistically not willing to pay sustainably for LLMs, even under the rosy predictions of a 20x LLM market cap within a decade.

          Devil’s advocate: increased AI cash flow could occur if people don’t realize their AI “search results” are paid advertisements, and considering longstanding obliviousness to directed advertising and the recent abolition of US consumer rights… it could happen.
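          As a back-of-the-envelope check, treating the figures above (half a billion users, a few-billion-dollar market) as the commenter’s rough estimates rather than verified data, the 50-cents-a-month figure falls out like this:

```python
# Rough sanity check of the comment's figures (all numbers are the
# commenter's estimates, not verified data).
users = 500_000_000     # "half a billion people" using AI
market = 3_000_000_000  # "a few billion" total LLM market, in USD

per_user_per_year = market / users         # $6.00 per user per year
per_user_per_month = per_user_per_year / 12

print(f"${per_user_per_month:.2f} per user per month")  # $0.50 per user per month
```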

            • Varyk@sh.itjust.worksOP

              Yes; the included source and explanatory paragraph above in the same comment you are referencing.

              Would you care to provide any evidence for your speculation that people are willing to pay enough for AI to sustain its costs?

              • Perspectivist@feddit.uk

                Would you care to provide any evidence for your speculation that people are willing to pay enough for AI to sustain its costs?

                ChatGPT alone has 800 million weekly users, and their total revenue in 2025 was 13 billion, with 70% coming from normal users. That’s a drop in the bucket, though, considering they’ve committed to investing a trillion dollars into new computing capacity over the next 10 years.
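                Taking the figures in this comment at face value (800 million weekly users, $13B revenue, 70% consumer share, $1T over 10 years; these are the comment’s claims, not verified numbers), the gap sketches out like this:

```python
# Sketch of the revenue-vs-commitment gap using the figures claimed
# above (the comment's numbers, not verified data).
weekly_users = 800_000_000
revenue = 13_000_000_000       # claimed 2025 revenue, USD
consumer_share = 0.70
committed = 1_000_000_000_000  # claimed 10-year compute commitment, USD
years = 10

consumer_revenue = revenue * consumer_share           # ~$9.1B/year
per_user_month = consumer_revenue / weekly_users / 12  # ~$0.95/user/month
annual_commitment = committed / years                  # $100B/year

print(f"~${per_user_month:.2f}/user/month vs ${annual_commitment / 1e9:.0f}B/year committed")
```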

                • Varyk@sh.itjust.worksOP

                  This is why you should provide a source; your numbers and associated assumptions are incorrect:

                  ChatGPT has estimated revenue of 1.3 billion, not 13 billion, and neither figure is a remotely significant revenue stream relative to cost.

                  That’s the thrust of my opening paragraph, and then you appear to have taken up my drop-in-the-bucket analogy, so I guess we’re on the same page now.

    • ch00f@lemmy.world

      I think OP is talking about all of the future data centers that are allegedly being built, despite nobody even knowing where. Nvidia has agreed to pay OpenAI $10B per gigawatt of datacenter for 10 gigawatts of datacenter buildout over the next few years.

      Unlikely that will fully materialize, but that’s the current outlook.
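      The scale of that reported commitment ($10B per gigawatt, 10 gigawatts, per the comment above) totals up like this:

```python
# Total of the reported Nvidia/OpenAI commitment described above.
per_gigawatt = 10_000_000_000  # USD per gigawatt of datacenter
gigawatts = 10

total = per_gigawatt * gigawatts
print(f"${total / 1e9:.0f}B total")  # $100B total
```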

      • mrmacduggan@lemmy.ml

        They’re trying to build several right now near my home in Southeast Michigan. So now you know where.

      • CheeseNoodle@lemmy.world

        There’s a proposal for one in Scotland that would use as much electricity as the whole country combined (except during winter, when the country’s peak load is a whopping 1/3 higher than the proposed data centre’s draw).

    • Melobol@lemmy.ml

      The free plan of ChatGPT is more than enough for most people. And when they decide to start charging for it, probably 30% of free users will switch to a different (maybe even locally run) AI.

    • neidu3@sh.itjust.worksM

      With the direction Microsoft has taken Windows over the past decade or so, a GPU farm in the back room will be needed to run Windows 12. Maybe that’s why everyone needs a Microsoft account to use Windows these days - Microsoft is planning ahead.

    • SpacetimeMachine@lemmy.world

      That’s not really accurate at all. A GPU running at full load might wear out its fan, but if it’s kept at a consistent temperature, its lifespan isn’t shortened by much. The stress on a GPU generally comes either from being overvolted or from the thermal expansion and contraction caused by inconsistent temperatures.

    • Varyk@sh.itjust.worksOP

      This is kind of what I’m wondering about. Countless warehouses full of used, half-functional video cards.

  • moondoggie@lemmy.world

    Companies will never admit it. They’ll drive this shit right into the ground and keep digging.

  • brucethemoose@lemmy.world

    I hope some of it hits the used market, so tinkerers can play with them.

    But yeah, knowing them, they will probably just throw the hardware away :(

  • Rhoeri@lemmy.world

    Hilarious that you think they’re going to admit it. They’ll get bailed out and start over again with a brand new pitch.

  • IronBird@lemmy.world

    It’ll flood the second-hand market; anything that can’t be flipped past a certain point will be scrapped for recycling or just thrown away.