• Suavevillain@lemmy.world · 5 days ago

    AI has taken more things since its big push to be adopted in the public sector.

    Clean Air

    Water

    Fair electricity bills

    Ram

    GPUs

    SSDs

    Jobs

    Other people’s art and writing.

    There is no benefit to this stuff. It is just grifting.

  • Randelung@lemmy.world · 5 days ago

    This bubble is going to become the entire market, isn’t it? Until it becomes too big to fail because 80% of the workforce is tied up in it. Then it is allowed to pop, costing the western world everything, all going into the pockets of the super rich, and we get to start over.

      • Khrux@ttrpg.network · 5 days ago

        Compared to crypto and NFTs, there is at least something in this mix, not that I could identify it.

        I’ve become increasingly comfortable with LLM usage, to the point that my past self from last year would hate me. Compared to projects where I’d be deep in Google, Reddit, and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.

        I’m getting into home labs, and currently everything I have runs on ass-old laptops and phones, but I do daydream of the day when I can run an ethically and sustainably trained LLM myself that compares to the current GPT-5, because as much as I hate to say it, it’s really useful to my life to have a sometimes incorrect but overall knowledgeable voice that’s perpetually ready to support me.

        The irony is that I’ll never build a server that can run a local LLM, due to the price hikes caused by the technology in the first place.

        • raspberriesareyummy@lemmy.world · 5 days ago

          I’ve become increasingly comfortable with LLM usage, to the point that my past self from last year would hate me. Compared to projects where I’d be deep in Google, Reddit, and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.

          Please hate yourself, reflect on that, and walk back from contributing to destroying the environment by furthering widespread adoption of this shitty technology. The only reason you seem to get “useful answers” is because of search engine and website enshittification. What you are getting is still tons worse than good web research was 10 years ago.

          Basically you were taught to enjoy rancid butter because all restaurants around you had started tasting like shit first, then someone opened a rancid butter shop.

          • Khrux@ttrpg.network · 5 days ago

            I do agree entirely. If I could use the internet of 2015 I would, but I can’t do so in a practical way that isn’t much more tedious than asking an LLM.

            My options are the least rancid butter of the rancid butter restaurants or I churn my own. I’d love to churn my own and daydream of it, but I am busy, and can barely manage to die on every other hill I’ve chosen.

            • dil@lemmy.zip · 5 days ago

              Web search isn’t magically going back to how it was, and it’s not just search engines: it’s every mf trying to take advantage of SEO and push their content to the top. Search is going to get worse every year; AI did speed it up by making a bunch of AI images pop up whenever you search for an image.

            • raspberriesareyummy@lemmy.world · 4 days ago

              problem is that the widespread use of (and thereby provision of your data to) LLMs contributes to the rise of totalitarian regimes, wage-slavery and destroying our planet’s ecosystem. Not a single problem in any of our lives is important enough to justify this. And convenience because we are too lazy to think for ourselves, or to do some longer (more effort) web research, is definitely not a good excuse to be complicit in murder, torture and ecoterrorism.

    • Khrux@ttrpg.network · 5 days ago

      I heard a theory (that I don’t believe, but still) that Deepseek is only competitive to lock the USA into a false AI race.

    • Ensign_Crab@lemmy.world · 5 days ago

      Then it is allowed to pop, costing the western world everything, all going into the pockets of the super rich, and we get to start over.

      After the bailouts at the expense of the poor, of course.

    • humanspiral@lemmy.ca · 5 days ago

      it becomes too big to fail because 80% of the workforce is tied up in it

      In 2008, the banking sector and auto industry needed bailouts for the investor/financial class. Certainly, there was no need to lay off core banking employees if government support was the last resort to keep the doors open AND gain a controlling stake in future banking profitability in a hopefully sustainable fashion (low risk, in addition to low climate/global destruction). The auto bailout did have harsher terms than the banking bailout, and recessions definitely harm the sector, but the bailouts were definitely focused on the executives/shareholders who have access to political friendships that result in gifts, instead of truly needed lifelines or a wider redistribution of benefits from sustainable business.

      The point is that the workforce is a “talking point” with no actual relevance to bailouts/too big to fail. That entire stock market wealth is concentrated in the sector, and that we all have to give them the rest of our money (and militarism-backed surveillance freedom) or “China will win” at the only sector we pretend to have a competitive chance in, is why our establishment needs another “too big to fail” moment. We’ve started QE ahead of the crash this time.

      The workforce is relatively small in the AI sector: big construction, but relatively low operations employment. It displaces other hiring too.

  • etchinghillside@reddthat.com · 5 days ago

    My mind forgot that M.2 is probably more prevalent these days and that they’re not just shutting down for no reason.

    • Hubi@feddit.org · 5 days ago

      Is it though? Pretty much every single current-gen mainboard still comes with a number of SATA ports.

      • RamRabbit@lemmy.world · 5 days ago

        Everyone is going to buy M.2 SSDs first, and only buy SATA if they don’t have enough M.2 slots. I really doubt SATA SSDs are selling well.

        With that said, I don’t see SATA going anywhere. Its (comparatively low) bandwidth means you can throw a few ports on your board and not sacrifice much. For some quick math: an M.2 port backed by PCIe 4.0 x4 has about 7.8 GB/s going to it, while SATA III (6 Gb/s) has only 0.75 GB/s.
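        The quick math above can be sketched out; the only assumptions are the published line rates (16 GT/s per PCIe 4.0 lane with 128b/130b encoding, 6 Gb/s for SATA III with 8b/10b):

        ```python
        # Usable bandwidth after line encoding, in bytes per second
        pcie4_x4 = 16e9 * (128 / 130) / 8 * 4   # PCIe 4.0, four lanes -> ~7.88 GB/s
        sata3 = 6e9 * (8 / 10) / 8              # SATA III -> 0.60 GB/s usable (0.75 GB/s raw)

        print(f"PCIe 4.0 x4: {pcie4_x4 / 1e9:.2f} GB/s")
        print(f"SATA III:    {sata3 / 1e9:.2f} GB/s")
        print(f"ratio: ~{pcie4_x4 / sata3:.0f}x")
        ```

        Either way the conclusion holds: one M.2 slot carries more than an order of magnitude more bandwidth than one SATA port.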

        • tburkhol@lemmy.world · 5 days ago

          SATA is really convenient for larger storage, though. I keep my OS on NVMe drives, but I’ve got a couple of SATA drives and a hot-swap bay for games, media, etc.

          • clif@lemmy.world · 5 days ago

            I’m still running SATA spinny disks for my big-ish data. I can’t afford a 16TB SSD…

            I know that’s off topic, but HDDs are still a thing too.

            • RamRabbit@lemmy.world · 5 days ago

              I’m very excited for the day I can replace my spinners with SSDs. That day is coming, but it is not today.

            • Valmond@lemmy.world · 5 days ago

              They have become expensive too, IMO: a 3–4 TB drive costs more today than a couple of years ago, and the used market here in Europe is insane.

        • AlfredoJohn@sh.itjust.works · 4 days ago

          And how many motherboards have the same number of M.2 slots as they do SATA ports? And what generation? So now I need new RAM, which is inflated to high hell, plus a new motherboard and CPU, just to increase storage on my gaming rig? It’s not like games are small these days; I like to keep most games I have installed, and that takes multiple terabytes of storage, which is cheaper to do via SATA SSDs. This is clearly anti-consumer and done purely to push people to newer systems, in the hope that people stay with Windows instead of swapping to Linux. It’s being done to keep the AI bubble going…

        • Lfrith@lemmy.ca · 5 days ago

          I have one M.2 and multiple SATA SSDs, since on my motherboard occupying the second M.2 slot would drop PCIe lanes from my GPU due to shared bandwidth.

          Do newer boards not have that problem?

          • Spaz@lemmy.world · 5 days ago

            Higher-spec boards don’t have this issue; it’s typically a problem with low- and mid-range boards due to cost savings.

            • AlfredoJohn@sh.itjust.works · 4 days ago

              Which just also shows why this is a very anti-consumer move. It’s trying to artificially push people to buy new hardware, because there haven’t been significant enough changes to really warrant it. That means more people who might have swapped off Windows to keep their existing hardware might end up having to upgrade, then stick with their familiar Windows platform, so that the AI bubble can continue. It’s completely fucked up.

      • A_Random_Idiot@lemmy.world · 5 days ago

        Yeah, but I think SATA is quickly being relegated to large mechanical storage drives, for things that don’t require performance, like bulk storage and whatnot… because SATA is not getting any faster. I doubt anyone’s gonna come out with a SATA IV standard at this point, when PCIe over M.2 is easier, simpler, faster, and (outside of silicon-shortage stupidities) getting cheaper and more affordable.

      • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 5 days ago

        Comes with them, but only for legacy media. Outside of my NAS, I haven’t bought a new SATA drive in probably 10 years, and I haven’t touched my onboard SATA ports in 5.

        The fact that they’re still there impresses me at this point. But their numbers are slowly dwindling. SATA is usually the first thing that gets dropped when you need more PCIe lanes, and even then most boards only have 4 ports at this point. They’re switching back to those god-awful vertical ports, which tells you all you need to know about their priorities.

      • SinningStromgald@lemmy.world · 5 days ago

        Most people at least put their OS on M.2. I guess if you haven’t upgraded since M.2 became common on motherboards, you might not.

        Edit: The internet says M.2 became common on motherboards around 2016–2017.

  • mlg@lemmy.world · 4 days ago

    AFAIK this has already been a problem: you can find Samsung M.2 SSDs cheaper than Samsung SATA SSDs at the same capacity, because their cloud customers have all flown past classic SATA/SAS to NVMe U.2 and U.3, which are much more similar to M.2 since they also speak NVMe.

    I was planning on adding a big SSD array to my server, which has a bunch of external 2.5" SAS slots, but it ended up being cheaper and faster to buy a 4-slot M.2 PCIe card and 4 M.2 drives instead.

    Putting it on an x16 PCIe slot gives me 4 lanes per drive with bifurcation, which gets me the advertised maximum possible speed on PCIe 4.
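    The bifurcation math works out like this (a sketch, assuming x4/x4/x4/x4 bifurcation and PCIe 4.0’s 16 GT/s per lane with 128b/130b encoding):

    ```python
    # An x16 slot split across four M.2 drives via bifurcation
    slot_lanes, drives = 16, 4
    lanes_per_drive = slot_lanes // drives    # x4 link per drive
    per_lane = 16e9 * (128 / 130) / 8         # ~1.97 GB/s usable per PCIe 4.0 lane
    per_drive = lanes_per_drive * per_lane    # ~7.9 GB/s, the full advertised x4 speed
    print(f"x{lanes_per_drive} per drive, ~{per_drive / 1e9:.1f} GB/s each")
    ```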

    Whether or not the RAM surge will affect chip production capacity is the real issue. It seems all 3 OEMs could effectively reduce capacity for all other components after sinking billions of dollars into HBM. It wouldn’t just be SSDs; anything that relies on the same supply chain could be heavily affected.

    • iglou@programming.dev · 4 days ago

      Exactly this. Micron ended their consumer RAM line. Samsung here is just stopping production of something that is arguably outdated and has a perfectly fine, already more available, and usually cheaper (or equivalent) modern replacement.

  • lechekaflan@lemmy.world · 5 days ago

    Yet another chapter in the fucking AI craze started up by them fucking techbros.

    Also, someone forgot that in some places in the world, people have to use older PCs with SATA drives, and that, until their discontinuation announcements, Crucial and Samsung SATA drives were several tiers better than, say, those cheapo Ramsta drives.

    • Psythik@lemmy.world · 5 days ago

      Discontinuing outdated tech has nothing to do with AI. SATA SSDs need to be retired; NVMe is superior and widely available.

  • Kyden Fumofly@lemmy.world · 5 days ago

    The leak comes after another report detailed that Samsung has raised DDR5 memory prices by up to 60%.

    MF… And why are they winding down SSD production this time? Last time was two years ago, because SSD prices were low and they wanted to raise them (which happened).

        • GreenKnight23@lemmy.world · 4 days ago

          We all know that as soon as big bad chip daddy comes back with a big discount, everyone not in this thread (and even some who are) will spread their cheeks and beg for more.

          Humans are dumb, greedy little assholes with zero willpower. That’s why it’s so easy to manipulate us.

  • EndlessNightmare@reddthat.com · 5 days ago

    Cries in PC gamer

    I’m glad I already have a good setup and shouldn’t be buying anything for a good while, but damn it. First the GPU, then RAM, now SSDs.

    • WorldsDumbestMan@lemmy.today · 5 days ago

      I ordered an S10 tab, paid my first rate, they finally try to order it, inform me it’s gone from the page, and try to get me to pay MORE for a weaker device.

      I refuse and ask for a refund, and that is how I got screwed, at the last moment, out of owning something I need, just before the crash.

      • Kazumara@discuss.tchncs.de · 5 days ago

        I ordered an S10 tab, paid my first rate, they finally try to order it

        Who is “they” in this? Some sort of intermediary you were using?

  • dependencyinjection@discuss.tchncs.de · 5 days ago

    I built a PC a couple of years ago, when I really didn’t need one, and over-specced it just because. I’m very happy right now, as the prices are insane; I feel like I could sell the PC for more than it cost me, which is mental.

    • Randelung@lemmy.world · 5 days ago

      Don’t worry, you can use AI on anything that can access the internet! No need to ever have personal (let alone private) thoughts - I’m sorry, data - again.

      MS has been trying to get you to give up your personal computer for years. Do everything in the cloud, please! Even gaming, with Stadia! And now they’re getting their wish. All it took was ruining the entire global economy.

  • Logical@lemmy.world · 4 days ago

    Glad that I recently bought a bunch of storage so that I’ll be covered for a good amount of time.

  • brucethemoose@lemmy.world · 4 days ago

    Aside: WTF are they using SSDs for?

    LLM inference in the cloud is done basically entirely in VRAM. Rarely, stale K/V cache is kept in RAM, but new attention architectures should minimize that. Large-scale training, contrary to popular belief, is a pretty rare event that most data centers and businesses are incapable of.

    …So what do they do with so much flash storage!? Is it literally just FOMO server buying?

    • T156@lemmy.world · 4 days ago

      Storage. There aren’t enough hard drives, so datacentres are also buying up SSDs, since it’s needed to store training data.

      • brucethemoose@lemmy.world · 4 days ago

        since it’s needed to store training data.

        Again, I don’t buy this. The training data isn’t actually that big, nor is training done on such a huge scale so frequently.

        • finitebanjo@lemmy.world · 4 days ago

          As we approach the theoretical error-rate limit for LLMs, as argued in the 2020 research paper by OpenAI and corrected by the 2022 paper by DeepMind, the required training and power costs rise toward infinity.

          In addition to that, the companies might keep many different, nearly identical datasets to try to achieve different outcomes.

          Things like books and Wikipedia pages aren’t that bad: Wikipedia itself compressed is only 25 GB, so maybe a few hundred petabytes could store most of these items. But images and videos are also valid training data, and that’s much larger, and then there is readable code. On top of that, all user inputs have to be stored, to reference them again later if the chatbot offers that service.
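          The scaling-law point above can be illustrated with the loss curve fitted in the 2022 DeepMind (Chinchilla) paper; the constants below are that paper’s published fit, used here only as a sketch of why costs blow up near the floor:

          ```python
          def fitted_loss(params, tokens):
              # L(N, D) = E + A/N^alpha + B/D^beta, the DeepMind 2022 fit
              E, A, alpha, B, beta = 1.69, 406.4, 0.34, 410.7, 0.28
              return E + A / params**alpha + B / tokens**beta

          # Chinchilla itself: 70B parameters trained on 1.4T tokens
          print(fitted_loss(70e9, 1.4e12))
          # A model a million times larger, on a thousand times more data,
          # still cannot go below the irreducible term E = 1.69
          print(fitted_loss(70e15, 1.4e15))
          ```

          Each constant-sized step toward that floor costs a multiplicative increase in parameters and tokens, which is the “costs rise to infinity” claim in practical terms.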