  • melfie@lemy.lol · 31 minutes ago

    2026 is going to suck for hardware, but 2027 might be better if this nonsense blows over. For one thing, AMD’s RDNA 5 was announced for 2027 and is supposed to be more comparable to Nvidia for compute workloads, including dedicated ray-tracing cores. AMD’s recent SoCs have been pretty impressive, so I’m looking forward to AMD SoCs that are competitive with Nvidia discrete GPUs beyond just rasterization, but without artificially constrained VRAM and with lower power requirements.

  • rogsson@piefed.social · 5 hours ago

    When the yet-to-be-built data centers never materialize because the AI slop bubble pops, we will be able to build houses out of RAM sticks for the poor.

  • 1984@lemmy.today · 6 hours ago

    I’m on Linux and it requires just as much memory as it did in 2018. No problem here.

    • pHr34kY@lemmy.world · 6 hours ago

      I upgraded mine from 16GB to 32GB two years ago because RAM was cheap. I didn’t really need it, and have probably never hit 16GB usage anyway.

      Meanwhile my work Windows laptop uses 16GB at idle after first login.

      Windows has always been wasteful with computing resources, and everyone just pays for it.

      • Bilb!@lemmy.ml · 3 hours ago

        Requiring less RAM is good, but conceptually, it’s Linux that is “wasting” the RAM by never using it. It’s there, and it’s reusable, fill it up! Now, does Windows make good use of it? No idea. Wouldn’t bet on it, but I could be surprised.

      • Waraugh@lemmy.dbzer0.com · 3 hours ago

        Storing data in RAM isn’t wasteful, though. I have a lot of criticisms of Windows, but memory management isn’t one of them. I’d rather have as much predicted content staged in RAM as possible, so long as it’s readily dumped when I go to do something else, which has been my experience. It’s not like I earn interest on unused RAM.

        For reference, I have EndeavourOS, RHEL, Fedora, and Windows computers under my desk connected to a dual-monitor KVM right now, so it isn’t like I don’t regularly use and prefer Linux. I mostly access my Windows machine via RustDesk for work-related stuff I don’t feel like having to dick with on Linux, like the purchase order system and the timecard system. I just don’t get this critique.

        • pHr34kY@lemmy.world · 34 minutes ago

          Linux doesn’t waste RAM. All unused RAM becomes a disk read cache, but remains available on demand.
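
          A quick way to see that in practice is to compare MemFree with MemAvailable. Here’s a minimal sketch (assuming a Linux box, since it reads /proc/meminfo; the field names are the kernel’s own):

          ```python
          # Compare "free" vs "available" memory on Linux. MemAvailable includes
          # page cache the kernel can reclaim on demand, which is why a machine
          # with a huge "Cached" figure isn't actually out of memory.

          def read_meminfo(path="/proc/meminfo"):
              """Parse /proc/meminfo into a dict of values in kB."""
              info = {}
              with open(path) as f:
                  for line in f:
                      key, rest = line.split(":", 1)
                      info[key] = int(rest.split()[0])  # first field is the kB value
              return info

          m = read_meminfo()
          gib = 1024 * 1024  # kB per GiB
          for key in ("MemTotal", "MemFree", "Cached", "MemAvailable"):
              print(f"{key:>13}: {m[key] / gib:5.1f} GiB")
          ```

          On a long-running system you’ll typically see a small MemFree next to a much larger Cached and MemAvailable: the “missing” RAM is just the read cache, and the kernel hands it back the moment a program asks.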

      • NotMyOldRedditName@lemmy.world · 5 hours ago

        I wish I had a 32GB RAM laptop.

        I can have 3 development IDEs open at once, and with all the browser tabs open and a few other programs here and there, it’s stretching the limits on my Mac.

        • pHr34kY@lemmy.world · 5 hours ago

          I have 32GB on my Windows laptop and it can’t do three at once.

          Running the backend (Java) and the frontend (React Native) in IntelliJ uses 29GB of RAM, so I have to run Android on real hardware over ADB and USB. Running an Android emulator pushes it over the edge.

          Also: Laptops are shit. On Windows, the thermals are so bad that the cores are throttled straight after boot because the cooling is rubbish. It almost never hits full speed. It can’t survive more than 40 minutes on a full battery. It might as well be a NUC.

          • GenosseFlosse@feddit.org · 2 hours ago

            Also: Laptops are shit. On Windows, the thermals are so bad that the cores are throttled straight after boot because the cooling is rubbish. It almost never hits full speed. It can’t survive more than 40 minutes on a full battery.

            That’s the reason I have not bought a new laptop in years. Everything must be as thin as possible because Apple did it. Fuck that. I want my laptop as thick as a brick, with enough cooling for the CPU, the GPU, and a 6L V8 engine, and a battery that will outlast the sun!

            • RisingSwell@lemmy.dbzer0.com · 2 hours ago

              Does Clevo still make the fat laptops? My last one was theirs, and it was almost as thick as my forearm. It also weighed a ton, but on the plus side it was insanely easy to disassemble. I probably should’ve gotten another one; my MSI is shit to open.

          • NotMyOldRedditName@lemmy.world · 5 hours ago

            Yeah, Macs are definitely more efficient with their RAM.

            I’ll have Android Studio open for my main work, IntelliJ IDEA for all the backend work, and Xcode when I need to tweak some iPhone things. (Edit: usually it’s just 2 of the 3, but sometimes it’s all 3.)

            I also mainly use real devices for testing, and opening emulators when all 3 are open can be a problem; it’s so annoying opening and closing things.

    • samus12345@sh.itjust.works · 8 hours ago

      Just about all electronics older than a year or so have gone up in price. Even a Switch, which came out 9 years ago, costs more to buy now than it did then!

    • Asmodeus_Krang@infosec.pub · 10 hours ago

      It’s truly mental. I don’t think I could afford to build my PC at the same spec today with RAM and SSD prices being what they are.

      • tempest@lemmy.ca · 8 hours ago

        I have 128GB of DDR5 memory in my machine. I paid 1400 for my 7900 XTX, which I thought was crazy, and now half my RAM is worth that.

        Never thought I would see the day when the graphics card was not the most expensive component.

          • tempest@lemmy.ca · 1 hour ago

            I should not have even gotten the 128.

            I can use it, but only at 4600 MT/s, because Ryzen chips can’t handle four 32GB DIMMs at full speed.

            I honestly didn’t even bother to check at the time of purchase, and it’s still a roll of the dice every time I restart.

  • Jhex@lemmy.world · 9 hours ago

    This article sucks… I think they felt the need to excuse AI lest they upset their corporate masters:

    While it’s easy to point the finger at AI’s unquenchable memory thirst for the current crisis, it’s not the only reason.

    Followed by:

    DRAM production hasn’t kept up with demand. Older memory types are being phased out, newer ones are steered toward higher margin customers, and consumer RAM is left exposed whenever supply tightens.

    Production has not kept up with demand… demand being supercharged by AI purchases.

    …newer ones are steered toward higher margin customers… again, AI.

    …consumer RAM is left exposed whenever supply tightens… because of AI.

    • AeonFelis@lemmy.world · 3 hours ago

      You see, it’s easy to blame AI data centers buying all the RAM - but that’s only half the story! The other half of the story is the manufacturers selling to these data centers.

  • blitzen@lemmy.ca · 11 hours ago

    Apple over here not raising their RAM prices because they’ve always been massively and unjustifiably inflated. Now, they’re no longer unjustifiably inflated.

    • mushroommunk@lemmy.today · 10 hours ago

      I dunno. “AI companies bought literally everything” seems like an unjustifiable reason still.

      • blitzen@lemmy.ca · 10 hours ago

        Perhaps. I guess my point is they’re no longer as out of line with the rest of the market. My comment was meant as a backhanded “compliment” toward Apple.

  • garretble@lemmy.world · 10 hours ago

    Me to my 10 year old gaming pc: “I guess it’ll be another couple of years, buddy.”