• TheGrandNagus@lemmy.world · +198/-3 · 3 months ago

    I am so tired of people, especially people who pretend to be computer experts online, completely failing to understand what Moore’s Law is.

    Moore’s Law != “Technology improves over time”

    It’s an observation that semiconductor transistor density roughly doubles every ~2 years. That’s it. It doesn’t apply to anything else.

    And also for the record, Moore’s Law has been dead for a long time now. Getting large transistor density improvements is hard.
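    The claim really is that narrow; a minimal sketch of the math (the starting density and year are made-up placeholder numbers, not real chip figures):

```python
# Moore's Law as stated: transistor density roughly doubles every ~2 years.
# The starting point below is a hypothetical illustration, not a real chip.
def projected_density(start_density: float, start_year: int, year: int,
                      doubling_period: float = 2.0) -> float:
    """Density after (year - start_year) years of doubling every doubling_period years."""
    return start_density * 2 ** ((year - start_year) / doubling_period)

# Over a decade that's 2**5 = 32x the density, and nothing more: the law says
# nothing about speed, storage, or "technology" in general.
print(projected_density(100, 2000, 2010))  # 3200.0
```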

    • WolfLink@sh.itjust.works · +4 · 3 months ago

      Also, the speed improvements that came along with Moore's law relied on Dennard scaling, the companion observation that as transistors shrink, power density stays roughly constant: smaller transistors could run faster without producing more heat. That scaling broke down in the mid-2000s.

      Heat dissipation has been the bottleneck for a long time now.
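      A back-of-the-envelope sketch of why Dennard scaling made shrinks "free" for so long (idealized unit values, not real chip parameters; dynamic power approximated as C·V²·f):

```python
# Dennard scaling sketch: dynamic power of a transistor is roughly P = C * V^2 * f.
# Under ideal scaling by factor k, capacitance and voltage shrink by k while
# frequency rises by k, so power *density* (power per unit area) stays constant.
def power_density(C, V, f, area):
    return (C * V**2 * f) / area

base = power_density(C=1.0, V=1.0, f=1.0, area=1.0)
k = 2.0  # one full scaling step
scaled = power_density(C=1.0 / k, V=1.0 / k, f=1.0 * k, area=1.0 / k**2)
print(base, scaled)  # 1.0 1.0 -- unchanged, so faster clocks came with no extra heat
```

      Once voltage stopped scaling (leakage current), the V² term no longer shrank with each node, and heat became the wall.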

    • webghost0101@sopuli.xyz · +11/-34 · 3 months ago

      Sure, but also no.

      Moore's law is, at its most fundamental level, an observation about the exponential curve of technological progress.

      It was originally about semiconductor transistors, and that is what Moore was specifically looking at, but the observed pattern does 100% apply to other things.

      In modern language, the way a term is used and perceived determines its meaning, not its origins.

      • TheGrandNagus@lemmy.world · +13/-1 · edited · 3 months ago

        Moore's law is, at its most fundamental level, an observation about the exponential curve of technological progress.

        No. Let me reiterate:

        Moore’s Law was an observation that semiconductor transistor density roughly doubles every ~2 years.

        It is not about technological progress in general. That's just how the term gets incorrectly applied by a small subset of people online who want to sound like they're being technical.

        Moore’s Law is what I described above. It is not “technology gets better”.

        • webghost0101@sopuli.xyz · +4/-5 · edited · 3 months ago

          I meant that sentence quite literally: semiconductors are technology. My perspective is that the original "Moore's law" is only a single example of what many people will understand when they hear the term in a modern context.

          At this point we're debating semantics, and those are subjective, local, and sometimes cultural. Preferably I avoid spending energy fighting about such things.

          Instead I'll provide my own line of thinking toward what is, for me, a valid use of the term outside semiconductors. I am open to suggestions if there is better language.

          From my own understanding, I observe a pattern where technology (mostly digital technology, but this could be exposure bias) improves at an increasingly fast rate. The mathematical term is exponential.

          To me, seeing such a pattern is vital to understanding what's going on. Humans are not designed to extrapolate exponential curves. A good example is AI, which largely still sucks today, but the historical numbers don't lie about the potential.

          I have a rather convoluted way of speaking; it's very impractical.

          Language, at best, should just get the message across in an effective manner.

          I invoke (reference) Moore's law to refer to the observation of exponential progress. Usually this gets my point across very effectively (not that such things come up often in my everyday life).

          To me, Moore's law in semiconductors is the first and original example of the pattern. The fact that this interpretation is subjective has never been relevant to getting my point across.

      • Lvxferre@mander.xyz · +8 · 3 months ago

        In modern language, the way a term is used and perceived determines its meaning, not its origins.

        This is technically correct but misleading in this context, given that it falsely implies that the original meaning (doubling transistor density every 2y) became obsolete. It did not. Please take context into account. Please.

        Furthermore, you're missing the point. The other comment is not just picking at words; it's highlighting that people invoke "it's Moore's Law" to babble inane predictions about the future. That's doubly true when people assume (i.e. make shit up) that "doubling every 2y" applies to other things, and/or that it's predictive in nature instead of just observational. Cue the OP.

        • udon@lemmy.world · +1/-3 · 3 months ago

          Please take context into account. Please.

          (this is a lil’ lemmy thread and I think everyone understands what OP had in mind)

      • CmdrShepard42@lemm.ee · +3/-1 · 3 months ago

        but the observed pattern does 100% apply to other things.

        Sure, if you retroactively go back and look for patterns where it matches something, but that isn't a very useful exercise.

  • lemmyng@lemmy.ca · +107 · 3 months ago

    Moore’s law is about circuit density, not about storage, so the premise is invalidated in the first place.

    There is research being done into 5D storage crystals, where a disc can theoretically hold up to 360TB of data, but don’t hold your breath about them being available soon.

    • sorghum@sh.itjust.works · +14 · 3 months ago

      I always thought the holographic 3D discs were going to be a really cool medium in the infancy days of Blu-ray and HD-DVD. I can't believe it's been over a decade since the company behind them went bankrupt.

    • JackGreenEarth@lemm.ee · +3 · 3 months ago

      Probably a stupid question, but how can the crystals be 5D if our universe is (at any meaningful scale) 4D?

      • lemmyng@lemmy.ca · +5 · 3 months ago

        Not a stupid question at all. Here’s the Wikipedia article for it. The significant part is this:

        The 5-dimensional discs [have] tiny patterns printed on 3 layers within the discs. Depending on the angle they are viewed from, these patterns can look completely different. This may sound like science fiction, but it’s basically a really fancy optical illusion. In this case, the 5 dimensions inside of the discs are the size and orientation in relation to the 3-dimensional position of the nanostructures. The concept of being 5-dimensional means that one disc has several different images depending on the angle that one views it from, and the magnification of the microscope used to view it. Basically, each disc has multiple layers of micro and macro level images.

      • catloaf@lemm.ee · +2 · 3 months ago

        Wavelength could add a dimension. For example, if you have an optical disc (2D) that can be read and written separately by red and blue lasers, that makes it 3D.

    • HamsterRage@lemmy.ca · +5/-4 · 3 months ago

      This is true, but…

      Moore’s Law can be thought of as an observation about the exponential growth of technology power per $ over time. So yeah, not Moore’s Law, but something like it that ordinary people can see evolving right in front of their eyes.

      So a $40 Raspberry Pi today runs benchmarks 4.76 times faster than a multimillion-dollar Cray supercomputer from 1978. Is that Moore's Law? No, but the bang/$ curve probably looks similar over those decades.
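      For what it's worth, that comparison pins down a doubling time. A rough sketch, assuming a ~$8M price for the Cray and a 1978-2024 span (both are my assumptions for illustration; only the 4.76x speed ratio comes from the comparison above):

```python
import math

# Hypothetical back-of-envelope: how fast must price-performance double to
# turn a ~$8M 1978 Cray into a $40 Pi that benchmarks ~4.76x faster?
cray_price, pi_price = 8_000_000, 40   # assumed prices in dollars
speed_ratio = 4.76                     # Pi vs. Cray on the cited benchmarks
years = 2024 - 1978                    # assumed span

perf_per_dollar_gain = speed_ratio * cray_price / pi_price  # ~952,000x
doubling_time = years / math.log2(perf_per_dollar_gain)
print(round(doubling_time, 1))  # ~2.3 years: Moore-ish pace, but not Moore's Law
```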

      You can see a similar curve when you look at data transmission speed and volume per $ over the same time span.

      And then there's storage. Going from 5 1/4" floppy disks, or effing cassette drives, on the earliest home computers. Or the round tapes we used to cart around when I started working in the 80s, which had a capacity of around 64KB. To microSD cards with multi-terabyte capacity today.

      Same curve.

      Does anybody care whether the storage is a tape, or a platter, or 8 platters, or circuitry? Not for this purpose.

      The implication of "that's not Moore's Law" is that the observation isn't valid. Which is BS. Everyone understands that the true wonderment is how your bang/$ goes up exponentially over time.

      Even if you're technical, you have to understand that this factor drives the applications.

      Why aren’t we all still walking around with Sony Walkmans? Because small, cheap hard drives enabled the iPod. Why aren’t we all still walking around with iPods? Because cheap data volume and speed enabled streaming services.

      While none of this involves counting transistors per inch on a chip, it's actually more important/interesting than Moore's Law, because it speaks to how the power of the technology available for everyday uses is exploding over time.

      • ch00f@lemmy.world · +6 · 3 months ago

        Moore’s law factored in cost, not just what was physically possible.

        The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.

      • Random_Character_A@lemmy.world · +4 · 3 months ago

        About 5 years ago I pirated all the games ever officially published for my childhood gaming system, and for my friend's different gaming system.

        If I went back in time and told my younger self that, and that it all fits on a medium the size of a pinky fingernail, he wouldn't have believed me. It's just so far out there.

      • rebelsimile@sh.itjust.works · +1 · 3 months ago

        Yeah, taken as a guideline and an observation that computer speeds/storage/etc. continue to improve, I think it's fair. It may not always be a doubling, but it is still significantly different from other physical processes, which have "stagnated" by a similar metric (like top speed on an average vehicle, or miles per gallon).

  • Lvxferre@mander.xyz · +18/-1 · 3 months ago

    I gave the subject a check. From Tom’s Hardware, industry predictions are like:

    Year    Capacity (TB)
    2022     1~22
    2025     2~40
    2028     6~60
    2031     7~75
    2034     8~90
    2037    10~100

    That's doubling roughly every 4 years. Based on that, state-of-the-art disks would reach 500TB somewhere around 2040. Make it ~2050 for affordable external storage.
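    Written out, taking the table's 2022 high end (22TB) and the 4-year doubling as givens:

```python
# Extrapolate the "doubles every ~4 years" trend from the table above.
def projected_tb(year: int, base_tb: float = 22, base_year: int = 2022,
                 doubling_years: float = 4) -> float:
    return base_tb * 2 ** ((year - base_year) / doubling_years)

print(round(projected_tb(2040)))  # ~498, i.e. the ~500TB-by-2040 guess
```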

    However, note that this is extrapolation on top of an estimation, and the estimation itself is an extrapolation of past trends. I might as well guess what I'm going to have for lunch exactly one year from now; it would be about as accurate.

    To complicate things further, there's currently competition between two main techs, spinning disks vs. solid state. SSDs might be evolving at a different pace, and since your typical SSD has less capacity, they might even push the average for consumers back a bit (as people swap HDDs for SSDs with slightly lower capacity).

      • adavis@lemmy.world · +13 · edited · 3 months ago

        While not hard drives: at $dayjob we bought a new server with 16 x 64TB NVMe drives. We don't even need the speed of NVMe for this machine's role. It was the density that was most appealing.

        It feels crazy having a petabyte of storage (albeit with some lost to RAID redundancy). Is this what it was like working in tech up until the mid-00s, with significant jumps just turning up?

        • InverseParallax@lemmy.world · +5 · 3 months ago

          This is exactly what it was like, except you didn’t need it as much.

          Storage used to cover how much a person needed, plus maybe 2-8x more. Then datasets shot upwards with audio/MP3, then video, then again with AI.

        • toddestan@lemmy.world · +1 · edited · 3 months ago

          The size increase in hard drives around that time was insane. Compared to the mid-90s, just a decade earlier, hard drive capacities had increased around 100 times. On average, drive capacities were doubling every year.

          Then things slowed down. In the past 20 years, we've maybe increased hard drive capacities 30-40 times.

          Flash memory, on the other hand, is a different story. Sometime around 2002-3 I paid something like $45 for my first USB flash drive: a whole 128MB of storage. Today I can buy one that's literally 1000 times larger for around a third of that price. (I still have that drive, and it still works, too!)
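          That 1000x jump pins down a growth rate; a quick sketch, assuming a ~20-year span (my approximation):

```python
import math

# 128MB (~2003) to 128GB (today) is a 1000x capacity increase.
# Assuming a ~20-year span, the implied doubling time is:
years, growth = 20, 1000
doubling_time = years / math.log2(growth)
print(round(doubling_time, 1))  # ~2.0 years: flash kept a Moore-like pace
```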

      • 9point6@lemmy.world · +7 · 3 months ago

        I guess you’re expected to set those up in a RAID 5 or 6 (or similar) setup to have redundancy in case of failure.

        Rebuilding after a failure would be a few days of squeaky bum time though.
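        For a box like the one described upthread (16 x 64TB, numbers taken from that comment), usable space under simple parity RAID works out roughly as:

```python
# Usable capacity for simple parity RAID: (drives - parity_drives) * drive_size.
# Ignores filesystem overhead and hot spares.
def usable_tb(drives: int, drive_tb: float, parity: int) -> float:
    return (drives - parity) * drive_tb

print(usable_tb(16, 64, parity=1))  # RAID 5: 960 TB, survives one drive failure
print(usable_tb(16, 64, parity=2))  # RAID 6: 896 TB, survives two failures
```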

        • InverseParallax@lemmy.world · +2 · 3 months ago

          With RAID 6, rebuilds are 4.2 roentgens: not great, not terrible. I keep old backups, but the data isn't irreplaceable.

          Raid5 is suicide if you care about your data.

    • Buffalox@lemmy.world · +8/-1 · 3 months ago

      I'm more shocked by how little extra space I need!
      I’m rocking an ancient 1TB for backups. And my main is a measly 512GB SSD.
      But I don’t store movies anymore, because we always find what we want to see online, and I don’t store games I don’t actively use, because they are in my GOG or Steam libraries.
      With 1 gigabit per second internet, it only takes a few minutes to download anyways.
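      "A few minutes" checks out; a quick sketch, assuming a hypothetical 50GB game downloaded at a full, sustained 1 Gbps:

```python
# Time to download at line rate: size_in_bits / bits_per_second.
size_gb = 50                   # hypothetical game size, decimal gigabytes
bits = size_gb * 8 * 10**9     # gigabytes to bits
seconds = bits / 1e9           # at 1 gigabit per second
print(seconds / 60)            # ~6.7 minutes, ignoring protocol overhead
```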

      Come to think of it, my phone has almost as much space for use, with the 512GB internal storage. 😋
      Maybe I'm a fringe case, IDK. But it's been a long time since storage ceased to be a problem.

        • Buffalox@lemmy.world · +2 · 3 months ago

          I can understand that having your own copy is nice, especially if the service is closed for some reason.
          I just don’t bother doing that anymore, I prefer browsing my library on GOG instead of a file-manager.

  • udon@lemmy.world · +8/-2 · edited · 3 months ago

    We can argue as much as we want about whether Moore's law covers technological development in general, or be pedantic like good old fundamentalist Christians and only read what the words literally say.

    The bigger problem is that we have reached the era of what we could tentatively call "wal s'eroom". Thanks to enshittification (another one of those slippery words!), I predict that technological progress now reverses by 50% every 2 years.