• ☂️-@lemmy.ml · 17 hours ago

    let’s focus on letting people have what we can already make.

    what’s the point of 0.0000001nm superchips if only ai techbros have them?

  • Buffalox@lemmy.world · 1 day ago

    AFAIK the smallest usable atom is carbon at about 150 picometers across, and the smallest number of atoms theoretically needed to make a transistor is 3, so there is (probably) no way to go below 450 picometers, which is the same as 0.45 nanometers. There is probably also no way to actually achieve 450 picometers in practice.
    So the idea that they are currently going below 2nm is of course untrue, but IDK what the real measure is.

    What they are doing at the leading chip manufacturing factories is amazing, so amazing it’s kind of insane. But it’s not actually 2nm.

    Just for info, one silicon/silicium atom is 0.2 nm.
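The atomic-floor arithmetic in the comment above can be written out explicitly. A minimal sketch; the ~150 pm carbon diameter and the 3-atom minimum are the commenter's assumptions, not established figures:

```python
# Back-of-envelope floor on transistor size, per the comment above.
# Assumptions (from the comment, not authoritative): a carbon atom is
# ~150 pm across, and a transistor needs at least 3 atoms in a row.
CARBON_DIAMETER_PM = 150   # picometers
MIN_ATOMS = 3

floor_pm = CARBON_DIAMETER_PM * MIN_ATOMS   # 450 pm
floor_nm = floor_pm / 1000                  # 0.45 nm

print(f"Hard floor: {floor_pm} pm = {floor_nm} nm")
# Even a literal "2 nm" feature would still sit ~4.4x above this floor:
print(f"2 nm / floor = {2 / floor_nm:.1f}x")
```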

    • chonomaiwokurae@sopuli.xyz · 1 day ago

      The whole idea of somehow representing different nodes and their development with one number is a bit silly. That being said, it looks like future channel materials could be 0.7 nm in thickness (monolayer WX2).

    • jj4211@lemmy.world · 1 day ago

      For a while now the “nm” has been a bit of a marketing label, aiming for what the size would be if you extrapolated the old scaling rules to today. The industry spent so long measuring that when the measurement broke down, they just kind of had to fudge it to keep the basis of comparison going, for lack of a better idea. If we had some fully volumetric approach, building these things equally up in three dimensions, we’d probably have a sub-“100 pm” process easily, despite it being absurd.

        • jj4211@lemmy.world · 1 day ago

          As I said, it’s an extrapolation of the rules from once upon a time to a totally different approach. It’s marketing and increasingly subjective. Any number can “make sense” in that context. The number hasn’t been based on anything you could actually measure for a long time now; it’s already a fiction, so it can go wherever.

        • AA5B@lemmy.world · 1 day ago

          People have accepted heat pumps as 400% efficient. This is the same.

          And realistically, how do you describe in an approachable way “you experience what would look like an impossible number if we had continued as before”, where the “if” is key, as is “you experience”?

          • Bradley Nelson@lemmy.world · 1 day ago

            For what it’s worth, I think the heat pump measurement makes way more sense. What I want is to heat my house. I give you one watt hour and you give me 4 watt hours of heat. Sounds like 400% to me.

            The real issue here is that, for the most part, the measurements never meant anything for silicon chips. At least not to end users.
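The heat-pump comparison above is just a coefficient-of-performance (COP) calculation; a minimal sketch of the arithmetic, using the commenter's example figures of 1 Wh in and 4 Wh of heat out:

```python
def cop(heat_out_wh: float, electricity_in_wh: float) -> float:
    """Coefficient of performance: useful heat delivered per unit of
    electricity consumed. A COP of 4 is what gets quoted as '400% efficient'."""
    return heat_out_wh / electricity_in_wh

# 1 Wh of electricity in, 4 Wh of heat out (the commenter's example):
print(f"{cop(4.0, 1.0):.0%}")  # → 400%
```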

  • urshilikai@lemmy.world · 2 days ago

    can we please socially murder the sales/marketing team that rebranded the unit in nodes from something physically meaningful to a random countdown detached from reality? (1nm node does not have any bearing on critical dimension or size of the circuits)

    • jj4211@lemmy.world · 1 day ago

      To be fair, the industry spent decades measuring a distance, so when they started doing features that had equivalent effects, the easiest way for people to understand was to say something akin to equivalent size.

      Of course, then we have things like Intel releasing their “10 nm” process, then after TSMC’s 7nm process was doing well and Intel’s fab hit some bumps, they declared their 10 to be more like a 7 after all… it’s firmly all marketing numbers…

      Problem being no one is suggesting a more objective measure.

      • JohnEdwa@sopuli.xyz · 2 days ago

        Open any wikipedia article about “x nm process” and one of the first paragraphs will be something like this:

        The term “2 nanometer”, or alternatively “20 angstrom” (a term used by Intel), has no relation to any actual physical feature (such as gate length, metal pitch or gate pitch) of the transistors. According to the projections contained in the 2021 update of the International Roadmap for Devices and Systems published by the Institute of Electrical and Electronics Engineers (IEEE), a “2.1 nm node range label” is expected to have a contacted gate pitch of 45 nanometers and a tightest metal pitch of 20 nanometers.[1]

        It used to be that the “60nm process” was called that because the transistor gate was 60nm.
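The quoted IRDS figures make the drift easy to quantify. A small sketch, using only the numbers from the Wikipedia excerpt above (the IRDS 2021 projections for the "2.1 nm node range label"):

```python
# How far the node label has drifted from measurable features, using the
# IRDS 2021 projections quoted above for the "2.1 nm node range label".
label_nm = 2.1
contacted_gate_pitch_nm = 45.0
tightest_metal_pitch_nm = 20.0

# If the label still tracked a physical feature (as with the old "60nm
# process", named for its 60nm gate), you'd expect features near 2 nm.
# Instead even the tightest pitch is roughly 10x the label:
print(f"gate pitch / label  = {contacted_gate_pitch_nm / label_nm:.1f}x")  # ~21.4x
print(f"metal pitch / label = {tightest_metal_pitch_nm / label_nm:.1f}x")  # ~9.5x
```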