• RedWeasel@lemmy.world · 1 day ago (edited)

    So, around 1947. It took about 14 years to get from the transistor to putting it into chips. So another decade and a half?

    Edit: and another 15 to 25 years after that for it to be in consumer households?

    • kutt@lemmy.world · 5 hours ago

      I don’t think it will ever reach consumer households, since it requires extremely complex and expensive materials, tools, and physical conditions. A major breakthrough could change that, but it’s highly unlikely.

      Also, we don’t really have a use for them, at least not regular users. They won’t replace classical computers.

      But you can already access some QCs online. IBM has a paid remote API for instance.
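To make the remote-access point concrete, here's a rough sketch of the kind of circuit you'd submit to a remote backend like IBM's paid service — simulated locally here with a toy statevector simulator in plain Python. The simulator and all function names below are illustrative only, not IBM's actual API.

```python
# Toy sketch: a Bell-state circuit of the sort you'd submit to a remote
# quantum backend, simulated locally.  Qubit 0 is the least significant bit.
import math

def apply_hadamard(state, qubit):
    """Apply a Hadamard gate to `qubit` of a statevector."""
    h = 1 / math.sqrt(2)
    out = state[:]
    for i in range(len(state)):
        if not (i >> qubit) & 1:           # visit each |...0...>/|...1...> pair once
            j = i | (1 << qubit)
            out[i] = h * (state[i] + state[j])
            out[j] = h * (state[i] - state[j])
    return out

def apply_cnot(state, control, target):
    """Flip `target` wherever `control` is 1 (swaps amplitude pairs)."""
    out = state[:]
    for i in range(len(state)):
        if (i >> control) & 1:
            out[i] = state[i ^ (1 << target)]
    return out

# Bell state on 2 qubits: H on qubit 0, then CNOT 0 -> 1.
n = 2
state = [0j] * (2 ** n)
state[0] = 1 + 0j                          # start in |00>
state = apply_hadamard(state, 0)
state = apply_cnot(state, 0, 1)

probs = [abs(a) ** 2 for a in state]       # measurement probabilities
for i, p in enumerate(probs):
    if p > 1e-9:
        print(f"|{i:0{n}b}>: {p:.2f}")     # |00>: 0.50 and |11>: 0.50
```

A real backend runs a circuit like this thousands of times and returns measurement counts (roughly 50/50 between 00 and 11 here) rather than exact probabilities.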

      • RedWeasel@lemmy.world · 4 hours ago

        Right now I can only see them used as accelerators of some kind. They could potentially be used for GPUs, but I suspect some form of general compute will come first. GenAI anyone? SkyNET? But that’s only if they can be made portable enough for laptops or phones, which is still a major unsolved problem.

        I don’t expect them to replace traditional chips in my lifetime if ever.

        • a_non_monotonic_function@lemmy.world · 2 hours ago

          Could see them used potentially for GPU

          Used as GPUs, or like GPUs? The latter, certainly; the former, not so much. They aren’t a replacement for current tech; they accelerate completely different things (and currently they do nothing your average consumer would be interested in anyway).

        • kutt@lemmy.world · 3 hours ago

          Yes, they will probably never replace them, because they’re actually slower than classical computers at simple calculations.

          Quantum ML is actively being researched, though I’m not informed at all about the advances in that field specifically.

          But the good news is that it doesn’t need to be portable: we can use them through remote access, just as we do right now!

      • baggachipz@sh.itjust.works · 5 hours ago

        requires extremely complex and expensive materials, tools and physical conditions.

        Counterpoint: they said the same thing when a computer was made of vacuum tubes and took up an entire room to add two digits.

        • kutt@lemmy.world · 5 hours ago

          Yeah, but you have to consider one other thing: before building classical computers, we had already theorized them; we had algorithms, etc. We knew why we were creating them.

          For QC, the pace of hardware development is outstripping our ability to create algorithms. It’s very similar to what’s happening with the current AI bubble: we’re investing heavily in a new technology because it looks cool to investors, but we don’t even have enough algorithms to run on it. It’s just a ton of marketing…

          • baggachipz@sh.itjust.works · 5 hours ago

            Yeah, understood. I was just saying that because it doesn’t seem technically possible now, we shouldn’t discount that it could be in the future. Whether it would be useful is another debate; I have a hard time believing it has practical uses. But if it does, the innovation will be as rapid as the shift to silicon transistors (assuming it’s even possible).

            • kutt@lemmy.world · 4 hours ago

              Oh, I’m not saying it’s technically impossible; quite the opposite, it’s developing extremely fast. And usefulness and having QCs in our homes aren’t that far apart, to be honest. Why would John Doe have a QC at home if he isn’t trying to create a new medication or simulate a molecule? Probably for the same reason he doesn’t have an MRI machine in his living room :)

    • photonic_sorcerer@lemmy.dbzer0.com · 1 day ago

      From the byline:

      Quantum tech is at its transistor moment—promising, real, and powerful, but still years of hard work away from changing the world

      So pretty much, yeah.

    • funkajunk 🇨🇦@lemmy.world · 1 day ago

      Seeing as we now have a multitude of tools available to us that we didn’t have in 1947, I imagine it would be faster.

      • Gsus4@mander.xyz (OP) · 1 day ago (edited)

        And an already-existing consumer base with expectations that previously only hobbyists had… Maybe that’s a bad thing, because it will constrain QC to fit modern use cases (or worse: MBA-driven hype) rather than evolve in directions that would be better to explore.