Panther Lake and Nova Lake laptops will return to traditional RAM sticks

  • Buffalox@lemmy.world · 2 months ago

    I’ve commented many times that Arc isn’t competitive, at least not yet.
    Although they were decent performers, they used twice the die size for similar performance compared to Nvidia and AMD, so Intel has probably sold them at very little profit.
    Still, I expected them to try harder this time, because the technologies needed to develop a good GPU are strategically important in other areas too.
    But maybe that’s the reason Intel recently admitted they couldn’t compete with Nvidia on high-end AI?

    • InverseParallax@lemmy.world · 2 months ago

      Arcs are OK, and the competition is good. Their video encode performance is absolutely otherworldly though, just incredible.

      Mostly, they help bring the iGPU graphics stack and performance up to par, and keep games targeting them well. They’re needed for that alone if nothing else.

        • InverseParallax@lemmy.world · 2 months ago

          I mean, fine, but it’s a first gen; they can fix the features and yields over time.

          First gen chips are rarely blockbusters; my first gen chips were happy just to make it through bringup and customer eval.

          What’s more, because software is so much of their stack, they had huge headroom to grow.

          • Buffalox@lemmy.world · 2 months ago

            First gen chips are rarely blockbusters

            True, yet Nvidia was a nobody that arrived out of nowhere with the Riva graphics cards and beat everybody else thoroughly: ATi, S3, 3dfx, Matrox, etc.

            But you are right, these things usually take time; for instance, Microsoft was prepared to go 10 years without making money on Xbox, because they saw it had potential in the long run.

            I’m surprised Intel considers itself so hard pressed that they are already thinking of giving up.

            • InverseParallax@lemmy.world · 2 months ago

              True, yet Nvidia was a nobody that arrived out of nowhere with the Riva graphics cards, and beat everybody else thoroughly. ATi, S3, 3Dfx, Matrox etc.

              Actually, they didn’t.

              This was their first: https://en.wikipedia.org/wiki/NV1

              A complete failure: overpriced, under-capable, one of the worst cards on the market at the time, and it used quadratics instead of triangles.

              NV2 was supposed to power the Dreamcast, and kept the quads, but was cancelled.

              But the third one stayed up! https://youtu.be/w82CqjaDKmA?t=23

              • Buffalox@lemmy.world · 2 months ago

                You are right.

                and used quadratics instead of triangles.

                Now that you mention it, I remember reading about that, but completely forgot.
                I remembered it as the Riva coming out of nowhere. As the saying goes, first impressions last. And I only learned about NV1 much later.

                But the third one stayed up!

                👍 😋

                But Intel also made the i815 GPU, so Arc isn’t really the first.

                • InverseParallax@lemmy.world · 2 months ago

                  Oof, yeah, they actually had another one they didn’t release, based on Pentium cores with AVX-512: basically Knights Landing with software support for graphics.

                  They were canceling projects like it was going out of style, which is sad; that one would have been amazing for AI.

                  • Buffalox@lemmy.world · 2 months ago

                    Yes, there was the Xeon Phi, Knights Landing, with up to 72 cores and 4 threads per core!
                    Knights Landing did go into production, but it was more a compute unit than a GPU.

                    I’m not aware they tried to sell it as a GPU too? Although if I recall correctly, they made some real-time ray tracing demos.

    • hamsterkill@lemmy.sdf.org · 2 months ago

      Still I expected them to try harder this time, because the technologies to develop a good GPU, are strategically important in other areas too

      I think I read somewhere that they’re having problems getting AIB partners for Battlemage. That would be a significant impediment for continuing in the consumer desktop market unless Battlemage can perform better (business-wise) than Alchemist.

      They probably will continue investing in GPUs even if they give up on Arc; it might just be for the specialized stuff.

    • 1rre@discuss.tchncs.de · 2 months ago

      Yeah, true. Plus I bought my A770 at pretty much half price during the whole driver-issues period, so I eventually got a 3070-performing card for like $250, which is an insane deal for me, but there’s no way Intel made anything on it after all the R&D and production costs.

      The main reason Intel can’t compete is the fact CUDA is both proprietary and the industry standard. If you want to use a CUDA library, you have to translate it yourself, which is kind of inconvenient, and no datacentre is going to go for that.

      • Buffalox@lemmy.world · 2 months ago

        The main reason Intel can’t compete is the fact CUDA is both proprietary and the industry standard

        AFAIK the AMD stack is open source; I’d hoped they’d collaborate on that.

        • 1rre@discuss.tchncs.de · 2 months ago

          I think Intel supports it (or at least a translation layer), but there’s no motivation for Nvidia to standardise on something open source, as the status quo works pretty well for them.

      • Trainguyrom@reddthat.com · 2 months ago

        The main reason Intel can’t compete is the fact CUDA is both proprietary and the industry standard

        Funnily enough, this is actually changing because of the AI boom. Would-be buyers can’t get Nvidia AI cards, so they’re buying AMD and Intel and reworking their stacks as needed. It helps that there are also translation layers available now which translate CUDA and other otherwise vendor-specific stuff to the open protocols supported by Intel and AMD.
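
        The “translation” being discussed is often surprisingly mechanical: AMD’s hipify tools, for example, largely rewrite CUDA runtime calls into the near-identical HIP API by renaming symbols. A toy Python sketch of that idea (the real tools cover thousands of API symbols plus kernel-launch syntax, so this is purely illustrative):

```python
import re

# A handful of real CUDA-to-HIP renames of the kind AMD's hipify-perl applies;
# the actual tool covers thousands of API symbols and kernel-launch syntax.
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def hipify(source: str) -> str:
    """Textually rewrite CUDA runtime calls to their HIP equivalents."""
    pattern = re.compile(r"\b(" + "|".join(CUDA_TO_HIP) + r")\b")
    return pattern.sub(lambda m: CUDA_TO_HIP[m.group(1)], source)

cuda_line = "cudaMalloc(&d_a, n); cudaMemcpy(d_a, a, n, cudaMemcpyHostToDevice);"
print(hipify(cuda_line))
# hipMalloc(&d_a, n); hipMemcpy(d_a, a, n, hipMemcpyHostToDevice);
```

        Fuller translation layers go further (ZLUDA, for instance, aims to run unmodified CUDA binaries on non-Nvidia GPUs), which supports the point above: the moat is ecosystem inertia more than technical difficulty.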