Intel doesn’t think that Arm CPUs will make a dent in the laptop market: “They’ve been relegated to pretty insignificant roles in the PC business.”

  • GenderNeutralBro@lemmy.sdf.org · 11 months ago

    Nobody tell Intel about Apple Silicon! Or that Apple’s sales are increasing while the rest of the industry is in a slump.

      • BB69@lemmy.world · 11 months ago

        They swapped to M series chips, what, two years ago? This says sales this year are down due to no new Macs.

        • just_another_person@lemmy.world · 11 months ago

          Not everything runs on macOS with ARM. Some people may not upgrade to M*-class chips, and others who may have switched don’t want the hassle. I know plenty of developers who went to ThinkPads on Linux instead of upgrading to the M* architecture and dealing with build issues.

          • mr_tyler_durden@lemmy.world · 11 months ago

            Plenty of developers? Ok, sure. It was rocky for less than a year after they released the M1s. I barely had any issues on my M1 Max that I got at release, and I was just thinking the other day about how I haven’t thought about “will this run” or “oh, there’s that thing that doesn’t run” in forever.

            • bamboo@lemm.ee · 11 months ago

              Yeah, it really hasn’t been a hassle. At my workplace (software research, lots of which is actually x86-specific) many people have switched to Apple silicon Macs and nobody is looking back. The only issue I’ve noticed that is disruptive in any way is that Apple isn’t really supporting TAP-based network adapters, which causes trouble once in a while, mostly with certain VPN configurations. Standard development tools like IDEs, compilers, etc. have worked since nearly day 1. Basically the only common target I wouldn’t develop for is Windows, but even then you can do it in a VM, and that’s still the fastest way to run Windows on ARM.

          • mesamune@lemmy.world · 11 months ago

            There are a lot of brew packages that broke when the chips came out. It’s still a bit of an issue two years later.

      • bamboo@lemm.ee · 11 months ago

        The M1 series was super good, and Apple just hasn’t released anything since then worth upgrading to if you have an M1. They’re gaining market share, though slowly, which indicates that their sales slump is smaller than the market average.

        • just_another_person@lemmy.world · 11 months ago

          Their selling points for the previous generation are now moot though, so that’s why people aren’t buying or re-upping generations:

          • Gaming is null
          • Lots of issues for creators, including codec and transcoding issues (slightly fixed if run through Rosetta)
          • Anything GPU-related is under lock and key by Apple drivers; no local AI or inference development.
          • Tons of FOSS projects just won’t build and work
          • All of the M* media extensions are only available through Xcode

          There’s just tons of stuff that makes it unattractive to developers. My particular job requires building multi-arch containers and binaries, and it’s just a nightmare to dev and test locally. The argument to this point might be “just use cloud” or “use a remote CI build system”, but the point is you shouldn’t have to. I can have a machine that does everything I need it to do from another vendor, with way less hassle, and for way cheaper.

          • bamboo@lemm.ee · 11 months ago

            • Gaming is indeed a big weak point for M series chips. Some games run, but it’s a pretty bad experience overall. Personally I’m insulated from this because I have a gaming PC I can use.
            • I’ve not heard this at all recently. This sounds like it was probably true in late 2020, but afaik creative workflows are quite fleshed out at this point.
            • No idea what you’re talking about here. Tools like PyTorch fully support Metal for training and inference. llama.cpp fully supports Apple silicon. Apple’s shared memory model gives their GPUs access to huge pools of memory compared to even high-end discrete GPUs, and this allows working with models otherwise not possible in a laptop form factor. No other laptop GPU is getting shipped with up to 96GB of memory.
            • Really? This seems like a generally easy thing to fix, what projects (that ran on x86 macOS) are known to be bothersome?
            • This is a barrier to the Asahi Linux folks, and I hope someday the situation improves for them. Otherwise, it’s irrelevant since nearly everything for macOS is gonna be built through Xcode. It’s the system toolchain.

            I don’t know exactly what software you use for work, but for simple cases Docker Desktop uses binfmt_misc to enable Rosetta and qemu-user for containers. This actually makes it really easy to build and test for a bunch of different architectures: x86, but also ppc64le, mips, etc. With x86-64 specifically you get Rosetta for very high performance. I know tools like gdb don’t work right in this environment, but that’s not usually part of a typical CI/CD system anyway.
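            The multi-arch workflow described here can be sketched with Docker Buildx (a rough sketch, not the commenter’s exact setup; the image tag and registry are illustrative, and it assumes Docker Desktop has binfmt_misc/Rosetta emulation registered as described above):

            ```shell
            # Create a builder capable of emitting images for several architectures.
            docker buildx create --name multiarch --use

            # Build one image manifest covering x86-64 and 64-bit ARM, pushing to a registry.
            docker buildx build \
              --platform linux/amd64,linux/arm64 \
              -t registry.example.com/myapp:latest \
              --push .

            # Run the x86-64 variant on an Apple silicon host; Docker transparently
            # falls back to Rosetta/QEMU emulation for the foreign architecture.
            docker run --rm --platform linux/amd64 registry.example.com/myapp:latest uname -m
            ```

            The same `--platform` flag works for ppc64le, mips, and the other QEMU-supported targets mentioned above.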

            • just_another_person@lemmy.world · 11 months ago

              Quick responses, sorry.

              • Gaming is a problem for people who DO want this type of machine, and not a second one or a console. It’s only become an issue with M*
              • Apple has literally thrown an entire engineering group behind their ability to import, export, and transcode media. So much so that they even have their own fork of ffmpeg. It’s atrocious.
              • PyTorch is CPU with GPU acceleration where applicable, and most devs want direct GPU access, which is not possible on Apple hardware. LLMs are kinda dumb, and most people work on imaging inference. Direct access to the hardware for local development is a must.
              • Not sure what you mean, but Rosetta is essentially QEMU-style emulation, which is insanely slow. I can run a 2-minute build for something on a native x64 machine, or a 1-hour build through Rosetta. No thanks.
              • This is a power play by Apple thinking people will still buy their stuff and just deal with the inconveniences, but it turns out the sales numbers show that is not the case.

              In general, “use containers for everything” is not a good workflow. It’s also very subject to the performance of the platform you run it on. Containers all the time is exhausting and problematic for a number of different reasons.

          • ASeriesOfPoorChoices@lemmy.world · 11 months ago

            Lol! No.

            Gaming: has just jumped up in a huge way in the last couple of months. The software that’s come out recently is amazing. Just run full Windows games on your MacBook Pro? Sure.

            Creators: huh? Never heard of this before. Everything seems to run amazingly well.

            GPU: except for the local AI stuff that is being done all the time? What? Diffusion Bee doesn’t exist now?

            Cheaper: well, yeah. Heh. You get that one, for sure. But if money is no object, then… 🤷

            • just_another_person@lemmy.world · 11 months ago

              Bruh… there is NOTHING about macOS which invites the ability to run “full Windows on your MacBook”. I’m not sure what form of bath salts you’re smoking, but you should share it with the rest of the fanboys so they can experience the same delusions as you. Also “gaming has jumped up… in the last couple of months”… jFC, those drugs you’re on are amazing.

              All your other points make zero sense, and it’s obvious you are not experienced in software development, so you can go away from me now.

              • ASeriesOfPoorChoices@lemmy.world · 11 months ago

                I said “full Windows games”, but yeah, you can also run Windows on it; that’s old news, though.

                CrossOver and GPTK (Apple’s Game Porting Toolkit). Look them up. They didn’t exist like they do now at the start of the year.

      • kalleboo@lemmy.world · 11 months ago

        Be careful in trying to interpret year-over-year statistics. Last year was huge for Apple: in Q3 2022, Apple increased sales 10% while the rest of the PC market dropped a massive 18%.

        You’re saying “since switching from x86 to ARM, Apple’s sales are down! See, it was a bad idea!” but actually they have been way, way up and are just now falling in line with the sales decline the rest of the PC industry has had since the COVID work-from-home rush ended.

    • Trippin@feddit.nl · 11 months ago

      Do you have numbers? Cause I’m thinking at 8.6% worldwide, it’s not really a big chunk of the pie. Especially as, the article states, it’s declining compared to the year before.

      • GenderNeutralBro@lemmy.sdf.org · 11 months ago

        The article you linked pretty much sums it up.

        Apple’s Mac market share increased to 8.6%, reporting year-over-year shipment growth of 10.3%, the only major manufacturer to do so.

        The year-over-year Mac shipment growth comes even as the broader market and competitors notch sharp declines in shipments, and as the Intel transition wraps up.

        Lenovo, HP, Dell, and Acer all had year-over-year drops in shipments, according to IDC data.

    • long_chicken_boat@sh.itjust.works · 11 months ago

      you clearly don’t know what you’re talking about. Apple’s laptop sales are decreasing. And most Mac users can’t tell the difference between Intel, the M chips, AMD, or whatever. They just know that there’s a pretty apple on the back of their laptop, and that’s why they buy it.

      • AbidanYre@lemmy.world · 11 months ago

        Some of those people know that Docker performance is hot garbage on Macs.

        • jmanes@lemmy.world · 11 months ago

          Only if you’re not using AArch64-based containers. At my job, we leverage the appropriate containers and performance is insanely good.
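          For instance, explicitly requesting the native arm64 variant of an image avoids emulation entirely (a sketch; the image name is illustrative, and it assumes the image publishes an arm64 build, as most official images do):

          ```shell
          # On an Apple silicon Mac, pull and run the native arm64 variant.
          docker pull --platform linux/arm64 python:3.12-slim

          # uname -m inside the container should report aarch64: the container
          # runs natively on the M-series CPU, with no QEMU/Rosetta overhead.
          docker run --rm --platform linux/arm64 python:3.12-slim uname -m
          ```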

  • simple@lemm.ee · 11 months ago

    Of course Intel would be the last company to admit x86 is dying. It just doesn’t make sense to keep doubling down on it anymore; Apple has proven ARM is more power-efficient and in many cases more powerful than x86. I wanted to buy a new laptop this year, but it makes no sense to do so considering Windows ARM machines are right around the corner and will triple battery life and increase performance.

    • MrSpArkle@lemmy.ca · 11 months ago

      Intel is a licensed ARM manufacturer. They’re just doing PR but are capable of playing both sides.

    • morrowind@lemmy.ml · 11 months ago

      This seems to be a doggedly persistent rumor. Apple’s M chips are better due to better engineering and vertical integration.

      There is no inherent benefit to the underlying ISA.

      • simple@lemm.ee · 11 months ago

        ARM has a more efficient instruction set, uses less power, and generates less heat while matching performance. Not really a rumor.

        • morrowind@lemmy.ml · 11 months ago

          Source?

          Here’s mine

          It’s down to the engineering. Saying ARM has a more efficient instruction set is like saying C has more efficient syntax than Python. Especially these days with pipelining 'n stuff, it all becomes very similar under the hood.

          • Cocodapuf@lemmy.world · 11 months ago

            Source?

            Here’s mine

            That article may be out of date though. From the article:

            What limits computer performance today is predictability, and the two big ones are instruction/branch predictability, and data locality.

            This is true, and it points out one of the ways Intel has made their architecture so competitive: Intel has bet very heavily on branch prediction and has done a lot of optimisation around it.

            But more recently, branch prediction has proven to be quite problematic in terms of security. Branch prediction was the root of the problem that led to the Meltdown and Spectre vulnerabilities. And the only real mitigation was to completely redesign how branch prediction was done, significantly reducing the performance gains.

            So yeah, to sum up: one of the big differences between ARM and Intel’s x86 architecture is branch prediction, except branch prediction just got nerfed big time.

    • NotSoCoolWhip@lemmy.world · 11 months ago

      CPU aside, it’s best to wait for Thunderbolt 5 to mature. Might finally be able to move to using one device for travel and an eGPU for gaming.

      • LemmyIsFantastic@lemmy.world · 11 months ago

        I mean, that’s fine. I’m just saying that x86 chips are still faster. If you want a beefy laptop, especially a work device that only needs to be slightly portable (e.g. dragging it to conference rooms and back to your desk), there is little current reason to go with ARM. I’m not saying they won’t catch up, but folks in here seem to be thinking that ARM is currently faster.

      • simple@lemm.ee · 11 months ago

        M2 Max chips are close to the high-end i9, but the M-series CPUs are mobile chips. They’re designed for laptops. If competition gets a bit harder, then no doubt desktop-focused ARM CPUs will match their performance soon.

        • AzureKevin@lemmy.world · 11 months ago

          AFAIK they’re large chips though, and larger generally means more performance but also much higher manufacturing cost.

  • arthurpizza@lemmy.world · 11 months ago

    Intel is finally innovating because of increased pressure. Don’t let Pat Gelsinger’s calm tone fool you, he knows exactly what the competition is bringing. Apple has proven what Linux users have known for a few years: the CPU architecture is not as directly tied to the software as it once was. It doesn’t matter if it’s x86, ARM, or RISC-V. As long as we have native builds (or a powerful compatibility layer) it’s going to be business as usual.

    • kalleboo@lemmy.world · 11 months ago

      the CPU architecture is not as directly tied to the software as it once was

      Yeah, it used to be that emulating anything at all would be slow as balls. These days, as long as you have a native browser you’re halfway there; then 90% of native software will emulate without the user noticing since it doesn’t need much power at all, and you just need to entice the stuff that really needs power (Photoshop etc.), half of which is already ARM-ready since it supports Macs.

      The big wrench in switching to ARM will be games. Game developers are very stubborn; see how all games stopped working on Mac when Apple dropped 32-bit support, even though no Macs have been 32-bit for a decade.

      • Powerpoint@lemmy.ca · 11 months ago

        The game support was pretty much crap even before then and a lot of the blame lies on Apple.

  • mlg@lemmy.world · 11 months ago

    Even AMD showed just how power-hungry and thermally inefficient Intel generally is.

    As Arm develops more every year, laptop OEMs will eventually switch just because of the insane power and thermal benefit.

    I hope RISC-V gets its chance to shine too

    • ichbinjasokreativ@lemmy.world · 11 months ago

      Current-gen AMD laptop CPUs rival Apple silicon in performance and power consumption on mobile. x86 is nowhere near as close to dying as people think.

      • JK_Flip_Flop@lemmy.world · 11 months ago

        Aye, exactly. Apple’s marketing, which is often basically lying, has a lot to answer for in the prevalence of this idea. They’d have you believe that they’re making chips with 14 billion percent more performance per watt and class-beating performance. Whereas in reality they’re very much going toe to toe with AMD and other high-end ARM chip vendors.

        • CobraChicken@lemmy.ca · 11 months ago

          My M2 air is silent because it has no fans. I’ve never had any trouble doing any office / photoshop / illustrator work. Battery life lasts hours and hours and hours.

          It’s not all marketing, there’s substance too

          • JK_Flip_Flop@lemmy.world · 11 months ago

            Did I ever say it was all lies? They’re incredibly capable machines; I’d love to own one. I just take issue with Apple’s lack of transparency in marketing the chips’ performance vs the competition.

        • Defaced@lemmy.world · 11 months ago

          Every vendor is guilty of doing this, not just Apple, even AMD. The fact is Apple found a way to make desktop ARM chips accessible and viable. If you’ve ever used an M1 or M2 Mac, you’ll understand how big of an impact they’ve made. My M1 Mac mini with 8GB could run several games above 60fps at 1440p at reasonable settings, examples being WoW (retail, with upgraded graphics), LoL and DotA 2, StarCraft 2, Diablo 3, etc. It was and still is a very capable chip.

      • jose1324@lemmy.world · 11 months ago

        Don’t agree with this. I have the newest 7840U, which is supposed to be THE answer to the M2. Performance per watt and battery life are way worse.

    • Never_Sm1le@lemdro.id · 11 months ago

      No laptop manufacturer would switch to ARM until a good x86 compatibility layer comes along. People would make a huge fuss if they couldn’t use their favorite apps, or if those apps didn’t run decently.

    • jabjoe@feddit.uk · 11 months ago

      My fear is losing what we have with x86 PCs: the standardization of the platform. ARM, and even more so RISC-V, is a messy sea of bespokeness. I want hardware to be auto-discoverable so a generic OS can be installed.

  • onlinepersona@programming.dev · 11 months ago

    Will Intel exist in 2026? NVIDIA and AMD are making ARM chips for 2025, China is investing heavily in RISC-V, and AMD already released an x86 CPU that rivals Apple’s M2. Who knows how things will turn out once they release an ARM chip.

    Things are shaping up to become an NVIDIA vs AMD arms race with some Chinese company becoming a dark horse and announcing a RISC-V chip in 2-3 years.

    There was a company that announced a major technological advancement in chip fabrication in the US, but I can’t remember who or what it was. My maggot brain thinks something with light-based chips or something? I dunno… that might also be something to look out for

    Edit: it was Intel: Intel Demos 8-Core, 528-Thread PIUMA Chip with 1 TB/s Silicon Photonics

    • BetaDoggo_@lemmy.world · 11 months ago

      It will take at least another 10 years to get a majority of the market off of x86 with the 20+ years of legacy software bound to it. Not to mention all of the current gen x86 CPUs that will still be usable 10 years from now.

      • PopOfAfrica@lemmy.world · 11 months ago

        Honestly, we just need some sort of compatibility layer. Direct porting isn’t completely required yet.

      • Patch@feddit.uk · 11 months ago

        You don’t really need the majority of the market to have moved before things start to get tricky for Intel. They’re a very much non-diversified company; the entire house is bet on x86. They’ve only just started dabbling in discrete GPUs, despite having made integrated GPU SOCs for years. Other than a bit of contract fabbing, almost every penny they make is from x86.

        If ARM starts to make inroads into the laptop/desktop space and RISC-V starts to take a chunk of the server market, the bottom could fall out of Intel’s business model fast.

      • onlinepersona@programming.dev · 11 months ago

        I’m not sure about that. If, for example, the EU says “for the environment, you may not use chips that draw X watts/GHz” or something, x86 might be out of the game pretty quickly. Also, becoming market leader isn’t about the old hardware, it’s about the new hardware. I bet by 2030, the majority of chipsets sold will be either ARM or RISC-V. AMD did make an ARM rival with the 7840U, but with their entry into ARM in 2025, it’s not preposterous to believe the ARM ecosystem will pick up steam.

        Also, recompiling open-source stuff for ARM is probably not going to be a huge issue. clang and gcc already support ARM as a compilation target, and unless there’s x86-specific code in the Python or Ruby interpreters, or in UI frameworks like Qt and GTK, they should compile without much issue. If proprietary code can’t or won’t keep up, the most likely outcome will be x86 emulators or the dumping of money into QEMU or stuff like Rosetta for Windows.
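        As a rough illustration of how routine this already is, cross-compiling C for 64-bit ARM with an ordinary toolchain is a one-liner (a sketch; the target triple, sysroot path, and file names are illustrative, and it assumes a cross sysroot is installed):

        ```shell
        # Build hello.c for AArch64 Linux using clang's built-in cross-compilation
        # support; a gcc cross toolchain (e.g. aarch64-linux-gnu-gcc) works the same way.
        clang --target=aarch64-linux-gnu --sysroot=/usr/aarch64-linux-gnu -o hello-arm64 hello.c

        # file(1) should report an ARM aarch64 ELF binary.
        file hello-arm64
        ```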

        Anyway, I’m talking out of my ass here as I don’t write C/C++ and don’t have to deal with cross-compilation, nor do I have any experience in hardware. It’s all just a feeling.

      • Buffalox@lemmy.world · 11 months ago

        I think it’s safe to say Apple has proved that wrong three times: when they switched from Motorola to PowerPC, then from PowerPC to Intel, and latest from Intel to ARM. If necessary, software will be quickly modified, or it will run well enough on compatibility layers.

        The switch can happen very fast for new hardware. The old systems may stay around for a while, but the previous CPU architecture can be phased out very quickly in new systems. Apple has proven that.

    • ██████████@lemmy.world · 11 months ago

      I think it’s neat how geopolitically this is all connected to the Taiwan issue: only when the mainland can make chips as good as Nvidia’s in Taiwan will they be able to economically handle an invasion.

      If they invaded today, GPU and CPU prices would explode to dumbdum levels for a few years. Bro, it would suck for the whole world.

  • some_guy@lemmy.sdf.org · 11 months ago

    Posturing. It’s already obvious that Arm is kicking ass. It may not take over, but it’s more than made a dent.

  • MystikIncarnate@lemmy.ca · 11 months ago

    A brief history lesson relating to Intel and ARM… Intel made ARM processors. They were not great. Of course, this was many, many years ago, but even compared to others from the same generation and year range, they were kind of poo.

    The product was Intel XScale. Manufacturing started in 2002 and lasted only about 3-4 years before being dropped, right before there was a big smartphone boom. The processors found their way into the smartphone’s predecessor, the PDA. Notably, I purchased one device with this type of processor right before the whole thing collapsed… a Dell Axim X51v. It ran Windows Mobile, which later turned into Microsoft’s attempt to compete with the likes of Google and Apple in the smartphone space, and it’s obvious how that worked out for them.

    Intel is saying this because they have to believe it’s true. They’ve abandoned all ARM development and seem to have no intention of picking it up again. They failed in the ARM space, creating fairly poor versions of the chips that they did produce, and they seem to have no intention of repeating that failure.

    Mark my words, Intel will likely go all in on RISC-V if anything. They’ll continue to build x86, they have way too much invested in that space and it’s one of few that they’ve actually had significant success in, but when it comes to mobile/RISC, ARM isn’t something that they will be picking up again.

    So bluntly, this is true… For Intel. They must believe it because they have given themselves no alternative.

  • banneryear1868@lemmy.world · 11 months ago

    ARM just makes sense for portable devices for obvious reasons, x86 isn’t dying though. For the average person who needs a laptop to do some professional-managerial work ARM is perfect.

    • flying_sheep@lemmy.ml · 11 months ago

      What are those reasons that you think are so obvious? I have no idea what you could be referring to 😅

      • banneryear1868@lemmy.world · 11 months ago

        ARM is more efficient, and as a “system on chip” it reduces the need for as many other components on the board (phones, for example). Unless you’re doing heavy CPU- or GPU-intensive tasks, there are a bunch of upsides and no downsides to ARM.

        • flying_sheep@lemmy.ml · 11 months ago

          That’s my impression as well. I’m confused about the “just”. There’s many non-portable devices that don’t have too heavy workloads and that I’d think would benefit from better energy efficiency.

          • banneryear1868@lemmy.world · 11 months ago

            Oh yeah, the article is about the laptop market, but of course all sorts of non-portable devices run on non-x86 platforms. I’d even say x86 is the minority unless you reduce it to just desktop workstations.

      • neeshie@lemmy.world · 11 months ago

        Arm tends to be a lot more power efficient, so you can get better battery life on portable devices.

    • Rednax@lemmy.world · 11 months ago

      There is also a sizable market for laptops that do little more than log onto a remote desktop. Especially with remote working, that has become the perfect middle ground between security, cost, and ease of use. A cheap ARM processor would work perfectly for those machines.

      • banneryear1868@lemmy.world · 11 months ago

        I’m a sysadmin and would much rather have a light arm machine to remote in from than a standard Intel laptop.

  • AutoTL;DR@lemmings.world (bot) · 11 months ago

    This is the best summary I could come up with:


    But Intel CEO Pat Gelsinger doesn’t seem worried about it yet, as he said on the company’s most recent earnings call (via Seeking Alpha).

    “Arm and Windows client alternatives, generally, they’ve been relegated to pretty insignificant roles in the PC business,” said Gelsinger.

    Ideally, Arm-based PCs promise performance on par with x86 chips from Intel and AMD, but with dramatically better power efficiency that allows for long-lasting battery life and fanless PC designs.

    Qualcomm’s latest Snapdragon chip for PCs, the 8cx Gen 3 (also called the Microsoft SQ3), appears in two consumer Windows devices.

    Even if Gelsinger is wrong, he’s trying to spin the rise of Arm PCs as a potentially positive thing, saying that Intel would be happy to manufacture these chips for its competitors.

    Right now, TSMC has an effective monopoly on cutting-edge chip manufacturing, making high-end silicon for Qualcomm, Nvidia, AMD, Apple, and (tellingly) Intel itself.


    The original article contains 521 words, the summary contains 149 words. Saved 71%. I’m a bot and I’m open source!
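The bot’s “Saved 71%” figure checks out with basic arithmetic. A quick sketch (the word counts below are just the ones the bot reported):

```python
# Verify the summary bot's claimed compression:
# a 521-word article summarized in 149 words.
article_words = 521
summary_words = 149

saved = 1 - summary_words / article_words
print(f"Saved {saved:.0%}")  # → Saved 71%
```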

  • gnuplusmatt@startrek.website
    link
    fedilink
    English
    arrow-up
    7
    ·
    11 months ago

Whatever the new architecture ends up being, at some point we will see x86 relegated to a daughterboard in the machine while we transition, or x86 will live in a datacenter and you’ll buy time on a “cloud PC” like what Microsoft will already sell you in Azure.

    • terminhell@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      2
      ·
      11 months ago

I’ve been saying for a while now that MS is likely trying to ditch the NT kernel. But I forgot about their Azure cloud desktops. I can see them rolling out a Chromebook-like environment (Linux based) that would hook into a full Azure cloud desktop instance. That way, their Surface devices (for example) could be used for basic web browsing on their own, and then you could connect to your cloud desktop for everything else.

  • JackSkellington@lemmy.world
    link
    fedilink
    English
    arrow-up
    7
    ·
    11 months ago

1 - laptops usually ship Windows out of the box
2 - Windows on ARM has some trouble due to partnerships
3 - not all apps will have parity between the older architecture and ARM

Changes are bound to happen. They probably don’t want to pay ARM’s fees. And if they don’t bring something at Apple Silicon’s level, it would be an issue for Intel: Intel is a giant producer of CPUs, while Apple is new to laptop/desktop-grade CPU design. Kinda shameful.

    • Bell@lemmy.world
      link
      fedilink
      English
      arrow-up
      12
      ·
      11 months ago

People are going to start to wonder why they have the whole Windows OS when all they do is run a browser. If someone makes a low-hassle Linux distro… that runs well on ARM… well, we could finally have some advancement in mobile computing. ChromeOS was almost it, but Google made it all-cloud and Google-only.

      • InFerNo@lemmy.ml
        link
        fedilink
        English
        arrow-up
        4
        ·
        11 months ago

If you say browser-only, then you are relegating everything to the cloud, no? Isn’t Google then doing things right?

  • mothattack@lemmy.ml
    link
    fedilink
    English
    arrow-up
    7
    arrow-down
    1
    ·
    11 months ago

Non-x86 has been tried at least twice before on Windows and failed. While this is certainly the best attempt yet, there is no guarantee of success. It sure would be nice, however, to get more competition.

  • ik5pvx@lemmy.world
    link
    fedilink
    English
    arrow-up
    2
    ·
    11 months ago

Reminds me of “In the world, there is space for 5 computers,” or something along those lines.

  • j4k3@lemmy.world
    link
    fedilink
    English
    arrow-up
    5
    arrow-down
    19
    ·
    11 months ago

ARM is dead. Anecdotally, Apple has the longest history of any company hitching itself to dead architectures (6502, 68k, PowerPC, etc.). The only architecture Apple hitched to that didn’t totally die is x86, and x86 will soon die to RISC-V. Why would anyone pay royalties to be controlled by ARM when an open alternative exists? RISC-V is the new future that all the old guard are trying their best to delay as long as possible. ARM was sold by its original owners the second RISC-V overcame its major legal hurdles. The new owners are trying to pump it as much as possible to minimize their losses on the public stock exchange. Anyone with an ounce of sense can look at the timeline of RISC-V and the sale of ARM to see the real picture without fanboi nonsense.

    • havokdj@lemmy.world
      link
      fedilink
      English
      arrow-up
      11
      ·
      11 months ago

      RISC-V is still not going to take over x86 for quite a while. As much as I’d love for it to, it’s still going to take some work.

Give it about 8-10 years and I think that’s when x86 is going to be out the window, relegated solely to enthusiasts.

      • j4k3@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        ·
        11 months ago

I agree it will take a while to completely take over even the low-end market, but there is already a data center running on RISC-V that was in the news cycle a month or two back. Intel has been putting a lot of money into it too, because they know the change is coming.

We are on the edge of a major shift needed for AI anyways; I think that will be the death knell for x86. The memory and cache bus structures need to change to accommodate tensor math much more efficiently. Why restructure the dying x86 so substantially when the work could be done in RISC-V, making most existing hardware obsolete at the same time to finance the bleeding-edge shift? I think the big players will still be on top, except ARM will fade into irrelevance like MIPS.

Proprietary, planned-obsolescence exploitation in the digital landscape is a major problem that needs to go away. All the relevant companies have access to reverse-engineered hardware from their competitors; proprietary designs only exist to exploit end users. RISC-V is a small step in the right direction of restoring the right to fundamental ownership.

      • j4k3@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        3
        ·
        11 months ago

This is not how ISAs, fab nodes, or hardware design work at all. ARM is not special. It was just a company that made it easier to put together a bunch of processor blocks and peripherals for a fee and a royalty, a convenience for the trailing edge. Everything ARM can do, RISC-V can do as far as the ISA goes. No one is going to pay a royalty when the same thing is free; this is the realm of big money, where the choice is obvious.

Not to mention, we are on the final node already when it comes to scaling: the progress of the last 40 years has stopped. There is potential in new technologies like computing with light, but silicon lithography will never drop below 3-5 nanometers, because that is the limit physics allows before quantum tunneling effects take over. We will eventually move past the stone age of computing with silicon. Organic technology is the holy grail, but until a major shift is made, we are at the end of silicon progress despite what all the marketing fools hype and moan about. ARM has nowhere to go. The people who created it bailed ages ago because the writing was on the wall even back then.

        • SimplePhysics@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          2
          ·
          11 months ago

What does RISC-V do that ARM does not for staying relevant post-silicon? (Also, chill bro, organic and light computing are still in their infancy and we won’t be there for a while.)