L4sBot@lemmy.world to Technology@lemmy.world · English · 1 year ago
2023 was the year that GPUs stood still (arstechnica.com)
2023 was the year that GPUs stood still: A new GPU generation did very little to change the speed you get for your money.
Verdant Banana@lemmy.world · 1 year ago
Intel GPUs definitely won out for what you get for the money.
that guy@lemmy.world · 1 year ago
That’s not a sentence I’m used to seeing.
CalcProgrammer1@lemmy.ml · 1 year ago
I’ve been very happy with my Arc A770; it works great on Linux and performs well for what I paid for it.
barsoap@lemm.ee · 1 year ago
Have you tried ML workloads? Put differently: how is compatibility with software that expects CUDA/ROCm? The A770 is certainly the cheapest way to get 16 GB nowadays.
CalcProgrammer1@lemmy.ml · 1 year ago
No, I don’t use any ML stuff, or really anything that uses GPU compute at all. I just use it for gaming and other 3D applications.
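For anyone weighing the A770 against the CUDA/ROCm question above: recent PyTorch releases ship a native `torch.xpu` backend for Intel GPUs, so much CUDA-targeted code can be ported by swapping the device string. A minimal sketch of backend detection, assuming a PyTorch build recent enough to expose `torch.xpu` (the `pick_device` helper name is illustrative, not a library API):

```python
def pick_device() -> str:
    """Return the best available compute backend as a device string.

    Falls back gracefully when PyTorch is not installed at all.
    """
    try:
        import torch
    except ImportError:
        return "cpu"  # no PyTorch: CPU is the only option
    if torch.cuda.is_available():
        return "cuda"  # NVIDIA (or ROCm builds, which reuse the cuda namespace)
    # torch.xpu only exists on builds with Intel GPU support compiled in
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return "xpu"   # Intel Arc / Data Center GPU
    return "cpu"

print(pick_device())
```

Code written as `tensor.to(pick_device())` then runs unchanged on NVIDIA, Intel, or CPU; libraries that hard-code `"cuda"` are the ones that need patching.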