Full uBlock is a mixed bag on mobile because it eats battery/performance, and (if you add the same filter sources) integrated blockers like Orion's are just about the same anyway.
Oh heck yeah. I’ve been using it on iOS a ton, and dying for this on Windows/Linux.
Fun trivia: what browser supports HEIF, JPEG XL, AVIF, and AV1, all with correctly rendered HDR?
Not Chrome. And not Firefox, nor anything based on them I’ve tried: https://caniuse.com/?search=image+format


I like Windows 11. But only as a thoroughly neutered, disposable “secondary” OS to dual boot with Linux, to the extent that I could wipe my Windows partition without a care.
If I had to use Windows 11 as my only OS, I’d pull my hair out. Same with desktop Linux TBH. There’s stuff that’s just painful in both ecosystems.


Apple’s media support is incredible.
I have one platform where HDR photo/video playback and editing, JPEG XL, HEIFs from my camera and such all just work. And it's definitely not my KDE desktop, nor Windows 11.


Yeah, probably. I actually have no idea what they charge, so I’d have to ask.
It'd be worth it for a 3090 though, no question.


This doesn’t make any sense, especially the 2x 3090 example. I’ve run my 3090 at PCIe 3.0 over a riser, and there’s only one niche app where it ever made any difference. I’ve seen plenty of benches show PCIe 4.0 is just fine for a 5090:
https://gamersnexus.net/gpus/nvidia-rtx-5090-pcie-50-vs-40-vs-30-x16-scaling-benchmarks
1x 5090 uses the same net bandwidth, and half the PCIe lanes, as 2x 3090.
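Rough numbers, if it helps. Using the commonly cited ballpark per-direction throughput of an x16 link per PCIe generation (approximate, after encoding overhead):

```python
# Approximate usable GB/s for an x16 link, by PCIe generation.
# Ballpark figures, not exact measured throughput.
GBPS_PER_X16 = {3.0: 16, 4.0: 32, 5.0: 64}

two_3090s = 2 * GBPS_PER_X16[4.0]  # 2x 3090, each PCIe 4.0 x16 -> 32 lanes total
one_5090 = 1 * GBPS_PER_X16[5.0]   # 1x 5090, PCIe 5.0 x16 -> 16 lanes total

print(two_3090s, one_5090)  # 64 64: same net bandwidth, half the lanes
```

So the single 5090 gets the same aggregate host bandwidth out of half as many lanes.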
Storage is, to my knowledge, always on a separate bus from graphics, so that also doesn't make any sense.
My literally ancient TX750 still worked fine with my 3090, though it was moved. I’m just going to throttle any GPU that uses more than 420W anyway, as that’s ridiculous and past the point of diminishing returns.
And if you are buying a 5090… a newer CPU platform is like a drop in the bucket.
I hate to be critical, and there are potential issues, like severe CPU bottlenecking or even instruction support. But… I don’t really follow where you’re going with the other stuff.


That's a huge generalization, and it depends what you use your system for. Some people might be on an old Threadripper workstation that works fine, for instance, and can just slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.
I upgraded RAM and storage just before the RAMpocalypse, and that's not possible on many laptops. And I can stuff a whole bunch of SSDs into the case and use them all at once.
I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.
…That being said, there’s a lot of trends going against people, especially for gaming:
There’s “initial build FOMO” where buyers max out their platform at the start, even if that’s financially unwise and they miss out on sales/deals.
We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest lived.
Time gaps between generations are growing as silicon gets more expensive to design.
…Buyers are collectively stupid and bandwagon. See: the crazy low end Nvidia GPU sales when they have every reason to buy AMD/Intel/used Nvidia instead. So they are rewarding bad behavior from companies.
Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.
You can still keep your PSU, case, CPU cooler, storage and such. It's a drop in the bucket cost-wise, but it's not nothing.
IMO things would be a lot better if GPUs were socketable, with LPCAMM on a motherboard.


Yeah, that’d be great. Peltiers would be awesome and everywhere if they were dirt cheap.


Awesome, thanks for the info and source.
Yeah, most of my frustration came from JXL/AVIF/HEIF and how Linux/Windows browsers, KDE, and Windows 11 don't seem to support them well. Not a fan of packing HDR into 8 bits with WebP/JPG, especially with their artifacts, though I haven't messed with PNG yet.


Also, we haven’t even got HDR figured out.
I’m still struggling to export some of my older RAWs to HDR. Heck, Lemmy doesn’t support JPEG XL, AVIF, TIFF, HEIF, nothing, so I couldn’t even post them here anyway. And even then, they’d probably only render right in Safari.


8K is theoretically good as “spare resolution,” for instance running variable resolution in games and scaling everything to it, displaying photos with less scaling for better sharpness, clearer text rendering, less flickering, stuff like that.
It’s not worth paying for. Mostly. But maybe some day it will be cheap enough to just “include” with little extra cost, kinda like how 4K TVs or 1440p monitors are cheap now.
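One concrete reason the "spare resolution" idea works: 8K (7680x4320) divides evenly by the most common render resolutions, so content can be integer-scaled with no blur. A quick check:

```python
# 8K integer-scales cleanly from common resolutions:
EIGHT_K = (7680, 4320)
for w, h in [(3840, 2160), (2560, 1440), (1920, 1080)]:
    exact = EIGHT_K[0] % w == 0 and EIGHT_K[1] % h == 0
    print(f"{w}x{h}: {EIGHT_K[0] // w}x scale, exact={exact}")
# 3840x2160: 2x scale, 2560x1440: 3x scale, 1920x1080: 4x scale -- all exact
```

That covers 4K, 1440p, and 1080p with pixel-perfect scaling, which 4K panels can only do for 1080p.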
I hear this all the time.
Yet when I bring up features that don't work at all on X because it's ancient, it's "no, that's superfluous. No one needs that."
I used to do this, but literally just switched to discrete Nvidia yesterday.
Zero issues so far. TBH it actually fixed issues I had with HDR and video decoding on my AMD IGP.


Depends how much you have to pay attention.
First off, I am not a fitness expert. YMMV.
But sometimes I do variations of bodyweight exercises in front of a TV, yes.
One day, for example, might be arm day. I sit and do leg curls for biceps, do straight pushups or tricep dips, and use a pull-up bar if I have one; even just hanging is great.
Another day might be push-up variation day: wide, narrow, inclined different ways, push up and "reach to the sky with one arm," knee pushups at the end.
Yet another is leg day. Squats, jumping squats, lunges, butt kicks, heel lifts, other positions to get different muscles. Another day may be core, another day is more shoulder/back, and so on. And all this is without weights, or with at most like a dumbbell or a pull up bar, and some kind of chair or bed for certain positions.
Your eyes will drift away from the TV, and you get exhausted doing this stuff, but you can keep up with a show if you want.


Yeah I was being casual, and I’m not an expert by any means.
I bring it up because, for me, sets of specific bodyweight exercises (like legs one day, shoulders/back another, and so on) are just more time efficient. They give enough resistance to get sore, and get me exhausted, all in one session, instead of running separately. It's easier on my knees, with no risk of shin splints and less risk of injury than heavy weights.


Come on, you know what I mean. It's an indicator that you're exerting yourself. Your blood vessels dilate when you're hot to try to dump the heat, just like they constrict in places when cold to conserve it.


Sweat is not a bad thing. It means your heart is pumping, which is what you want for weight loss.
That being said, I love exercising in cold weather, if you’re somewhere where you get any. Warm up a little inside, go out, and it just feels fantastic.
And that doesn’t just mean running a marathon. It can be calisthenics in a back yard, or garage, or even just walking out to a spot where you can jog.
While I’m here, let me glaze bodyweight exercises, like push ups, squats, kicks, core stuff, and all the variants. Do them in sets, one “group” a day.
It’s amazingly efficient. It gets you out of breath like running, but gets muscles sore like a weight machine, all in less time. And it’s waaay less stressful on your body than running or big weights.


Yeah, I’m not against the idea philosophically. Especially for security. I love the idea of containerized isolation.
But in reality, I can see exactly how much disk space and RAM and CPU and bandwidth they take, heh. Maintainers just can’t help themselves.


Always has been.
I think they meant background transcoding while using the browser.
I don’t even want to speculate on what’s going wrong there, heh. But I can definitely see that being a quirk.