  • I keep seeing this claim, but never with any independent verification or technical explanation.

    What exactly is listening to you? How? When?

    Android and iOS both make it visible to the user when an app accesses the microphone, and they require that the user grant microphone permission to the app. It’s not supposed to be possible for apps to surreptitiously record you. This would require exploiting an unpatched security vulnerability and would surely violate the App Store and Play Store policies.

    If you can prove this is happening, then please do so. Both Apple and Google have a vested interest in stopping this; they do not want their competitors to have this data, and they would be happy to smack down a clear violation of policy.



  • Most of Apple’s history, actually.

    Macs have a reputation for being expensive because people compare the cheapest Mac to the cheapest PC, or to a custom-built PC. That’s reasonable if the cheapest PC meets your needs or if you’re into building your own PC, but if you compare a similarly-equipped name-brand PC, the numbers shift a LOT.

    From the G3-G5 era ('97-2006) through most of the Intel era (2006-2020), if you went to Dell or HP and configured a machine to match Apple’s specs as closely as possible, you’d find the Macs were almost never much more expensive, and often cheaper. I say this as someone who routinely did such comparisons as part of their job. There were some notable exceptions, like most of the Intel MacBook Air models (they ranged from “okay” to “so bad it feels like a personal insult”), but that was never the rule. Even in the early-mid 90s, while Apple’s own hardware was grossly overpriced, you could buy Mac clones for much less (clones were Macs made by licensed third parties, and they were far and away the best value in the pre-G3 PowerPC era).

    Macs also historically have a lower total cost of ownership, factoring in lifespan (cheap PCs fail frequently), support costs, etc. One of the most recent and extensive analyses of this I know of comes from IBM. See https://www.computerworld.com/article/1666267/ibm-mac-users-are-happier-and-more-productive.html

    Toward the tail end of the Intel era, let’s say around 2016-2020, Apple put out some real garbage, e.g. the butterfly keyboards and the aforementioned craptastic Airs. But historically those are the exceptions, not the rule.

    As for the “does more”, well, that’s debatable. Considering this is using Apple’s 90s logo, I think it’s pretty fair. Compare System 7 (released in '91) to Windows 3.1 (released in '92), and there is no contest. Windows was shit. This was generally true up until the 2000s, when the first few versions of OS X were half-baked and Apple was only just exiting its “beleaguered” period, and the mainstream press kept ringing the death knell. Windows lagged behind its competition by at least a few years up until Microsoft successfully killed or sufficiently hampered all that competition. I don’t think you can make an honest argument in favor of Windows compared to any of its contemporaries in the 90s (e.g. Macintosh, OS/2, BeOS) that doesn’t boil down to “we’re used to it” or “we’re locked in”.







  • LawnChair is the best option I’ve tried that has a similar design to Nova.

    I’ve tried at least a couple dozen launchers since Nova got bought. Most of them are either half-baked or have a very different design (e.g. based on radial menus or text-only lists). If you’re into minimalism, there are a lot of good options. If you want a full-featured icon grid that behaves more or less like Nova, LawnChair is it.

    I’m running LawnChair 14 Beta now. You can get it from GitHub. Last I checked, the version on Google Play was very old.



  • Ah, that makes sense! I probably should have split my /home off to its own subvolume. I’ll add that to my list of things to think about next time I hop distros or rebuild (which I’m considering once again, because I have Plasma envy).
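
    For reference, a rough sketch of what that migration might look like after the fact (device name and subvolume layout here are hypothetical; on a fresh install you’d just create the subvolume up front):

    # Mount the btrfs top level (subvolid=5) somewhere temporary
    sudo mount -o subvolid=5 /dev/nvme0n1p2 /mnt
    # Create a dedicated subvolume and copy the existing home data into it
    sudo btrfs subvolume create /mnt/@home
    sudo rsync -aHAX /home/ /mnt/@home/
    # Then mount it at /home via fstab, something like:
    # UUID=<fs-uuid>  /home  btrfs  subvol=@home  0  0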

    And yes, snapshots should NOT be treated as backups. A real backup saves your butt if your drive dies, while a snapshot goes down with that ship. I should really set up a better backup system, but for now I just periodically use Borg to back up to an external HD, and then copy that into an encrypted cloud drive.
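
    For anyone curious, the Borg side of that is only a few commands; a minimal sketch (repo path and retention numbers are placeholders, not recommendations):

    # One-time repo setup on the external drive
    borg init --encryption=repokey /mnt/external/borg-repo
    # Periodic backup run
    borg create --stats --compression zstd /mnt/external/borg-repo::home-{now} ~
    # Keep the repo from growing forever
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/external/borg-repo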


  • I love btrfs+snapper. I have automatic snapshots taken before and after every apt install, so if anything ever goes belly-up, no problem, I just roll back.
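
    The rollback itself is just a couple of commands, roughly like this (snapshot numbers are placeholders):

    # Find the pre/post snapshot pair around the bad install
    sudo snapper -c root list
    # Revert the file changes made between that pair
    sudo snapper -c root undochange 41..42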

    It’s a little weird sometimes when I’m running out of disk space, so I delete some big downloads and…I get no disk space back, because those files still exist in old snapshots! I suspect there’s some way to fine-tune it to ignore certain directories (like ~/Downloads) in snapshots, but I haven’t taken the time to dig into it. Anyway, it’s not a huge problem because the automatic snapshots are limited to a certain number, so they eventually get bumped out (or I can delete them manually if needed).
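
    My understanding is that the usual trick is to make ~/Downloads its own subvolume, since btrfs snapshots don’t descend into nested subvolumes. Untested, but presumably as simple as:

    # A nested subvolume gets skipped by snapshots of its parent
    mv ~/Downloads ~/Downloads.old
    btrfs subvolume create ~/Downloads
    mv ~/Downloads.old/* ~/Downloads/ && rmdir ~/Downloads.old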

    I haven’t tried bcachefs yet. Perhaps on my next build.



  • Looking over the Fastfox.js config, most settings seem to fall into one of three categories:

    1. Subjective appearance of speed or responsiveness (perhaps at the expense of objectively measurable load times)
    2. Experimental options that don’t apply to all hardware or OSes (e.g. GPU acceleration)
    3. Settings that optimize performance at the expense of memory, CPU, or network usage (e.g. cache sizes and connection limits)

    I don’t see anything that makes me think Mozilla’s defaults are unreasonable. It’s not like Mozilla is leaving performance on the table; rather, they chose a different compromise here and there and use highly compatible defaults. That said, it does seem like there is room for individual users to improve on the defaults, particularly if they have fast internet connections and lots of RAM.

    For example:

    // [NOTE] Lowering the interval will increase responsiveness
    // but also increase the total load time.
    user_pref("content.notify.interval", 100000); // (.10s); default=120000 (.12s)

    This seems very much like a judgment call, and I’d guess Firefox’s default does produce better objective load times and benchmark scores, while the lower interval favors perceived responsiveness. Neither choice is objectively better, but the default seems reasonable, at least.

    // PREF: GPU-accelerated Canvas2D
    // Use gpu-canvas instead of skia-canvas.
    // [WARNING] May cause issues on some Windows machines using integrated GPUs [2] [3]

    // [NOTE] Higher values will use more memory.

    Again, the defaults seem to make sense. Perhaps Mozilla could add an optimization wizard to detect appropriate settings for your hardware, and let the user select options like “maximize speed” vs “maximize memory efficiency”. These are not one-size-fits-all settings.
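
    For instance, someone with RAM to spare could plausibly bump the memory-cache prefs; the values below are just illustrations, not recommendations:

    // Illustrative only: trade memory for speed on a RAM-rich machine
    user_pref("browser.cache.memory.capacity", 1048576); // in KiB; default=-1 (auto-sized)
    user_pref("browser.sessionhistory.max_total_viewers", 8); // cached pages for back/forward; default=-1 (auto)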

    Fastfox also disables a lot of prefetching options, which…seems counter to the goal of improving speed. Not really sure what to make of that.
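
    If I had to guess, those would be prefs along these lines (my guesses, not copied from Fastfox; the defaults noted are Firefox’s stock values as far as I know):

    user_pref("network.prefetch-next", false); // <link rel="prefetch">; default=true
    user_pref("network.dns.disablePrefetch", true); // speculative DNS lookups; default=false
    user_pref("network.predictor.enabled", false); // speculative connections; default=true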



  • There are only a few species of mosquitoes that pose a threat to humans (and several thousand that don’t). If we had a way to effectively eradicate those few species, then it probably wouldn’t have major consequences. They don’t fill an important, unique niche in their ecosystems like, say, bees.

    But we don’t have a way to do that. Not without huge collateral damage from poisons and the like. There’s been some promising work with genetic engineering, releasing mosquitoes that mate and produce non-viable offspring. This can greatly reduce a local population in the short term, but populations bounce back.