  • To clarify: I have a 100W Ugreen Nexode 4 Port USB Charger that I use to charge my laptop (~60W), Steam Deck (~40W), iPhone (~20W) and AirPods (~5?W).

    The problem is that if the original product cable has gone walkabout temporarily and I need to use a random one as a stand-in, there is no clear way of telling whether I’m accidentally using a cheap 5W-max cable to try to keep my laptop charged while working.

    Obviously there are some context clues like cable thickness etc., but with cosmetic braiding becoming so common, even that’s getting harder to rely on.


  • Yes, you can. The charger and the device communicate what power levels they each support, and pick the highest one they both agree on.

    E.g. my laptop charger can charge my MacBook at full speed (100W), but my iPhone at only 20W.

    That bit is pretty straightforward and transparent to end users (there are a few rare conditions where devices might not agree on the fastest, and have to fall back to a slower one); the issue is more with cables not having sufficient gauge wire, or missing connections that prevent the charger and device from communicating their full functionality.
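
    As a rough sketch of that hand-shake (the profile list, the 3 A default and the helper name here are purely illustrative, not the actual USB-PD protocol):

    ```python
    # Toy model: the charger advertises power profiles, the device picks the
    # best one it can use, and the cable caps the available current.
    # Numbers are illustrative, not taken from the USB-PD spec.

    CHARGER_PROFILES = [(5, 3), (9, 3), (15, 3), (20, 5)]  # (volts, amps) the charger offers

    def negotiate(device_max_watts, cable_max_amps=3):
        """Return the highest (volts, amps) pair the device, charger and cable can all handle."""
        usable = []
        for volts, amps in CHARGER_PROFILES:
            amps = min(amps, cable_max_amps)          # a non-e-marked cable limits current
            if volts * amps <= device_max_watts:      # device won't take more than it can use
                usable.append((volts, amps))
        return max(usable, key=lambda p: p[0] * p[1]) if usable else None

    print(negotiate(100, cable_max_amps=5))  # (20, 5) -> ~100 W laptop on a good cable
    print(negotiate(100))                    # (20, 3) -> capped at ~60 W on a plain 3 A cable
    print(negotiate(20))                     # (5, 3)  -> ~15 W for a small device in this toy model
    ```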


  • It’s been more of a pain in the arse than initially expected.

    Most motherboards (for example) only have 2-4 USB-C ports, meaning that I still need to employ A-C and C-C cables for peripherals etc.

    My main gripe is that the standard just tries to do too many things without clear delineation/markings:

    1. Is it a USB 2.0 (480Mbit), 5Gbit, 10Gbit or 20Gbit cable? Can’t really tell from the plug alone.

    2. More importantly, for charging devices: how the heck do I determine the maximum wattage I can run?

    For all its faults, at least the blue colour of a USB 3.0 plug (or the additional connectors for B/Micro) made it easy to differentiate!

    Now I’m eyeing up a USB cable tester just to validate and catalogue my growing collection! 🤦🏻‍♂️
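
    As a rough cheat-sheet of what that tester would be cataloguing (a sketch only; figures are approximate and from memory, not quoted from the spec):

    ```python
    # Approximate USB-C cable classes: data rate plus current limits
    # without / with an e-marker chip. Treat the figures as indicative.
    CABLE_CLASSES = {
        "USB 2.0":         ("480 Mbit/s", "3 A (60 W), or 5 A (100 W+) if e-marked"),
        "USB 3.2 Gen 1":   ("5 Gbit/s",   "3 A, or 5 A if e-marked"),
        "USB 3.2 Gen 2":   ("10 Gbit/s",  "3 A, or 5 A if e-marked"),
        "USB 3.2 Gen 2x2": ("20 Gbit/s",  "3 A, or 5 A if e-marked"),
    }

    for label, (speed, power) in CABLE_CLASSES.items():
        print(f"{label:<16} {speed:<12} {power}")
    ```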







  • Voting is simply putting a word in for a candidate to fill a subordinate role, as you would with your company’s HR department.

    You won’t always get the one you pick, so it’s your responsibility to put forward the best candidate (primaries), vouch for them (voting), and keep them accountable to their KPIs (individual lobbying).

    ETA: Remember, you’re basically looking to hire someone to look after you, your family, your friends and your community for the next n years. It also takes a lot longer to fix something than it did to break it in the first place, and fixed things still wear the scars of the past.



  • I don’t think it was any one thing, but more so a build-up over time - a death of a thousand cuts, if you will:

    It was a cultural moment generally - just think back to all of those celebrity commercials (“I’m Mr. T and I’m a Night Elf Mohawk”). All cultural moments pass eventually.

    The third expansion (Cataclysm) was quite weak to begin with, coupled with a lack of content at the tail-end of the second (Wrath of the Lich King), which itself was incredible - it narratively wrapped up the story that began all the way back in Warcraft 3.

    So a lot of people chose that point to bow out of the game; it required a fair bit of time dedication, and given the narrative pay-off it seemed like an appropriate time to do so.

    Lastly, the introduction of a number of game tools to automate the group composition process meant that the impact of player reputation on servers was severely diminished. Before then, players who were toxic (stealing items, intentionally killing the group, failing quests) became infamous on a server.

    Once this tool was further opened up to allow groups to form across multiple servers, the sense of community was shattered, as you had no way of knowing whether a person from another server was good or bad. It stopped being about bringing in an individual player, and became about just getting a body in to fill a role.