The browser is your thin client.
Mine’s getting pretty thicc
Not sure why companies try to push mobile games like that so fucking hard. Just because everyone has a phone doesn’t mean everyone wants to play games on them.
Mobile games make more revenue than PC and console gaming combined. Of course companies are gonna try to get a bigger and bigger piece of that pie.
Once TSMC Arizona is up and running (probably 2027 or so), that’s going to be a supply chain that goes through entirely friendly countries, not at significant geopolitical risk.
Ok, thanks for talking me through this. I have completely unused HDMI ports on my laptops and I’m genuinely curious about those who are using theirs.
display - USB-C at work, HDMI (through USB-C dock) at home
Obviously you can’t use an HDMI port that you don’t have, but I gotta ask: if you had one of the newer MBPs with built-in HDMI, would you be using that HDMI port? Because it sounds like you wouldn’t, and that you’d still rely on the USB-C dock to do everything.
And that’s been my position this whole thread. I think that the MBP’s return of the HDMI port was greeted with lots of fanfare, but I don’t actually know anyone who switched back to HDMI.
Yeah, I’m not going to throw out perfectly good hardware just to unify cables somewhat.
I was referring to the replacement of HDMI 2.0 stuff with 2.1 stuff - not seeing an advantage to choosing HDMI 2.1 over Thunderbolt. And then there’s the support hell of intermingled HDMI 2.0 and 2.1 stuff, including cables and ports and dongles and adapters.
Either way, I’m still stuck on the idea of direct HDMI use as being so ubiquitous that it warrants being built into a non-gaming laptop that already has Thunderbolt and DP (and USB-PD) support through the preexisting USB-C ports.
Thunderbolt only works for workstations if the monitor supports it
Even if driving multiple monitors over HDMI or DVI or DP or VGA or whatever, the dock that actually connects directly to the laptop is best served with Thunderbolt over USB-C, since we’d expect the monitors and docking station (and power cords and an external keyboard/mouse and maybe even ethernet) to all remain stationary. That particular link in the chain is better served as a single Thunderbolt connection, rather than hooking up multiple cables representing display signal data, other signal data, and power. And this tech is older than HDMI 2.1!
So I’m not seeing that type of HDMI use as a significant percentage of users, enough to justify including it on literally every 14" or 16" MacBook Pro with their integrated GPUs. At least not in workplaces.
You use HDMI for all those use cases? Seems like Thunderbolt is a much better dock for workstations, and DisplayPort is generally better for computer monitors and the resolution/refresh rates useful for that kind of work. The broad support of cables and HDMI displays is for HDMI 2.0, which caps at 4k60. By the time HDMI 2.1 hit the market, Thunderbolt and DisplayPort Alt mode had been out for a few years, so it would’ve made more sense to just upgrade to Thunderbolt rather than getting an all new HDMI lineup.
To a second screen, sure. But I’m saying that DisplayPort and Thunderbolt are so much better, and are generally supported by more computer monitors (though probably fewer TVs). I’d be surprised if a lot of people are using HDMI in particular.
Now, I don’t know if it’s in USB-C cables
It’s not. Apple specifically follows the USB-PD standard, and went a long way in getting all the other competing standards (Qualcomm’s Quick Charge, Samsung Adaptive Fast Charge) to become compatible with USB-PD. Now, pretty much every USB-C to USB-C cable supports USB-PD.
Also a shout-out to Google engineer Benson Leung, who went on a spree of testing cables and wall adapters for standards compliance after a charger set his tablet on fire. The work he did between 2016 and 2018 went a long way toward getting bad cables taken off the market.
Are people connecting their laptops to TVs frequently enough that this should be built into every single unit shipped? I can’t imagine the percentage of users who actually use their HDMI ports is very high.
HDMI is a dogshit standard and everyone should’ve moved over to DisplayPort or Thunderbolt over the USB-C form factor.
Do you mean Lisa Frank, the artist for colorful animals on school supplies, and not Anne Frank, the famous diarist who was killed by the Nazis during the Holocaust?
The problem is that there are too many separate dimensions to define the tiers.
In terms of data signaling speed and latency, you have the basic generations of USB 1.x, 2.0, 3.x, and 4, with Thunderbolt 3 essentially being the same thing as USB4, and Thunderbolt 4 adding on some more minimum requirements.
On top of that, you have USB-PD, which is its own standard for power delivery, including how the devices conduct handshakes over a certified cable.
And then you have the standards for not just raw data speed, but also what other modes are supported, for information to be seamlessly tunneled through the cable and connection in a mode that carries signals other than the data signal spec for USB. Most famously, there’s the DisplayPort Alt Mode for driving display data over a USB-C connection with a DP-compatible monitor. But there’s also an analog audio mode so that the cable and port passes along analog data to or from microphones or speakers.
Each type of cable, too, carries different physical requirements, which also limits how long the cable can be and still work properly. That’s why a lot of the cables that support the latest and greatest data and power standards tend to be short. A longer cable might be useful, but may sacrifice support for certain functions. I personally have a long cable that supports USB-PD but can’t carry Thunderbolt data speeds or certain types of signals, but I like it because it’s good for plugging in a charger when I’m not that close to an outlet. But I also know it’s not a good cable for connecting my external SSD, which would be bottlenecked at USB 2.0 speeds.
So the tiers themselves aren’t going to be well defined.
Everything defined in the Thunderbolt 3 spec was incorporated into the USB4 spec, so Thunderbolt 3 and USB4 should be basically identical. In reality the two standards are enforced by different certification bodies, so some hardware manufacturers can’t really market their compliance with one or the other standard until they get that certification. Framework’s laptops dealt with that for a while, where they represented that their ports supported certain specs that were basically identical to the USB4 spec or even the Thunderbolt 4 spec, but couldn’t say so until after units had already been shipping.
They chiplet past 500
I don’t know if I’m using the right vocabulary, maybe “die size” is the wrong way to describe it. But the Ultra line packages two Max SoCs with a high performance interconnect, so that the whole package does use about 1000 mm^2 of silicon.
My broader point is that much of Apple’s performance comes from their willingness to actually use a lot of silicon area to achieve that performance, and it’s very expensive to do so.
Apple does two things that are very expensive: they spend a lot of silicon area per chip to hit their performance targets, and they buy up early capacity on TSMC’s leading-edge nodes. Those are business decisions that others simply can’t afford to follow.
Well, specifically, they’re promising battery life that beats Qualcomm’s implementation of an ARM laptop SoC.
Qualcomm is significantly behind Apple. I’m not convinced that the ISA matters all that much for battery life. AMD’s x86_64 performance per watt blew Intel’s out of the water in recent generations, and Qualcomm/Samsung’s ARM chips can’t compete with Apple’s ARM chips in the mobile, tablet, or laptop space.
To be honest, no. I mainly know about JPEG XL only because I’m acutely aware of the limitations of standard JPEG for both photography and high resolution scanned documents, where noise and real world messiness cause all sorts of problems. Something like QOI seems ideal for synthetic images, which I don’t work with a lot, and wouldn’t know the limitations of PNG as well.
You say that it’s sorted from most significant to least significant, so for a date, is it more significant whether it happened in 1024, 2024, or 9024?
Most significant to least significant digit has a strict mathematical definition, one that you don’t seem to be following, and it applies to all numbers, not just numerical representations of dates.
And most importantly, the YYYY-MM-DD format is extensible into hh:mm:ss too, within the same schema, out to the level of precision appropriate for the context. I can identify a specific year when the month doesn’t matter, a specific month when the day doesn’t matter, a specific day when the hour doesn’t matter, and on down to minutes, seconds, and decimal portions of seconds to whatever precision I’d like.
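The nice consequence is that you don’t need date-aware code at all: because the most significant field comes first, plain string sorting puts ISO-style timestamps in chronological order, even at mixed precision. A quick sketch (the timestamps here are made up for illustration):

```python
# ISO 8601-style timestamps sort chronologically as plain strings, because
# the most significant field (year) comes first. Mixed precision still
# works: a less precise stamp is a prefix of a more precise one, and
# shorter strings sort before their extensions.
stamps = [
    "2024-07-15T09:30:00",
    "1024-01-01",
    "2024-07",
    "9024-12-31T23:59:59.999",
]
print(sorted(stamps))
# ['1024-01-01', '2024-07', '2024-07-15T09:30:00', '9024-12-31T23:59:59.999']
```

Compare that with MM/DD/YYYY, where string sorting scrambles the years entirely.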
I wonder if someone could set up some form of tunneling through much more mundane traffic, perhaps even entirely over a legitimate encrypted service through a regular browser interface (like the web clients for Discord, Slack, MS Teams, FB Messenger, Zoom, or Google Chat/Meet). You’d just literally chat with a bot you’ve set up, instruct it to do things on its end, and have it forward the results back through that service’s file sending. From the outside it would look like ordinary encrypted chat with a popular service over an HTTPS connection.
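A minimal sketch of the bot side of that idea. The chat service is simulated here with plain Python lists; a real version would speak Discord/Slack/etc. over their actual APIs, and the whitelist and function names below are my own invention, not any real bot framework:

```python
# Sketch: a "bot" that treats incoming chat messages as commands and
# replies with their output. The chat transport is faked with lists;
# only the command-handling logic is real.
import shlex
import subprocess

def handle_chat_message(text: str) -> str:
    """Run a whitelisted command received over 'chat' and return its output."""
    allowed = {"echo", "date", "uname"}  # never execute arbitrary input
    parts = shlex.split(text)
    if not parts or parts[0] not in allowed:
        return "command not allowed"
    result = subprocess.run(parts, capture_output=True, text=True, timeout=10)
    return result.stdout.strip()

# Simulated transcript: the operator "sends" commands, the bot replies.
inbox = ["echo tunneled hello", "rm -rf /"]
outbox = [handle_chat_message(msg) for msg in inbox]
print(outbox)  # ['tunneled hello', 'command not allowed']
```

The interesting property is exactly what the comment describes: every byte on the wire is legitimate TLS to a well-known chat service, so there’s nothing protocol-level to distinguish it from a person typing.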