TL;DR: The USB Implementers Forum is ridiculously bad at naming, symbols and communication in general. (And they don’t seriously enforce any of this anyway, so don’t even bother learning it.)
This is the correct answer; after the whole USB 3.2 Gen 2 2x2 (hands of blue) bullshit, I wouldn’t trust that team to name a park bench in the middle of the desert, let alone something important and universally used.
It basically gets longer every few years. At this rate, it’ll turn into an Amazon listing.
USB 3.5 Gen 3 2x2 20 Gbit Two-Sided DP PD USB 3 USB 2 USB 1 Compatible
The bench is called “Bench” (legacy name; it’s actually more like a concrete slab, but at the time it was more benchy than the previous bench, which was just a pile of sand).
If you’re not trying to wire your own USB port you can just use the recommended names “USB SuperSpeed 20 Gbps” or “USB 20 Gbps”. You don’t have to be confused by technical names if you don’t want to be.
The real bullshit is between your ears–you and only you can fix it.
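For what it’s worth, the consumer-facing names really are simpler. Here is a minimal sketch of the recommended branding next to the raw signaling rates; the 20 Gbps name is from the comment above, and the other strings are my recollection of the USB-IF branding guidelines, so treat them as assumptions:

```python
# USB-IF consumer branding vs. signaling rate. Exact strings other than
# "USB 20 Gbps" are my recollection of the guidelines, not quotations.
CONSUMER_NAMES = {
    5: "USB 5 Gbps",    # the rate once branded SuperSpeed USB
    10: "USB 10 Gbps",  # once SuperSpeed USB 10 Gbps
    20: "USB 20 Gbps",  # once SuperSpeed USB 20 Gbps (aka Gen 2x2)
    40: "USB 40 Gbps",  # USB4
}

def consumer_name(gbps: int) -> str:
    """Map a signaling rate in Gbps to the recommended consumer name."""
    try:
        return CONSUMER_NAMES[gbps]
    except KeyError:
        raise ValueError(f"no consumer branding for {gbps} Gbps") from None
```

So a port technically labeled “USB 3.2 Gen 2x2” is, for an end user, just `consumer_name(20)`.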
“Just plug your device in, you little bitch”
They are not bad at this. You are bad at understanding it.
Don’t get mad when you could instead learn something.
Yes, it gets complex. It’s a 25-year-old protocol that does almost everything. Of course it will be.
But the names are not hard if you bother to learn them.
I work with this stuff, and I do understand it. Some of my colleagues actively participate in USB-IF workgroups, although not the ones responsible for naming end-user-facing things. They come to me for advice whenever those other workgroups have retroactively changed some names yet again and we need to make sure we are still backwards compatible with things that rely on those names, and that we are not confusing our customers more than necessary.
That is why I am very confident in claiming those naming schemes are bad.
“don’t even bother learning it” is my advice for normal end users, and I do stand by it.
Never said it is hard.
It is more complex than it needs to be.
It is internally inconsistent.
Names get changed retroactively with new spec releases.
None of that is hard to learn, just not worth the effort.
They’re bad because manufacturers want to pass their usb 2.0 gear as “usb 3.0 compliant”, which it technically is, and their usb 3.0 gear as “usb 3.2” because 3.2 Gen 1x1 is also 5gbps.
Also, the whole alternate mode is awesome, but cheap hub chips don’t bother trying to support it; the only ones that do are laptop ports, so manufacturers can save $0.40 on a separate HDMI port.
And don’t get me started on all the USB-C chargers that only put out 1.5 A because the back end is just a plain 7805.
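To make the “3.2 Gen 1x1 is also 5gbps” point above concrete, here is a sketch of how the very same 5 Gbps link has been relabeled with each spec revision (years are from memory, so treat them as approximate):

```python
# The same 5 Gbps ("SuperSpeed") link under three successive spec labels.
# A vendor can truthfully print any of these on the same 2008-era silicon.
LABELS_FOR_5GBPS = [
    ("USB 3.0", 2008),          # original label
    ("USB 3.1 Gen 1", 2013),    # renamed when 10 Gbps (Gen 2) arrived
    ("USB 3.2 Gen 1x1", 2017),  # renamed again when Gen 2x2 arrived
]

def labels_for_same_link() -> list[str]:
    """Every spec label that has referred to the identical 5 Gbps link."""
    return [label for label, _year in LABELS_FOR_5GBPS]
```

Which is exactly why a box stamped “USB 3.2” can be no faster than a 2008-era “USB 3.0” device.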
The USB X.X is just the version of the standard and doesn’t mean anything for the capabilities of a physical device.
When a new standard comes out, it supersedes the old one. Devices are always designed and certified according to the current standard.
Soooo… what are you talking about?
I’m talking about using the standard traditionally to denote the performance of the connection.
You don’t go around talking about your “USB 3.0 device” that runs at 480 Mbps unless you’re trying to be a massive dickhole.
That’s what I’m talking about.
A device or port that does 480 Mbps transfer speeds is a “Hi-Speed” device/port. That’s the real name and always has been.
It doesn’t matter what version of the USB spec it was certified under. If it was designed between 2000 and 2008, it was certified under USB 2.0 or 2.1.
If that device was certified between 2008 and 2013 then it was certified under USB 3.0. That absolutely doesn’t make it a “SuperSpeed” device/port, but that’s more than clear when we use the real names.
Nobody uses that; they use the spec number, because that’s what they’ve been taught, and they identify with it more than the incredibly stupid ‘full/high/super/duper/ultramegahyperspeed’ convention, which the idiots at the USB-IF decided to break yet again in 3.2.
Literally everybody on the planet agrees the system is moronic; you’re literally the only person who dissents. Congratulations on that.
Then just be as mad as you want–that’s the whole point of the news cycle anyways! Why bother learning? Congrats, chaos wins!
Can you give a specific example of this?
I’d love to believe all your ethos arguments if you could give me some logos.
There is some stuff to be learned, but especially with USB-C I’d say the vast majority of ports and cables are not labeled. There are even some devices charged via USB-C that can’t be charged with a PD charger and need an A-to-C cable. Phones are a great example where you have to look up the specs to know data transfer capabilities.

Additionally, they renamed the USB 3.0 standard, which had been established for over a decade, to USB 3.1 Gen 1, which is completely unnecessary and just serves to confuse.

The standard used to be largely understandable, with USB 3.0 ports generally being blue (or at least a color other than black) and, on decently modern devices, USB 2.0 ports being black. With USB-C, indication has just about gone out the window, and what used to be a very simple standard has become nearly impossible to understand without having researched every device and cable you interact with.
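One practical escape hatch for the “you have to look up the specs” problem: on Linux you don’t have to trust labels at all, because the kernel exposes the actually negotiated speed of every connected device in sysfs. A Linux-only sketch (the sysfs layout is standard; the name for the 20 Gbps entry is my shorthand):

```python
import glob
import os

# sysfs reports the negotiated rate in Mbps as a bare string.
# Map it to the spec's speed-class names (20G entry is my shorthand).
SPEED_CLASSES = {
    "1.5": "Low-Speed",
    "12": "Full-Speed",
    "480": "Hi-Speed",
    "5000": "SuperSpeed",
    "10000": "SuperSpeed+",
    "20000": "SuperSpeed+ 20Gbps",
}

def speed_class(mbps: str) -> str:
    """Translate a sysfs speed value into its speed-class name."""
    return SPEED_CLASSES.get(mbps, f"unknown ({mbps} Mbps)")

def negotiated_speeds():
    """Yield (device, speed class) for every connected USB device."""
    for path in glob.glob("/sys/bus/usb/devices/*/speed"):
        with open(path) as f:
            mbps = f.read().strip()
        yield os.path.basename(os.path.dirname(path)), speed_class(mbps)
```

So whatever the marketing sticker says, the port that enumerated your drive at “480” is a Hi-Speed link, full stop.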
There are even some devices charged via USB-C that can’t be charged with a PD charger and need an A-to-C cable.

Phones with Qualcomm chips briefly had their own proprietary fast-charging standards that were not a USB standard. You are unlikely to be using those devices in 2024. But is it USB-IF’s fault that manufacturers tried to create proprietary standards to collect royalties?
Additionally, they renamed the USB 3.0 standard, which had been established for over a decade, to USB 3.1 Gen 1, which is completely unnecessary and just serves to confuse.

No, they didn’t.
The 5 Gbps transfer rate introduced in 2008 is called “SuperSpeed”, and it always has been.
USB X.X is not a port or a transfer speed. It’s the standard (i.e., a technical whitepaper). The standard is updated as time marches on and new features are added.
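Since the “standard is updated as time marches on” point keeps coming up, here is a rough timeline sketch; release years and top rates are from memory, so double-check them before relying on them:

```python
# Spec revision -> (release year, fastest signaling rate added, in Mbps).
# Years and rates from memory; treat as approximate.
USB_SPECS = [
    ("USB 1.0", 1996, 12),
    ("USB 1.1", 1998, 12),
    ("USB 2.0", 2000, 480),
    ("USB 3.0", 2008, 5_000),
    ("USB 3.1", 2013, 10_000),
    ("USB 3.2", 2017, 20_000),
    ("USB4", 2019, 40_000),
]

def spec_in_force(year: int) -> str:
    """The revision a device designed in a given year certifies under."""
    current = USB_SPECS[0][0]
    for name, released, _rate in USB_SPECS:
        if released <= year:
            current = name
    return current
```

So a plain 480 Mbps Hi-Speed device designed in 2010 certifies under USB 3.0 without being any faster for it, which is the whole distinction between spec version and speed class.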
The standard used to be largely understandable, with USB 3.0 ports generally being blue (or at least a color other than black) and, on decently modern devices, USB 2.0 ports being black.

This was never a requirement, but it was nice to know which Type-A ports had 9 pins vs. 4.
With USB-C, indication has just about gone out the window, and what used to be a very simple standard has become nearly impossible to understand without having researched every device and cable you interact with.

For the most part you just plug it in and it works. If you need something specific, like an external GPU connection, you can’t use your phone-charging cable, sure. Is that really that big of a deal?
Yes, it absolutely is USB-IF’s fault that they are not even trying to enforce some semblance of consistency and sanity among adopters. They do have the power to say “no ~~soup~~ certification for you” to manufacturers not following the rules, but they don’t use it anywhere near aggressively enough. And that includes not making rules that are strict enough in the first place.