Why not use the already open DisplayPort and make it better?
Noo, we need yet another standard!
This was exactly what I wanted to post… 😅
DisplayPort is an open standard in name only. The spec requires membership in VESA, which costs a hefty sum of money. Even open-source projects have to restrict code that implements DisplayPort because of the licensing restrictions imposed on the “open” standard.
Lock-in.
This must be for commercial displays where it is beneficial for installation to have power and data over a single cable.
I can’t think why I would want power delivery to my PC monitor over the display cable. It would just put extra thermal load on the GPU.
I think it’s aimed at TVs in general, not computer monitors. Many people mount their TVs to the wall, and having a single cable to run hidden in the wall would be awesome.
I wonder what the use case is for 480W though. Gigantic 80" screens generally draw something like 120W. If you’re going bigger than that, I would think the mounting/installation would require enough hardware and labor that running a normal outlet/receptacle would be trivial.
> Gigantic 80" screens generally draw something like 120W
In HDR mode they can draw a lot more than that for short peaks
My 50" 1080p LCD draws over 200W…
Headroom and safety factor. Current screens may draw 120W, but future screens may draw more, and it is much better to be drawing well under the max rated power.
Sound for an 80" screen? Not for home systems.
Projector
Most OLED HDR TVs peak at over 300W.
Even in that scenario it will complicate the setup. Now your Roku will also have to power your TV? No, any sane setup will have a separate power cable for the TV.
I don’t think you’d ever have a peripheral power the tv. The use case I’m envisioning is power and data going to the panel via this single connector from a base box that handles AC conversion, as well as input (from Roku etc) and output (to soundbar etc.). Basically standardizing what some displays are already doing with proprietary connectors.
In-wall power cables need to be rated for this to prevent fire risk. The cable will need thick insulation or a fire-resistant jacket.
Nah, it’s for powering the 1000W RTX 6090.
The popular use for power delivery through a display cable is charging a laptop from your monitor; it’s already very common with Thunderbolt or USB-4 monitors. But 480W seems a bit overkill for that.
> It would just put extra thermal load on the GPU.
Passing power through doesn’t have to put noticeable load on the GPU. The main problem I see there is getting even more power to the GPU - Nvidia’s top cards are already at the melting point for their power connector.
> Passing power through doesn’t have to put noticeable load on the GPU.
I specifically said thermal load. Power delivery always causes heat dissipation due to I²R losses.
That’s what I meant. Compared to the power the GPU is actually using, transmission losses for a pass-through should be negligible. If you have a good way to get it to the card in the first place.
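Quick back-of-the-envelope (the resistance number is an assumption made up for illustration, not a measured spec):

```python
# I^2*R loss for passing 240W of USB-C EPR power through a cable.
# All figures are illustrative assumptions.
power_w = 240            # power being passed through
voltage_v = 48           # assumed bus voltage (USB-C EPR)
resistance_ohm = 0.05    # assumed total cable + connector resistance

current_a = power_w / voltage_v           # 5 A
loss_w = current_a**2 * resistance_ohm    # I^2 * R = 1.25 W

print(f"{current_a:.0f} A through the cable -> {loss_w:.2f} W of heat")
```

A watt and change of heat next to a GPU that’s already dissipating several hundred watts really is a rounding error.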
~~Why is that better than USB-C?~~
Wait… Power the other way. Whoops, I get it.
That already kinda allows this, and the actual load is pretty small.
Even a big 30-inch display is maybe 20 watts.
Well, power delivery goes several times that. Laptops are another very useful case for it. It’s nice to be able to have just a single display-and-power connector.
You can do this to an extent, today
Thought of this too, with the addition “so we can control that market”.
Today I learned DisplayPort 2.1 can carry 240W.
They fixed it.
Running that much power next to a data line sounds like a terrible idea for signal integrity, especially if something shorts to said data lines. It just sounds sketchy, or filled with so many asterisks that it’s functionally impossible to reach their claimed throughput.
It’s likely DC, which, without alternating magnetic fields, won’t degrade the signal as badly. But I wholeheartedly agree with you on power delivery. What could possibly need/use that much power‽
The option to run one cable to the monitor, or reversely charge your laptop with one docking cable.
Maybe you could use this to daisy chain monitors and power them all.
> The option to run one cable to the monitor, or reversely charge your laptop with one docking cable.
USB-C docks can already do this. Obviously with less power, and it’s not perfect by any means, but we don’t need another technology for this. And sure, it’s two cables, one from the wall outlet to the integrated dock/monitor and USB-C from the dock to the laptop, but no matter the technology you still need something to plug into the wall outlet.
Yeah, considering the recent VGA power connector problems, what could possibly go wrong?
wHy Is mY tV sMoKiNg?!?!?
Why is my Temu/Wi$h bargain cable melting?
GPU power connectors run at very, very low voltages - just 12V - and you need ridiculously beefy connectors and wires to run high loads at 12V. At 48V you can push 4x the power through the same wire (if the insulation is rated for 48V).
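A minimal sketch of that arithmetic (the 10A ampacity is an illustrative assumption for some fixed wire gauge):

```python
# Same wire gauge means the same safe current (ampacity).
# Deliverable power scales with voltage: P = V * I.
max_current_a = 10  # assumed ampacity for a given wire gauge

for voltage_v in (12, 48):
    print(f"{voltage_v} V x {max_current_a} A = {voltage_v * max_current_a} W")
# 12 V x 10 A = 120 W
# 48 V x 10 A = 480 W  -> 4x the power through the same conductor
```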
Bigass showroom screens I suppose? Maybe large sound systems?
It’s super nice to plug a laptop into a screen and have the cable double as a charging cable for the laptop.
Yeah, agreed. But 480 watts‽
um, gaming laptop maybe?
Honestly no idea, you have a good point.
The USB standard is up to what, 40Gbps and 240W? That’s pushing the envelope already. We’ll see if this new standard can prove itself, anyway.
USB4v2 can do 80Gbps and 240W.
It can also do 120Gbps/40Gbps asymmetric.
See, IDK anything about data and power and cables, but I dislike the vibe when I dock my laptop with that itty-bitty USB-C connector that does power and 2x monitors and networking and peripherals.
I did buy the bonkers-expensive proper cable from Lenovo, and it does generally just work, but maybe once every few weeks I have to unplug and re-plug.
More power and more data through the same cable just seems daft.
Loved automobiles with 4 wheels? Chinese cars have 13! In your face, suckers!
Even an 80” TV only uses around 150W, if my research is correct. Surely this must be aimed at massive displays.
If you’re gonna release a new standard, may as well have the headroom for future growth so it’s not outdated too soon in the future.
Your research would be incorrect
Yeah it was a quick google search. Do you have better numbers available?
Most manufacturers only list average power draw, but in HDR mode you can get much higher peak power usage.
This website also lists peak power draw for many TVs; for example, the 85-inch Bravia 9 has a peak of 380W:
https://www.displayspecifications.com/en/model-power-consumption/fca71198
Ah perfect, that makes a lot more sense to me
Now you can use one cable for two 80".
Not really that impressive since it seems to be about four times as wide as USB-C
Won’t this heat up like a motherfucker?
It depends on the voltage used. If they run 48V, which seems to be supported by USB-C EPR, then the cable carries the same 5A it’s capable of today, and the heat is the same.
With their own new connector/cable they can use an even higher voltage or more/thicker conductors for power.
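To make the “same current, same heat” point concrete (the cable resistance is an assumed illustrative figure):

```python
# Cable heating depends only on current (I^2 * R), not on delivered power.
r_cable_ohm = 0.1  # assumed round-trip conductor resistance
current_a = 5      # USB-C's existing 5 A limit

heat_w = current_a**2 * r_cable_ohm  # 2.5 W at either voltage
for voltage_v in (20, 48):
    print(f"{voltage_v} V @ {current_a} A delivers {voltage_v * current_a} W; "
          f"the cable still dissipates {heat_w} W")
```

Same 2.5W of cable heat whether you ship 100W at 20V or 240W at 48V.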
If it’s physically more stable and reliable than HDMI, then count me in
Power delivery by itself could be a useful standard for ebike and power station charging (battery-to-battery charging too). 480W is the most I’ve seen, but maybe USB is working on better, or 240W and more flexible/cheaper cables could work. HDMI providing 54V output would be great for charging most common battery systems, and dual/triple BMSs for 2x and 3x ports/charging would be awesome.
We already have an alternative; it’s called Thunderbolt.
No, we don’t. Apple proprietary nonsense isn’t worth the metal it’s made of.
If it’s not USB-C, it’s banned in the EU. Because we stopped there and we won’t go forward.
In case anyone is wondering, yes, this is utter nonsense. The EU made USB-C mandatory only as a charging connector for portable devices like phones, tablets, headphones and mice. That’s all. This new standard, unwelcome as it is, has nothing to do with charging phones, so there’s no reason it can’t be used in the EU.
But let’s not allow measly facts to get in the way of having a moan at nothing, shall we? Fucking EU. Forcing us to [checks notes] charge all our things using a single connector, reducing e-waste, and, uh, ensuring there’s lots of future-proofing built in. BASTARDS.
> the GPMI cable comes in two flavors — a Type-B that seems to have a proprietary connector and a Type-C that is compatible with the USB-C standard
I actually copied this from the article to come here to the comments and have a whinge about all the different USB-C standards, and here you are explaining the reason why.
The whole point of USB-C is that it’s a standardised connector that allows anyone to shoehorn their own protocol down it if they want using Alt Mode. Moreover, they can do that without breaking compatibility with other USB-C - or even just specific features - if one of the devices doesn’t speak their crazy-ass moon protocols. This is a benefit of USB-C, not a failing.
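For anyone curious what that negotiation roughly looks like, here’s a toy Python sketch of the USB PD structured-VDM flow that alt modes ride on. The command codes and DisplayPort SVID (0xFF01) are real; the FakePort partner and the GPMI SVID are invented placeholders:

```python
# Toy sketch of USB-C Alternate Mode negotiation over USB PD
# structured Vendor Defined Messages (VDMs).
DISCOVER_SVIDS = 2   # real structured-VDM command codes from the PD spec
DISCOVER_MODES = 3
ENTER_MODE = 4

DP_SVID = 0xFF01     # real: VESA DisplayPort alt mode
GPMI_SVID = 0xFFFF   # placeholder only, NOT a real assignment

class FakePort:
    """Stand-in for a port partner that only speaks DP alt mode."""
    def send_vdm(self, command, svid=None, mode=None):
        if command == DISCOVER_SVIDS:
            return [DP_SVID]
        if command == DISCOVER_MODES:
            return [1]                 # one advertised mode
        if command == ENTER_MODE:
            return svid == DP_SVID

def try_enter(port, svid):
    # Key property: if the partner doesn't list our SVID, we simply
    # stay in normal USB mode instead of breaking the link.
    if svid not in port.send_vdm(DISCOVER_SVIDS):
        return False
    mode = port.send_vdm(DISCOVER_MODES, svid=svid)[0]
    return port.send_vdm(ENTER_MODE, svid=svid, mode=mode)

port = FakePort()
print(try_enter(port, DP_SVID))    # True  - DP alt mode negotiated
print(try_enter(port, GPMI_SVID))  # False - graceful fallback to plain USB
```

That graceful fallback is exactly why a crazy-ass moon protocol on a USB-C plug doesn’t break anything else.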
Don’t get so excited. Read my comment again.
Please don’t make stuff up.
Other stuff isn’t banned and the law already has allowances for emerging standards.
I think you could have a second connector in addition to a main USB-C.
Honestly, we need higher-capacity screen cables for PCs. Both HDMI and DisplayPort are limiting performance because of their low 40-80Gbps bandwidth. They max out around 4K 120Hz with uncompressed HDR color. You can’t use 8K screens or multiple 4K screens without lowering quality.
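Rough arithmetic behind that claim (uncompressed RGB, ignoring blanking and encoding overhead, so real links need a bit more):

```python
# Uncompressed video bandwidth: width * height * refresh * bits per pixel.
def bandwidth_gbps(width, height, hz, bits_per_channel=10, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

print(f"4K 120Hz 10-bit: {bandwidth_gbps(3840, 2160, 120):.1f} Gbps")  # ~29.9
print(f"8K  60Hz 10-bit: {bandwidth_gbps(7680, 4320, 60):.1f} Gbps")   # ~59.7
print(f"8K 120Hz 10-bit: {bandwidth_gbps(7680, 4320, 120):.1f} Gbps")  # ~119.4
# HDMI 2.1 tops out at 48 Gbps and DP 2.1 (UHBR20) at 80 Gbps, so 8K at
# high refresh doesn't fit without compression (DSC) or chroma subsampling.
```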
Where I work, everyone has two 4K screens. You can use two cables to connect them, you know…
And every one of them has either put their scaling up to 150% or simply set them to 2K, because otherwise you cannot read a damn thing on them.
More than 4k is a theoretical need for a veeeery small market
I disagree with the more than 4K being a theoretical need thing but, regardless, where I work, every desk has a pair of 4K monitors that connect to the user’s laptop via a single USB-C cable. That cable also connects a keyboard, mouse, gigabit ethernet and, depending on the desk, 10Gb ethernet, multiple cameras and conference audio. The cable also charges the laptop, of course. At the moment that’s mostly done using USB-C docking stations, but we’ve started to deploy monitors that are USB-C native and can be daisychained together.
Graphics cards only come with one HDMI port, though. The LG OLED is popular as a 4K screen because it ticks all the boxes and is much cheaper than equivalent gaming monitors, but it doesn’t support DP.
And it means you have to upgrade the graphics card just for the cable, even if it’s still relatively new. The point is that we shouldn’t be held back by just a cable.
My graphics card has three HDMI ports and two DP ports… You can’t use all of them at once, but three screens are supported simultaneously…
Graphics cards come with as many ports as the manufacturer wants them to. My home PC’s GPU has two HDMI and two MiniDisplayPort. Also, there are cheap lossless adapters that will convert between MiniDisplayPort, DisplayPort, HDMI, DVI, etc, etc.
Actually? I don’t know much about that legislation. Does it really not have room built-in for tech improvements?
It does! If there’s a good alternative it can be proposed, or at least that’s what I read here on Lemmy.
Also, one of the reasons the EU waited for USB-C is that it specifically supports Alt Mode, which allows non-USB-standard protocols - like this new video connector thing - to be encapsulated within it.