It negotiates and then switches to 20V to deliver 100W, so it only needs to run 5A through the cables.
Wire gauge requirements are related to current, not power; that's why high-speed trains usually operate at 25000V or 50000V to allow their megawatts of power to be delivered through a ~1cm diameter overhead line.
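To put numbers on that (quick back-of-the-envelope; the 8 MW / 25 kV train figure is just an illustrative round number):

    # Current needed to deliver a given power at a given voltage: I = P / V
    def current_amps(power_w, voltage_v):
        return power_w / voltage_v

    print(current_amps(100, 5))             # 20.0 A at 5 V: needs very thick copper
    print(current_amps(100, 20))            # 5.0 A at 20 V: fine for a USB cable
    print(current_amps(8_000_000, 25_000))  # 320.0 A for an 8 MW train at 25 kV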
That said, I still hate USB-C as a power standard. I would have much preferred two conductors just supplying 20VDC, no questions asked, over a standard connector, just like 120VAC power works. Pick another connector for the US low-voltage standard: none of this negotiation fuss, none of this complicated power circuitry, and much harder to get cables wrong. With the complexity of USB-C there are just way too many substandard cables flooding the market, and the average Joe doesn't understand the difference between a 100W-capable cable, a 60W-capable cable, a charge-only cable, a USB 3.0 cable, a USB 3.2 cable, a USB 3.2 + 100W cable, and so on. People just buy "5-star" rated cables online that often don't meet the standard, and it's a shitshow.
That would force all power adapters to support a fixed maximum current, which in many cases would mean either wasted materials (consider e.g. shipping a 100W adapter with a smartphone) or unsafe conditions (with a device attempting to pull 5A from an adapter supporting only much less than that).
I'm really happy with being able to plug my laptop into my tiny 5V, 2A power adapter and slowly charge it, yet being able to fast-charge it at home at 100W using the same cable on a larger power adapter.
> That would force all power adapters to support a fixed maximum current, which in many cases would mean either wasted materials (consider e.g. shipping a 100W adapter with a smartphone) or unsafe conditions (with a device attempting to pull 5A from an adapter supporting only much less than that).
No, it would not require any of that. It would only require the cable to signal its max current capacity.
So a cheap cable would allow for, say, 20V/0.5A (10W), while a decent one could signal 2A (40W) or the maximum.
The simplest way to signal that would be the same one previous USB versions used: a specified-value resistor on one of the pins.
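Something like this, as a sketch of the idea (the resistor values are invented for illustration, not USB-C's actual scheme):

    # Hypothetical "dumb cable" marking: a fixed resistor between an ID pin and
    # ground advertises the cable's current rating; the source measures it once
    # at plug-in and clamps its output current accordingly. Values are invented.
    CABLE_RATINGS = [
        (10_000, 0.5),  # 10 kOhm  -> 0.5 A (cheap cable, 10 W at 20 V)
        (4_700,  2.0),  # 4.7 kOhm -> 2 A (40 W)
        (1_000,  5.0),  # 1 kOhm   -> 5 A (full 100 W)
    ]

    def allowed_current(measured_ohms, tolerance=0.1):
        for nominal, amps in CABLE_RATINGS:
            if abs(measured_ohms - nominal) <= nominal * tolerance:
                return amps
        return 0.5  # unknown or missing resistor: fall back to the safe minimum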
A more complex option would be signalling the power supply's capability to the device on the other side, especially if you want dynamic power or a reversible charging direction. But USB-PD is generally horrifically complicated for what it does.
> It would only require the cable to signal its max current capacity.
As you say yourself a bit later, you'd need at least two "simple" resistors (one for the cable itself, one for the adapter) – otherwise it's only fixed-cable power supplies (and I really, really hate having to throw away an entire high-quality power supply just because the cable frayed at the laptop end).
> But USB-PD is generally horrifically complicated for what it does
Which parts would you suggest are safe to omit, in a world where you want to be able to charge both small accessories (Bluetooth headphones etc.), smartphones, and large laptops using a single connector?
> As you say yourself a bit later, you'd need at least two "simple" resistors (one for the cable itself, one for the adapter) – otherwise it's only fixed-cable power supplies (and I really, really hate having to throw away an entire high-quality power supply just because the cable frayed at the laptop end).
You could have the end device regulate itself by watching the voltage drop, akin to MPPT (rough sketch below). And note that simple (sub-10W) devices could opt to implement nothing whatsoever and just take the minimum 20V/0.5A, so small accessories wouldn't need any extra chips to handle it.
You really need communication only when you start needing features like switchable sink/source function or voluntarily lowering power usage.
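Roughly what I mean by regulating off the voltage drop (a sketch; the two hardware hooks are hypothetical, not any real API):

    # Droop-based self-regulation: the sink ramps its draw and backs off when the
    # input voltage sags, with no digital negotiation at all.
    # read_input_voltage() and set_current_limit() are hypothetical hardware hooks.
    MIN_ACCEPTABLE_V = 19.0  # assumed droop threshold below the nominal 20 V
    STEP_A = 0.1

    def regulate(read_input_voltage, set_current_limit, max_amps=5.0):
        current = 0.5  # the guaranteed minimum everyone gets
        while True:
            if read_input_voltage() < MIN_ACCEPTABLE_V:
                current = max(0.5, current - STEP_A)       # source or cable struggling: back off
            else:
                current = min(max_amps, current + STEP_A)  # headroom available: ramp up
            set_current_limit(current)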
>>But USB-PD is generally horrifically complicated for what it does
> Which parts would you suggest are safe to omit, in a world where you want to be able to charge both small accessories (Bluetooth headphones etc.), smartphones, and large laptops using a single connector?
Not talking about the featureset, but about how hideously complex the implementation is. I remember some early adopters tried to implement it on a microcontroller; it ate 30+kB of code, more than half of the product's total code.
The spec (3.0, rev 1.1) is 574 pages. No fucking wonder we have so many devices implementing it subtly wrong. Hell, they even decided "no, none of the existing serial protocols fits our super special use case, we need to invent that too". At least they didn't reinvent CRC32...
I don't understand the implementation; it's pretty nuts. And the multiple layers of optional features and fragmentation are pretty awful. I kinda wish they hadn't specified any power levels other than 5V and fully programmable PPS...
I also wish they had implemented a dedicated solar profile to simplify solar generators, so a panel could say "I will act like a current source and do MPPT".
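The sink side could then just run a plain perturb-and-observe loop, something like this sketch (the measurement hooks are assumed):

    # Perturb-and-observe MPPT: nudge the drawn current, keep going in the same
    # direction while harvested power increases, reverse when it drops.
    # read_volts/read_amps/set_current are hypothetical hooks into the charger.
    def mppt_step(read_volts, read_amps, set_current, state):
        power = read_volts() * read_amps()
        if power < state["last_power"]:
            state["direction"] *= -1  # we stepped past the maximum power point
        state["current"] = max(0.0, state["current"] + 0.05 * state["direction"])
        state["last_power"] = power
        set_current(state["current"])

    # start with state = {"current": 0.5, "last_power": 0.0, "direction": 1}
    # and call mppt_step() periodically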
They probably could have done Dallas One Wire for the signalling and been just fine. It could also be how the cable chip communicates, and very small simple devices could use it without the full USB stack.
But, I can accept it just because of how well it works in practice as is.
> That would force all power adapters to support a fixed maximum current
No, we would just start to have both 20VDC and 120VAC/240VAC sockets in walls everywhere, and people would stop needing to lug around power adapters.
Yes, there would be a maximum current per socket, maybe 5A or 10A; that's not much different from the 15A limit on most AC sockets.
The conversion to 20VDC could be wired into buildings as standard.
Until then, people can lug around a 20VDC adapter that supplies enough current for their own devices. If they have a laptop they might need a 10A adapter; if it's just a phone they could get by with 1A.
So you're suggesting we re-wire every single building with these "simple" 20V DC plugs? We'd better get both the voltage and the maximum current exactly right, because the last time we picked standards for this, they lasted for over a century!
And if we do get it wrong, do we change the mechanical plug shape if we decide that no, actually 40V (or 15V, or 32.945V) would have been better so that no unsafe connections become possible?
Sure, but what makes 20V the perfect voltage for almost everything, so important that we should go through the effort of rewiring entire buildings?
It's way too much to charge small accessories (without an internal step-down converter [1]), yet it's probably not enough to sustain (let alone charge) beefy laptops.
> "hey I have a new plug"
You do realize that this is exactly your pitch for these 20V "standardized" plugs/outlets, right? :)
[1] A big reason why USB's 5V is so popular is that it's just the right voltage to charge lithium ion batteries and run simple circuits without expensive (in terms of parts) active voltage converters.
> it's just the right voltage to charge lithium ion batteries
No, it's not. Lithium is nominally 3.7 volts per cell and charging voltage goes up to about 4.2 volts max, maybe 4.3. Definitely never 5V.
A 5V->4.2V buck converter isn't particularly different from a 20V->4.2V buck converter. In fact, it's probably a bit easier to make the 20V->4.2V converter, since you don't have to worry about parasitic losses as much.
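For scale, the ideal buck duty cycle is just Vout/Vin (back-of-the-envelope, switching and conduction losses ignored):

    # Ideal buck converter duty cycle: D = Vout / Vin
    def duty_cycle(v_in, v_out):
        return v_out / v_in

    print(duty_cycle(5.0, 4.2))   # ~0.84 when stepping down from 5 V
    print(duty_cycle(20.0, 4.2))  # ~0.21 when stepping down from 20 V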
> You do realize that this is exactly your pitch for these 20V "standardized" plugs/outlets, right? :)
Not exactly: (a) I'd use an existing connector, maybe a Lenovo 20V connector; (b) USB keeps having to switch things because it wasn't a charging standard to begin with: it was abused as a power plug and then had to keep evolving around that while maintaining backward compatibility and continually keeping up with Moore's law in bandwidth requirements.
A 20VDC plug would just be that -- a low-voltage socket for <100W appliances.
Anything higher, just use 120V/240V and get up to 1800W.
You need device-side protection for that anyway, because a broken PSU can fail in many ways you can't really predict, including "someone implemented that monster of a standard wrong".
Sure. Since you asked so nicely, here's a datasheet that illustrates a device which integrates overvoltage protection. Plug a malfunctioning 20V USB-C power supply into your phone, and the phone will internally disconnect the USB-C connection to protect itself.
I incinerated 250 seconds of my time, or about a ten-millionth of my life, on this answer for you - for free! Hope the little dopamine hit was enough for you today. I won't be back.
A PSU "deciding" to do that is an ultra rare case, and probably dwarfed by PSUs that simply break internally and deliver the wrong voltage, which can happen to any PSU.
The closest thing I've heard of was a cable that pretended it was a device, making it so that unplug events didn't register properly.
The cable must report itself as being capable of 5A, which requires it to use 20-gauge wire for the VBUS.
ETA: I guess your question was specifically about the connector interface, which is less of a problem because there are 4 power and 4 ground pins. 1.25A per pin is no issue.
Resistance is a function of the conductivity of the material, the cross-sectional area of the conductor, and its length. In the case of wires the length is long, so they need more cross-sectional area to keep the resistance down; in the contacts the length is very short, so they need less cross-sectional area to achieve the same resistance. Contacts are also often plated with silver or gold: silver is even more conductive than copper, and gold resists corrosion, which helps keep the contact resistance low.
The electricity is going through the plug for only a few millimeters, and is going through the rest of the cable for 500-2000 millimeters. It's okay for the plug to have more ohms per meter. It's going to have a very tiny voltage drop and emit a very tiny amount of heat.
If you're saying the plugs don't have more resistance than the equivalent length of cable, that's wrong. From what I can find it's very typical for a USB C plug to have 40 milliohms of contact resistance. Even if you divide that by 4, that's 10 milliohms in a single centimeter. 20 gauge wire is 0.3 milliohms per centimeter.
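Putting rough numbers on that at the full 5A, taking the divided-by-four 10 milliohm case and the wire figure from above:

    # Resistive loss in one connector vs. a metre of 20 AWG wire at 5 A,
    # using the figures quoted above. P = I^2 * R, voltage drop = I * R.
    I = 5.0                      # amps
    R_CONTACT_EFF = 0.040 / 4    # ohms, four power pins sharing the current
    R_WIRE_1M = 0.0003 * 100     # ohms, 1 m of 20 AWG at 0.3 milliohm/cm

    print(I**2 * R_CONTACT_EFF)  # 0.25 W dissipated in the connector
    print(I**2 * R_WIRE_1M)      # 0.75 W dissipated along a metre of wire
    print(I * R_CONTACT_EFF)     # 0.05 V drop across the connector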
It's often called "necking down". The connector is higher resistance, and will indeed have more resistive loss than the wire. It's a very small length, though. The extra heat is quickly dissipated into the nearby wire through thermal conduction, so it will be only marginally warmer than the rest of the cable.
This is a very common practice on PCBs. You can use a very wide trace where there's room, and narrow it to get around tight spots like between pins on an IC.
When I built solar race cars in college, we optimized the powertrain for resistive losses over weight. We used 00 AWG wire from the battery to the motor, knowing it would max out at 40 amps. We would then "neck it down" to 8 AWG to go into 40A rated connectors. To a casual observer it definitely appeared ridiculous and wrong.
The bigger issue with this is not the steady state but what happens when the cable is yanked out while carrying that current. Spark erosion of the contacts, etc.; the spec goes into quite some detail there.
Or it just makes a poor connection to begin with due to out-of-spec cables or connector wear/damage, which is likely considering the much tighter tolerances compared to old-school, reliable, battle-tested connectors such as barrel jacks.