Nope... it's even more complicated. Up to "3.2 Gen 2x1" they have 8 wires, but "3.2 Gen2x2" has 12 wires. There's two pairs of RX and TX each, i.e. RX1- RX1+ RX2- RX2+ TX1- TX1+ TX2- TX2+ D- D+ PWR GND = 12.
And... that's still not correct: USB-C to USB-C cables have an additional CC line which carries the Type-C extra negotiation (e.g. alt mode switch, power levels.)
And just to fuck users over even more, there is a difference between a 13-wire full-function "USB 3.2 Gen2x2" cable and a 13-wire full-function Thunderbolt cable: the Thunderbolt cable reports an extra "vendor descriptor" on the Type-C negotiation channel, indicating support for TB. The cables are otherwise physically identical, but these extra bits come with a ~$15 markup.
(Oh and the PWR/GND lines can be 3A or 5A, but at this point...)
The article is, however, correct in its conclusion: a "Thunderbolt" cable will always work for all USB standards. The only free variable on a TB cable is whether it does 3A or 5A for power.
USB-C cables either have all 4 superspeed lanes wired up, or none of them. Which is why USB-C docks with video worked (2 lanes for USB, 2 lanes for DisplayPort) and why 2x2 isn't supposed to require new cables.
Thunderbolt still works at 20gbps over "normal" superspeed USB-C cables; the markup you're paying is generally for cables rated for 40gbps.
Your first paragraph is technically right but in actual reality cables with only 2 superspeed lanes exist. Also, the article (and comment) aren't solely about Type C; USB3 cables going to good ol' A/B will only have 2 superspeed lanes.
Your second paragraph is half-wrong. Thunderbolt cables must have an e-marker: refer to https://www.usb.org/sites/default/files/USB%20Type-C%20Spec%... Figure F-1 ("SOP'"). If the cable has an e-marker but doesn't have the Thunderbolt ID, you get passive Gen2. However, the problem is that USB3 TypeC-C cables aren't required to have an e-marker at all... in which case you drop out at the very first fork in the flow diagram.
This also leads to the funny situation that any 5A USB3 cable should work for Thunderbolt (because 5A requires the e-marker), while 3A cables may or may not work. I mean, of course the power rating makes complete sense to determine TB3 capability, right? :D
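Rough pseudo-flow of that diagram as I read it, in Python purely for illustration (the names are made up, not any real API):

    from dataclasses import dataclass

    @dataclass
    class Cable:
        has_emarker: bool
        has_tb_vendor_mode: bool = False   # the extra "vendor descriptor" bits

    def classify(cable):
        if not cable.has_emarker:
            # The very first fork in the flow chart: nothing to read over SOP',
            # so you drop out with plain USB3 and no Thunderbolt negotiation.
            return "no e-marker: plain USB3 only"
        if cable.has_tb_vendor_mode:
            return "Thunderbolt, at whatever rate the e-marker reports"
        return "e-marker but no TB ID: treated as a passive Gen2 cable"

    print(classify(Cable(has_emarker=False)))
    print(classify(Cable(has_emarker=True, has_tb_vendor_mode=True)))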
> The article is, however, correct in its conclusion: a "Thunderbolt" cable will always work for all USB standards.
Aren't active (i.e. >50cm) Thunderbolt 3 cables incompatible with USB 3 speeds due to having some active amplifier incompatible with the USB line signals?
Teeeeeeechnically I think yes; however, in practice active cables are extremely rare (they start beyond 2m, not 50cm. EDIT: incorrect, sorry!) and the redriver might even work for USB3 even if it's not specified. There isn't really all that much difference at the barebones electrical level; the most significant factor is the symbol rate, which is in the same range.
(Not sure if the Type-C negotiation will prevent them from working; it might.)
Ed.: after a quick check, active cables seem to crop up at 40G && >= 1m. 40G 2m cables seem to all be active, at 1m it's either. Couldn't find any "20G" active cables.
From what I've read, the redriver does break USB 3 connectivity [1], but it might also be the marker chip not explicitly indicating USB3 support.
And I think the maximum length for passive Thunderbolt 3 40 Gbit/s cables is indeed 50cm [2]! 50-200cm would be 20 Gbit/s only; anything above always requires active cables.
You're probably right, I can't check the TB spec but from what I see in shops I'm gonna say those "passive 1m 40G" cables are manufacturers breaking the spec. I'm also seeing "20G" cables that claim to only do 5G USB3, which is concerning & confusing me quite a bit...
FWIW my confidence level on "would an active TB3 cable work for USB3 if it had the proper bits set" is somewhere along 66% to 75%; this is based on the SERDES in some (most?) TB3 host silicon being able to switch to USB3 to "downgrade" a port... meaning the electrical level can't be all that different... but I haven't dug into this.
While we're at it, I forgot the 2 SBU pins too, so that's 2 extra wires for TypeC-TypeC cables... neither USB nor TB uses them; native DisplayPort does (not including DP over TB)
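For anyone keeping score, a quick tally of the conductors in a maxed-out C-to-C cable, just summing up the numbers from this thread (so same grain of salt applies):

    full_featured_c_to_c = {
        "superspeed": 8,  # TX1+/- RX1+/- TX2+/- RX2+/- (Gen2x2 uses all four pairs)
        "usb2":       2,  # D+ / D-
        "power":      2,  # VBUS / GND (3A or 5A, whichever the e-marker says)
        "cc":         1,  # Type-C negotiation: PD, alt modes, the TB bits
        "sbu":        2,  # unused by USB and TB, used by DisplayPort alt mode
    }
    print(sum(full_featured_c_to_c.values()))  # 15 conductors in the maxed-out case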
About a year ago I switched to an iPhone after being a longtime (~10 years) Android user. Last night I was at the grocery store and impulse-bought a Lightning charging cable to carry in my hiking backpack. No way I would ever have trusted a checkout-lane USB cable, but Apple's tighter controls give me a greater assurance that the cable will work well and not damage things.
OP never said if he bought a legitimate Apple cable or a third party one. I don’t think I’ve ever seen genuine cables in these kinds of shops even when I travel to Europe but that could still be a locale thing.
If that's a third party cable, you did take some risk on your iPhone. Third party cables and chargers (made with cheap components and poor design) may not function as expected or can be harmful to the device and to you (or people around you), at worst.
Apple keeps a pretty tight lid on MFI though, right? As long as the box has MFI on it, it's probably legit, unless you're really looking on the grey market, I think.
I really doubt Apple is able to police the supply chain that leads to the check out counter of a convenience or even grocery store. There's no way to know that the MFI logo on the box is legit or not.
I don’t know how successful they are, but the whole point of MFI is to give Apple the ability to send the USG after you if you use the logo without permission. I mean, there are also fake Rolexes and fake Gucci bags, but those are the exception because the companies mostly do a good job of suing the violators.
It cannot be emphasized enough how much the recent USB specifications are dropping the ball. I wouldn't mind paying a small premium just for a guarantee that things will Just Work when I connect them.
Instead, USB device manufacturers want to cost optimize everything, which probably forced the standard to become so Byzantine. Understandable from a manufacturer's point of view, but terrible from an end user perspective.
I lived through the pre-USB world, and things were even more "fun".
In all seriousness though, the current state of USB is ridiculous.
My wife innocently asked me why her USB C phone isn't triggering Android Auto in the car, and straight away you know it's because she bought a USB C power-only cable without knowing the difference.
Our second car, a VW, comes with its own branded cables for this very purpose. My guess is that they got so many customers calling them about problems that it made sense for them to throw it in.
They're pretty nice too, the only made-in-Germany USB-C/Lightning cables I have ever used.
Same with Audi. Funnily enough, we got an Audi A4 a while ago and decided to include Android Auto for obvious reasons. The car supports Android Auto only (no CarPlay), but came with a USB-to-Lightning cable. We were never asked which cable we wanted, so I assume that's standard.
TBH, all that's required for this label is that the final product is assembled at a plant in Germany. So what usually happens is "bulk ship from anywhere, have a plant for packaging the individual items in Germany."
Really? So people are supposed to plug their phones and laptops into public USB outlets with full USB cables? Given how many USB security vulns there are, I'm glad I can use a power-only cable when plugging into a third-party USB plug, even if the USB standard thinks these cables should not exist…
Power-only cables really are not a thing for USB-C. You need at least the configuration channel in order to negotiate charging voltage and maximum current.
Maybe it's possible to make one lacking both USB 2 and 3 data though, but I haven't seen one yet.
Nor have I seen a public USB-C outlet, for that matter, and I probably wouldn't be plugging my laptop or phone into one anyway: My own charger doesn't only protect me against bad intent, but also against cheap charging circuits that might or might not accidentally expose 220 Volt to my laptop's mainboard in case of faults.
> Power-only cables really are not a thing for USB-C. You need at least the configuration channel in order to negotiate charging voltage and maximum current.
Ouch. USB-C is even more fucked up than I thought… How does that even work with USB wall chargers?
> Nor have I seen a public USB-C outlet, for that matter
There are plenty of USB-A outlets everywhere (airports, trains, hotels, etc.), and most recent Android phones have only a USB-C port…
> Ouch. USB-C is even more fucked up than I thought… How does that even work with USB wall chargers?
If they want to supply more than 5V/3A, they need to support the power delivery protocol too.
> There are plenty of USB-A outlets everywhere (airports, trains, hotels, etc.), and most recent Android phones have only a USB-C port…
Exactly: These are USB-A outlets. These are possible to implement using only a resistor network to announce the maximum charging current and always supply 5V, which is much easier to implement than variable voltage and the power delivery protocol. USB-C can do this too, but only up to 5V/3A.
USB-C devices are usually backwards compatible with all three of these when using an A-to-C cable or adapter: Legacy USB-A current identification via the D+/D- pins, USB-C resistor-based current identification via the configuration pins of USB-C and USB power delivery.
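To make the "resistor network" part concrete, here's roughly how a Type-C source advertises current on CC without any PD at all. The Rp values are the commonly cited ones for a 5V-referenced pull-up; treat the exact numbers as illustrative and check the spec before building anything:

    RD = 5.1e3          # sink's pull-down
    RP_TO_CURRENT = {   # source's pull-up to 5 V -> advertised current
        56e3: "Default USB (500/900 mA)",
        22e3: "1.5 A @ 5 V",
        10e3: "3.0 A @ 5 V",
    }
    for rp, meaning in RP_TO_CURRENT.items():
        v_cc = 5.0 * RD / (RD + rp)   # what the sink measures on its CC pin
        print(f"Rp = {rp/1e3:.0f}k -> CC around {v_cc:.2f} V -> {meaning}")
    # Anything beyond 5 V / 3 A additionally needs the PD protocol over the same
    # CC wire (Source_Capabilities -> Request -> Accept -> PS_RDY).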
You need an active condom (has to man-in-the-middle-for-good the negotiation but pass no other data). I have one because a friend made a few; I've never seen any for sale.
They are, and frankly that they need to exist is the single most egregious example of why USB isn't what I need it to be. What (I'm surprised) I haven't seen is a male-to-female adapter with a switch on it that physically kills the data tx lines.
As far as I understand, that would make the cables near-unusable for actual charging as without the ability to negotiate power settings (over the data lines) they have to default to an extremely low charging rate. So instead of a physical switch it would have to be a smart device with logic that allows for the charging "handshake" but kills other data transfer.
It's just a choice. A smart charger can detect if a battery is present, or at least some load that is safe to dump power into.
You'll note that many chargers started dropping handshaking because it was inconvenient. To be compatible with anything you need it, but there are many that don't ask, though they may only work with the equipment they were sold with. (E.g. defaults to 5V 2A charging.)
It's not a question of chargers: It's a question of laptops in standby.
The historical design intent of USB is that, even when your laptop is in standby it still powers devices like your keyboard (so you can press keys to wake it up) and in exchange, devices promise to consume less than 2.5 milliamps (12.5 milliwatts) until they've negotiated permission to draw more from the host. After all, it'd suck if your laptop battery was going down noticeably even when the laptop was in standby.
Of course, loads of vendors of cheap devices ignore this - the makers of a $5 rechargeable bike light or USB fan aren't going to put in the components needed to negotiate charging speed. But in principle if you made a USB cable with only the power pins connected, compliant devices should only charge exceptionally slowly, if at all.
(Wall chargers, instead of enumerating as a USB host to negotiate power, used to put a certain value of resistor across the data lines, to signal what current the device could draw - this hasn't always been standardised, which is why phones and USB chargers can be incompatible)
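A sketch of how a device classifies a legacy (pre-Type-C) charger from the data lines alone; the BC 1.2 part is standard, the Apple divider voltages are just the commonly reported ones and may not be exact:

    def classify_charger(dplus_v, dminus_v, dp_dm_shorted):
        if dp_dm_shorted:
            # BC 1.2 "Dedicated Charging Port": D+ shorted to D-; the device may
            # draw up to 1.5 A without ever enumerating.
            return "BC1.2 DCP, up to 1.5 A"
        # Proprietary schemes put fixed voltages on D+/D- instead. The Apple
        # values below are the commonly reported ones, not verified here.
        if abs(dplus_v - 2.7) < 0.3 and abs(dminus_v - 2.7) < 0.3:
            return "Apple-style 2.4 A"
        if abs(dplus_v - 2.0) < 0.3 and abs(dminus_v - 2.7) < 0.3:
            return "Apple-style 1 A"
        return "Standard downstream port: enumerate, stick to 100/500 mA"

    print(classify_charger(0.0, 0.0, dp_dm_shorted=True))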
Sounds like historical revisionism. Power limiting is possible without negotiation. It's easiest, for me, to find references to the safety aspect more than anything else.
Note also a fair number of laptops (read: every one I have used and checked) use PS/2 internally because it is interrupt driven and even lower power than USB. There are also plenty of laptops that advertise high current phone charging while off, some of which do negotiation, some of which don't.
Yeah, USB is seldom inside laptops, but you might be interested to know that it is becoming increasingly common to see HID over I2C [0] used inside of laptops instead of PS/2.
I've definitely seen it. Reusing HID is kind of clever, but the protocol is extremely bloated and hard to parse.
Ignoring that, I'm not sure it's an improvement. Most of the power savings is having the keyboard initiate an interruptible event to wake the computer. Interpreting HID like in USB means the keyboard can't wake the computer, the computer needs to remain on to service USB interrupt endpoints.
> A smart charger can detect if a battery is present, or at least some load that is safe to dump power into.
> defaults to 5V 2A charging
i don't think you understand how power transfer works.
voltage is applied (by the charger), and current is drawn (by the device that wants to charge). a charger cannot "dump power into" anything.
when chargers and devices attached to them engage in some sort of negotiation, that's not the device telling the charger what to do, that's the charger telling the device what its limitations are.
if you attempt to draw 3A from a device that can only do 2A, the voltage will drop outside of the specified range. to the extent that a charger can limit the charging current, it does so by dropping/cutting off the voltage until the current goes down. which isn't ideal for devices.
(perhaps you're confused by thinking that USB "chargers" are like battery chargers, and arbitrary USB devices somehow act like batteries. that's all wildly wrong.)
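toy numbers for the "voltage drops outside of the specified range" point, with a completely made-up source impedance:

    V_NOM = 5.0
    R_SOURCE = 0.5   # ohms, invented effective impedance of a weak charger plus cable

    for i_draw in (1.0, 2.0, 3.0):
        v_at_device = V_NOM - i_draw * R_SOURCE
        print(f"draw {i_draw:.0f} A -> device sees {v_at_device:.2f} V")
    # at 3 A the device only sees ~3.5 V, well below the ~4.75 V floor USB expects,
    # so the "limit" shows up as the voltage sagging out of spec, not as the
    # charger actively refusing current.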
I'm aware. The potential "dumps" current into the load proportional to its voltage. (A constant current supply would raise the voltage to achieve the requested current but the USB bus is not a constant current supply. There are multiple ways your yes-but is annoying and unhelpful.)
The main concern behind negotiation seems to be safety of the user and safety of the power supply. Detecting a safe load means detecting a not-short.
The basis for the 5V power negotiation on USB is extremely silly. The power supply is already current limited, protecting you from shorts, and the supply voltage is ~5V, quite far from anything dangerous.
> The basis for the 5V power negotiation on USB is extremely silly.
This may be true for wall chargers but not devices capable of supplying power to peripherals while on battery power themselves.
For example, an external hard drive should not have to figure out that a large tablet can power it, while a phone won't, through trial and error, by attempting to spin up the disks multiple times.
True, but opt-in seems to do what users expect most of the time. Enough device expose their full power without negotiation that there are workarounds like sleeping a USB port instead of relying on power negotiation.
Devices are going to misbehave anyway, it's probably more important to default to a reasonable level of mostly works.
I've got a (USB2) power monitor that includes a chip (just an STM8L051F3 microcontroller) to manage power negotiation and keep data disconnected. So it negotiates power with host and device separately, without ever needing to connect the data lines from host to device directly. I expect there are or will be versions for USB 3.<whatever I've lost track>.
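Conceptually the logic on that kind of device is something like the sketch below. This is not the actual STM8 firmware, just the idea; the method names are invented:

    def run_power_only_passthrough(upstream, downstream):
        # Face the charger/host as if we were the device:
        granted = upstream.negotiate()     # read Rp / BC1.2 handshake / PD request
        # Face the real device as if we were the charger, never advertising more
        # than we actually secured upstream:
        downstream.advertise(granted)
        # Crucially, D+/D- and the SuperSpeed pairs simply aren't wired through,
        # so no data path exists no matter what either side tries.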
For USB 2, widely-available battery packs compatible with whatever current negotiation standards your devices require work just as well, and have the added benefit of storing power.
Less generally, these batteries can also break ground loops, which comes in handy when connecting a power supply to both a USB device and an attached analog audio device (e.g., iPhone and wired noise-canceling earbuds).
It would be more convenient. Given that this is supposedly a security feature - no. Nope. Not at all: what is the state of the data line(s), and how do you know that it matches whatever indicator the cable uses? (There's no way to tell, and you don't - if software is involved, securing stuff becomes HARD, and proving security even harder.)
OTOH, a USB Y-cable physically has no data lines connected on the extra USB-A male connector; it thus provides a far stronger guarantee that there is no data travelling across them. (Not foolproof, but safer than a software-controlled switch.)
That's my point. If I can't tell whether this charging cable is secretly an USB keyboard, isn't it better to explicitly tell my cellphone to treat it as charge-only instead of relying on physical solutions?
My use case is I don't necessarily trust one end of the connection to behave, so I want the ability to just give/take power and involve exactly 0 software. I understand that's not the usual consumer story, and the best argument I can come up with for the general case is it could make any USB compatible wall plug safe to use to charge up.
I'm talking about using the phone's software to block the data connection.
If you don't trust that cellphone won't transfer your private data via USB, I don't understand why you trust it not to do it via any wireless connection.
Spec-compliant USB-C cables are still required to have the USB2 data lines hooked up, even "power-only" cables, and Android Auto / CarPlay don't need anything beyond USB2.
>USB device manufacturers want to cost optimize everything, which probably forced the standard to become so Byzantine
As Intel loses its raw CPU performance advantage, its new marketing materials promote "features" and "convenience". Practically speaking, USB is defined by Intel, and it is making a visible choice to secure itself a position in the market for Thunderbolt 4: making some USB4 features optional so they can be included in TB 4, adding other small minimum performance guarantees, and hoping that is enough to sell TB 4 chips to high-end AMD motherboard makers, which in turn makes its own CPUs (with built-in Thunderbolt 4) more price competitive. It is a sound business plan, but not in the best interest of consumers.
> It is a sound business plan, but not in the best interest of consumers.
The fate of a universal lingua franca connector is too important to be left to just one company. Why is it still being defined by Intel rather than a panel?
I blame a combination of poor enforcement of trademarks and a desire to let people support the latest specifications with the cheapest possible build cost.
The poor enforcement of trademarks being that there should not be a plethora of things claiming to be USB which are not spec compliant, nor should people be using names like "USB 3.2 Gen 1x1" vs the marketing name "Superspeed USB".
You also see this poor enforcement with other organizations, such as DisplayPort vs DisplayLink.
This spills over to problems with cabling, such as how much wattage is supported for power delivery or what speed data transfer is available.
Oddly Intel is the one stepping up here with Thunderbolt 4, because it is now more of a certification than a specification.
You can't blame the lack of "enforcement" for the ills created by bad standards. I'm fairly certain that two officially certified USB-3 cables can brick devices even though the physical form factor is the same.
Relying on cable internals for function is just as stupid as can be.
I think USB-C would have been so much better if the CC system were saner - for example we could have had a shift-register / UART-style, register-based API:
- Host queries some property and each cable along the way appends its value to that message and passes it along (see the toy sketch below).
- The endpoints either loopback if they're the new standard or pull down the pins if they're USB-2 only.
- It follows that if no message makes it back then we're limited to USB 2.0, and if a message does make it back, we can query/configure all elements along the path of connections. Right now this is impossible, which makes it impossible to detect if there is a non-50V-tolerant hub between two hubs.
- The programming/hw implementation for PD and alternate mode would also be non-insane in this model. Currently PD requires stupidly complex state machines, op amps, resistor banks or some autonomous IC to pull off and there are so many screwups because it's such an obtuse standard.
- Supporting alternate modes, and providing more diagnostic information would be much easier.
There should have only been 2 passive cable types - 2.0+PD and USB 3 gen 1 - everything else should require a smart tag on the chip.
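A toy model of that "each element appends its word and passes it along" scheme (entirely hypothetical, obviously not how CC works today):

    from dataclasses import dataclass

    @dataclass
    class Element:
        name: str
        max_speed_gbps: int
        is_legacy_usb2: bool = False

    def discover_path(elements):
        """Each element appends its capability word; a legacy element kills the reply."""
        message = []
        for el in elements:
            if el.is_legacy_usb2:
                return None                 # no loopback -> assume USB 2.0 only
            message.append(el)
        return message

    chain = [Element("host", 20), Element("cable", 10), Element("device", 20)]
    path = discover_path(chain)
    if path is None:
        print("USB 2.0 fallback")
    else:
        print("link speed:", min(el.max_speed_gbps for el in path), "Gbps")  # weakest link wins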
This so much. The current USB @&$#-up seems to be the result of "EEs try to expand line-level encoding to create an API."
This should all be solved at the negotiation layer, even if that needs to be made more complex, so that the remaining components can be simpler and reasonably-behaved.
Instead, we got something that allows each device to be a bit more electrically simple, at the cost of ballooning complexity for the ultimate use case.
USB-IF took their eye off the ball, and wrote a spec for manufacturers, without thinking about the consequences to consumers.
At some point, it's a value trade-off between {working product for use cases} and {+$2 on BoM}.
There is an added pair of pins (a single wire in the cable) in USB-C that allows the devices to a) detect the orientation of the plug and b) do extended power negotiation for the increased power capabilities of USB-C.
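In case it helps, the orientation detection is surprisingly low-tech; roughly (illustrative, not spec text):

    def detect_orientation(cc1_sees_rd, cc2_sees_rd):
        # The source puts pull-ups (Rp) on both CC pins of its receptacle; the cable
        # only wires ONE of them through to the sink's pull-down (Rd). Whichever pin
        # sees the pull-down tells the source how the plug is flipped; the other pin
        # becomes VCONN (power for the e-marker chip) if the cable presents Ra there.
        if cc1_sees_rd and not cc2_sees_rd:
            return "unflipped: mux the SuperSpeed lanes for orientation A"
        if cc2_sees_rd and not cc1_sees_rd:
            return "flipped: mux the lanes the other way"
        return "nothing attached (or one of the accessory/debug states)"

    print(detect_orientation(cc1_sees_rd=True, cc2_sees_rd=False))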
If you think USB is bad, try looking at all the various things that can be connected together with M.2 slots vs. the pairs of devices that will actually work together despite having the same slot. I have a degree in electrical engineering and after years I only just realized that there are actually PCIe M.2 storage devices that only support AHCI mode not NVMe.
> I have a degree in electrical engineering and after years I only just realized that there are actually PCIe M.2 storage devices that only support AHCI mode not NVMe.
To be fair, there were only a handful of those really early on when M.2 was just getting started and OS/BIOS support for NVMe wasn't universal yet. And hardly any of them were released as retail products; they were mostly OEM-only drives. All of those drives went out of production 3-4 years before the first host devices that require NVMe and can't work with AHCI started to show up: USB to NVMe bridge chips for external enclosures. So if you haven't encountered an AHCI M.2 SSD yet, you probably never will and knowing about them is just an obscure historical curiosity.
Consumers want one cable and connector for all devices, regardless of how ridiculous the notion is that all devices have similar enough electrical requirements to share one cable. The result is USB.
It's no accident that the USB group is called the "USBIF" -- "USB Implementor's Forum". This is different from, say, the C++ standards committee which is a combo of compiler & std library developers (i.e. implementors!), enthusiasts, and some who are best referred to as "users".
Not a criticism of the USBIF* just pointing out the motivations to underline your point.
* BTW I have plenty of criticism, this just happens not to be an example.
> It cannot be emphasized enough how much the recent USB specifications are dropping the ball.
This outcome was pretty obvious, it's literally in the name of the damn thing: UNIVERSAL Serial Bus. The only way to achieve that is to lower yourself to the lowest common denominator.
But that's the point, right? For a manufacturer "universal" means the lowest common denominator whereas for an end user "universal" means...well, universal.
>Up to that point in time, it was a good tech stack.
USB 0.9 was 1995, USB 1.0 1996. Two years later we had Bill Gates getting embarrassed by a Windows 98 BSOD on stage https://www.youtube.com/watch?v=IW7Rqwwth84. By ~2002 we ended up with 6 separate incompatible proprietary variations on micro USB. Pretty much every second camera vendor had its own "standard".
>USB-3.X cables features 8 wires .... and the other six carry data
Sadly just adding even more confusion to the mix. 24 pins, up to 13 signals, 4 pairs (8 wires) dedicated to USB 3.2 alone.
>Thanks to DP 1.4, the bandwidth requirement can be lowered via DSP lossless compression
Oh, it's very much lossy. Display Stream Compression (DSC) is "visually lossless" :--------) It works similarly to texture compression, but on blocks of lines instead of rectangles; it still destroys information by cutting color accuracy.
At the end of the day USB is still trivial compared to Bluetooth.
Yeah, the author accidentally swapped the new names. The rebrandings never changed the "Gen X" part.
That's what I don't understand: Why go with both minor versions and the Gen's in a single name?
They could have just retroactively rebranded USB 3.0 to USB3 Gen1, and then went USB3 Gen2, USB3 Gen3 for the 3.1 and 3.2 specs (2×2 is another mess [1]).
Tech people would have understood that the specs are 0-indexed and the marketing names (GenX) are 1-indexed. Consumers would only see the Gen on ports, cables and devices.
Still, I don't even understand why hiding the raw speed number from consumers is so important: short numbers are always better than vaguely similar marketing names (official equivalents below):
- USB3 (5Gb)
- USB3 (10Gb)
- USB3 (20Gb)
- USB3 (40Gb) or (2×20Gb)
Done.
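For reference, the official spellings of those tiers next to the raw speeds (the 40G one lives in USB4, not USB 3.x):

    USB3_TIERS = {
        "USB 3.2 Gen 1x1": "5 Gbps",    # the artist formerly known as USB 3.0 / 3.1 Gen 1
        "USB 3.2 Gen 2x1": "10 Gbps",   # formerly USB 3.1 Gen 2
        "USB 3.2 Gen 1x2": "10 Gbps",   # two 5G lanes, Type-C only, rarely seen
        "USB 3.2 Gen 2x2": "20 Gbps",   # two 10G lanes, Type-C only
        "USB4 Gen 3x2":    "40 Gbps",   # the 40G tier is USB4, not USB 3.x
    }
    for name, speed in USB3_TIERS.items():
        print(f"{name:<18} {speed}")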
I still think it's all worth it because we have a single, reversible port for 5V, or usb2/3 (gen1).
The confusion created by PD/3.1/dp/hdmi/tb/etc is mostly irrelevant to me, and I think the clusterfuck there is real but overblown (so long as pd devices will charge slowly from normal 5V c ports/cables, and usb3.whatever will fall back to usb2).
I can charge my travel router, dji gimbal, gopro, android phone, ipad, and macbook pro (albeit slowly) from the same cheap cables. Neewer battery chargers are now dual micro/c, too.
Even if it were just a 5v connector it would be a win.
I believe there was a charger that would negotiate a higher voltage, but didn't notice when you unplugged the far end of the cable and kept the near end plugged in. Then you could attach a new device and possibly fry it. I'd file that under "poor implementation".
Depending on your job description, you might see that as a feature ....
BOFH: "Oh, the user connected an unauthorized USB device; let's see, how it handles 20V"
And that's just the stuff that makes a good-faith effort to the specs. My wife bought me an LED light panel for Christmas so I wouldn't be so grouchy in the winter (gift for me or her: you decide).
It uses a USB mini connector for a 19.whatever volt power supply. Don't plug that into your external HDD by mistake.
Yes, I know USB mini is obsolete, but we still own a couple devices that use it. And the plug is stamped with the USB logo.
Seriously, what is going on with USB? I bought a charging cable for my phone once that supports fast charging. The connectors all fit but no fast charging? Shipped it back and bought another one - what a waste of resources.
They really need to use semantic versioning and be forced to clearly display that version number on the cable or connector in a legally defined, readable size. In fact every cable type should have that. I recently bought a 2K 144Hz screen and couldn't get the display manager to set 144Hz. Then I looked up the standard and found out HDMI 1 doesn't support 144Hz but HDMI 2 does. Not a single cable in my house has the version number written anywhere and I had to try them out until I got the right one.
What I would really appreciate is:
Cable X not compatible with Cable Y --> different major versions and completely improved specs, power requirements, AND different shape
Cable X.X is compatible with Cable X.Y but has different functionalities --> same major version, different minor version, same shape. Easily understandable by users "First thing I see is a big X, it fits with whatever I'm trying to connect with, it should work". If it doesn't then they can look closer and go "ooh.. the small Y is different. I should get Y or Y+n".
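i.e. something like this, just restating the rule above (entirely hypothetical, of course):

    from collections import namedtuple

    CableSpec = namedtuple("CableSpec", "major minor")

    def fits(a, b):
        return a.major == b.major        # different major -> different shape, won't plug in

    def feature_level(a, b):
        if not fits(a, b):
            raise ValueError("won't even plug in")
        return min(a.minor, b.minor)     # you get the lower minor's feature set

    print(feature_level(CableSpec(3, 2), CableSpec(3, 1)))   # -> 1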
Yes, TB (both 3 & 4) cables always have all USB features. They can still be 3A or 5A for power delivery though.
As far as hubs are concerned... they're designed for TB uplinks; AFAIK most silicon that exists can fall back to USB3 with significantly reduced functionality but you're gambling again at that point. I'd recommend against it.
Yeah, people want to have their cake and eat it too. They want every device to support and pass through every modern feature, the highest speeds, and the fastest charging, but they also only want to pay less than $5 per USB interface, and they want full backwards compatibility with all historic devices. It's just not possible, we are pushing the boundaries of what's possible physically. Apple solves the problem by over-provisioning everything; but as you note, Apple products are expensive.
Additionally, there are hundreds of manufacturers making equipment that uses the USB ports, but which don't follow the specs beyond what's needed to "make it work", and they can't be stopped through legal or trademark enforcement as they're largely foreign, and they just fold and re-appear as a different company if ever challenged legally.
Seems like a well worn path: either pay more for the full-featured option, pay less for the minimal-feature option, or do the research to find the cheapest option which does everything you need.
Hopefully we'll see more Thunderbolt 4 silicon from different manufacturers as USB4 rolls out. Seems like we'll probably see that over the next year or two if it's going to happen.
It was perfectly reasonable to allow partial implementations among peripherals and host controllers to let cheap and simple devices exist. But allowing different cables was a mistake. Every single USB-C cable manufactured today must be required to support Thunderbolt, high-current charging and everything.
And incapable of most common USB use-cases even at the time.
Also, "simple" has to mean something else in your world because I distinctly remember the onus of having to figure out the right baudrate, stopbit and parity bit configuration. Not to mention devices which used TTL level RS-232 and wouldn't work with other devices DESPITE having the same connector and standard.
Plus the D-sub connectors were standard, but anything regarding their wiring was handled by different standards. So just putting connectors where they fit wasn't enough to be sure that it would actually work and could even fry one or both ends of the connection.
With USB, as long as the both ends and the cable are built to spec, it's safe to mix and match. You'll just end up with the lowest common denominator of functionality.
If one ever catches on, people will start wanting to use it for more than it was designed for, and it will end up being extended to the same extent USB has
We used to have the "slow port/fast port" dichotomy with USB vs FireWire, but people were too cheap to pay for the fast port and eventually the slow port got fast enough to replace the fast port.
Also, are you going to put two ports on every phone, tablet, etc.? In times when even ultra expensive high margin devices like iPads / iPhones are dropping ports to save money?
As annoying as it is, especially when docks/hubs come into play, part of me feels like having one solution that at least semi-works for maybe 80-90% of use cases is worth having to debug silly issues sometimes.
Have a feeling we'll end up with an xkcd standards [0] situation if someone comes along with a competing protocol.
I'd be really curious to understand the origin of this "USB required 3 tries to plug in" thing.
It seems this is something everybody agrees with nowadays, but was it always like that? After all, the USB-A plug has a prominent logo (distinguishable both by sight and by touch) that identifies the up side. It should be dead simple to do, no? What went wrong?
- Did some laptop manufacturer mount USB-A receptacle upside-down?
- Or was it micro-B that triggered this? (for example the Samsung/Nexus S and LG/Nexus 4/5 have their micro-B receptacles mounted upside-down relative to one another)
- General feeling of inferiority from Apple's Lightning plug?
Half of the time the logo is missing from the cable or flash drive (especially designer drives). Another problem is that not everyone is using laptops even today, and that meme originates from a time when laptops were much rarer and less usable. On a PC you just don't see what's happening in the back when you're plugging in cables, especially since most of the time the ports are vertical.
And the meme itself probably comes from the fact that a USB-A port needs rather precise positioning to connect. Even if you're inserting the connector the right way from the start, you will most likely hit the metal part of the connector against the metal parts of the port on the first try, and that will cause you to doubt yourself. You then turn it (incorrectly) and fail again, turn it again (correct now), try connecting more slowly and precisely, and only then does it fit :)
I think the second one is really the answer. Cheap laptops of the day had a lot of tolerance in the housing that led into the port, so if you were slightly misangled you'd get past the chassis but hit the edge of the PCB receptacle. I think newer devices tend to have a rounded lip or some mating feature between the chassis and the PCB-mounted receptacle, so it's less of an issue. Higher-end models often only have the tongue as part of the PCB, leaving the outer bit to the chassis.
The absolute worst was a Dell laptop I had around 2008 with a combination USB-A/eSATA port. If you tried to plug a USB connector in upside down, you were basically connecting the USB pins of the device to the SATA part of the port and the computer would inexplicably reboot.
> Half of the time logo is missing from cable or flash drive
The other half of the time the logo is faintly embossed in black on a black substrate that your ambient lighting is just dim enough to render not visible.
Some manufacturers indeed mounted the port upside down (not typically on laptops, but on other devices). When the port is vertical or even in a horizontal surface, people can have trouble working out which side is meant to be up. Some manufacturers (Logitech was one) made their cables with an embossed logo of their own on the side opposite the USB logo, sometimes even without the USB logo. Pretty much nobody knew about the logo thing.
Even without all that, the feeling of being slightly misaligned with the port and the feeling of trying to plug it in upside down are hard to distinguish, with many port mountings, so people assume they've got it wrong and switch orientation.
USB-A requires some force to be plugged, and most users are "gentle" when plugging (not using too much force). So what happens is, the user tries it first in the correct orientation, but it doesn't enter (because the user was too gentle). The user flips to the wrong orientation, and uses a bit more force (but it won't enter because it's upside down). The user then flips back to the first orientation, and uses even more force; this time, the force is enough, and the plug enters the socket.
I have ~2 devices with upside down usb ports, I usually look for the holes in the connector instead of the logo to see which side is up.
The major feature of USB-C isn't that it's bi-directional as in "you don't have to try 3 times", it's bi-directional as in "it's both a host and a peripheral port", but somehow everyone seems to forget that. That's why it's a C connector: it's both A and B.
Anecdata: My mouse is connected to my PC in the front, so I just reached to check the logo and I'm feeling a logo on both sides of the connector. So I unplug it to check and it has the USB logo on the top and the Genius logo on the bottom. C'est génial! <facepalm>
I think it's because USB flash drives don't have the logo, or might have a company logo on the "wrong" side. In some cases the USB receptacle would also be mounted vertically.
- That USB connector in some dark recess of your car.
The ergonomics of reaching it from the driver seat while fully seated in that seat must be an interesting study.
I owned one car long enough to be able to get the connection on the first try multiple times.
On the USB-A plug you can see inside, so if there's no logo you can go by the plastic tab inside the connector, usually white or blue plastic. The tabs have to be on complementary sides.
It happens every single time you plug something into a USB-A port, I'm surprised you haven't witnessed it. They are very often upside down or sideways, and the mechanical design gives you no feeling/feedback that it's going in the right way; it feels the same until it connects fully.
When USB 1.0 was first introduced, all connectors it replaced had to be plugged in the right way up, so non-reversibility wasn't seen as a problem at all.
USB 1.0 needed to be cheap and simple to be adopted. Wrapping a bit of metal around a PCB was cheap and simple.
It was then as it is now- A connectors are four dimensional. The only difference is that back then you knew you had it right on the third attempt because your PC would reset.
I just wish the USB port could have stayed the uniform size of the USB 2.0 connector, not that of USB Type-C. I feel the small size is not convenient and is mechanically weak.
This is particularly noticeable with things like Thunderbolt docks, where knocking the cable slightly can cause connectivity to the dock to blip momentarily, which has the knock-on effect of a 5-10 second delay while everything gets straightened out again, killing display, network and any other downstream devices for the duration.
In my experience, this also happens often with older/worn-out USB-A ports. What's worse is that, unlike USB-C, the springs on USB-A are on the socket, so replacing the cable doesn't help.
Cables with power delivery have a single wire for power negotiation (CC) and a single pair of power/ground. The connector, on the other hand, has two CC pins (one of which is connected to the CC wire, and the other is not) and two pairs of power/ground. It also has a few other things - a pair of wires for weird low speed uses (SBU) and an additional pin for supplying power to the cable in case the cable itself has a chip in it.
I would go against the introduction (USB is simple and reliable): it took about 10 years to stabilize in Windows. During that time serial (RS232) devices were more reliable than USB ones.
That is mostly an issue with Windows' device driver architecture and not with USB itself. Windows tries hard to assign a unique and persistent internal "name" to every device it has ever seen; the resulting device record then has a bunch of named attributes which are used to match the correct driver to it. In the context of USB there are two problems with this otherwise pretty clever and universal design: there are pretty common edge cases where the "unique and persistent" name is neither, and there are drivers (including drivers supplied by MS as part of Windows itself) that match on the wrong attributes.
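To make that concrete, this is roughly the shape of the identifiers involved (VID/PID/serials are made up and the format is from memory, so double-check before relying on it):

    # A device WITH a serial number gets a stable instance path:
    with_serial    = r"USB\VID_0781&PID_5583\4C530001230908117433"
    # A device WITHOUT one gets a path derived from which port it's plugged into,
    # so moving it to another port makes Windows treat it as a brand-new device:
    without_serial = r"USB\VID_0781&PID_5583\5&2d84f9c&0&1"
    # Drivers are then matched against hardware/compatible IDs such as:
    hardware_id    = r"USB\VID_0781&PID_5583&REV_0100"
    compatible_id  = r"USB\Class_08&SubClass_06&Prot_50"   # mass storage, bulk-only transport
    print(with_serial, without_serial, hardware_id, compatible_id, sep="\n")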
Honestly, you're not wrong. It seems going back to having a separate plug for thunderbolt would solve a lot of problems. Effectively, you already need to buy special cables for it, so they might as well look different to prevent confusion.
Going back? Thunderbolt 1 and 2 both used Mini DisplayPort connectors. I don't disagree that having a separate plug may be sensible, but all iterations of Thunderbolt so far used a pre-existing connector.