This is a great answer, but IIRC there's another reason to prefer micro over mini: the moving parts in the micro-USB connection are in the male plug, not the female port as with mini.
Since those parts tend to break easily, it's cheaper to replace the cable than the smartphone/device that has the female port.
That being said, I'm still a bit mad at Apple for ditching their old large port, only to replace it with another proprietary standard.
Apple's port, which doesn't have an up/down direction, is much easier to plug in than the micro USB port, which does. It is an innovation, like it or not, that USB doesn't have.
It is a good innovation indeed. That being said, I'd rather deal with the minor annoyance of having to figure out which way to plug in my USB cable than have to carry two sets of cables/adapters to accommodate all my devices.
Apple sells a micro-USB to lightning adapter (http://store.apple.com/us/product/MD820ZM/A/lightning-to-mic... ; as low as $2-$5 for third-party versions) that allows them to comply with European micro-USB charger mandate, but it's so small I'd be more worried about losing the adapter than having to carry around another cable.
Yes, but sadly one particular very large Korean manufacturer whose name begins with an S thinks that the "proper" orientation of a micro USB port places the narrow/tapered side "up" instead of "down". And they print all their cables opposite of everyone else. Sigh.
But even so, there's no tactile difference. So cabling a phone in the dark (pretty routine at bedtime) is a crap shoot either way. I personally solve this with a bedside dock, but it's a legitimate fault.
It's a great, great design when considered in a vacuum. Of all the cables I use (and I use many), it's the one I would choose if I could magically make that what all my devices use.
But, when taken in context of the real world, where only Apple devices have this svelte and clever and proprietary, closed, licensing-hostile connector... on balance, it fucking sucks. I would like my iPhones and iPads significantly better if they would just grow up and use micro-USB.
Mini-USB almost got there, but due to the flaws detailed in the OP, couldn't really become The Standard. But Micro-USB did. Virtually everything uses it now. My bluetooth headsets, my wireless mouse (to charge), my Geiger counter, all my non-Apple phones and tablets, my pocket Wi-Fi terminals, my music players, my exercise gadgets, etc etc etc...
Every time I plug in an Apple device these days, I grimace and feel like I am driving this really awesomely designed car that looks great, handles great, is really comfortable... but only runs on this special gas that I can only buy at one gas station in town, that takes holidays at random.
The Lightning connector is a lot better than the gargantuan, ugly, hard-to-use Dock Connector of yesteryear (or even this year, if you are buying the cheap iPhone). But the time for proprietary non-standard connectors for mobile devices is past.
If I had to use a micro-USB charger to charge my iPhone, I would die a little inside every time I plugged it in. Every single USB connector ever is a PITA to plug in. The Lightning connector on my iPhone is amazing. I can plug it in purely by feel, in my dark bedroom after turning out the lights. I dare you to try that with a USB connector. And the damn thing feels really sturdy too. I have no problem picking up my iPhone by the cable (not that I do this regularly), but I would never trust a USB connector to keep my iPhone from falling to the floor.
I'm just saying that for me, pretty-good-and-completely-ubiquitous beats sublime-design-but-have-to-worry-about-having-the-right-cable. It is indeed easier to plug in the Lightning phone charger in pitch-black darkness. But I haven't needed to do that yet in 2013.
What I have needed to do is ask a bunch of people if they happened to have an iPhone cable -- oh no, sorry, not that one, I mean the newer one, thanks anyway... no? nobody? Ok fuck it I guess I will just turn off my phone to save that last 3% for an emergency.
That wouldn't happen if I just needed a standard micro-USB cable that pretty much every non-Apple device now uses.
Not surprisingly, I guess, I don't think it was a misstep; one of the reasons I love the Mophie cases is that they add a normal, standard port to the iPhone.
These are trivial to plug in purely by feel in the dark.
And as a bonus, they charge many of my devices! I "die a little inside" every time I have to use a special cable for an iDevice (or a Samsung tablet!)
Interesting that you consider it an advantage to have a connection sturdy enough to pick up the device by the cable. You probably don't care much for magsafe adapters...
Lightning is extensible, thereby preventing the "in fifteen years once every single hotel and gym has adopted it we will have to force them all to change again" problem which affects both the dock connector and micro-usb.
Those that think lightning connectors are just a pain in the ass haven't thought it through.
Additionally, it is one of the best physical connectors ever designed - a good choice for a reprogrammable interface that we are hopefully stuck with forever.
Lightning has one specific feature that allows it to easily be bidirectional. The pinout can be changed via software, to adapt to any occasion, and essentially manufacture a new plug with a firmware update. This means orientation doesn't matter (without duplicating pins). This is a bit harder with USB, which has specific purposes for each pin.
It's not that hard to simply swap all the pins in hardware if the connector is reversed. You don't need to be able to reassign every single pin to make that possible.
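For illustration (a toy sketch only, not any real firmware), a reversible plug just mirrors which plug contact lands on which socket contact, so a single fixed remap restores the expected pinout once the flip is detected; nothing needs to be software-assignable per pin:

```python
# Toy illustration (not real firmware): flipping a reversible plug simply
# mirrors the contact order, so one fixed reversal undoes it. No per-pin
# software reassignment is required.

def socket_view(plug_signals, flipped):
    """Signals as seen by the socket, given plug orientation."""
    return list(reversed(plug_signals)) if flipped else list(plug_signals)

def normalize(plug_signals, flipped):
    """Apply the same fixed reversal once the flip has been detected."""
    seen = socket_view(plug_signals, flipped)
    return list(reversed(seen)) if flipped else seen

pins = ["GND", "PWR", "D+", "D-", "ID"]
assert normalize(pins, flipped=True) == pins    # flipped plug, same pinout
assert normalize(pins, flipped=False) == pins   # normal insertion too
```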
Lightning is overdesigned which makes it expensive.
You could say that the previous iPod dock connector was over-designed. But it wasn't, because it survived many years and many devices, with only a handful in between (like some iPod nanos) not being amenable to it years after its creation. We do not yet know what fate awaits Lightning. The odds are certainly tilted against it. Not only is Android popular and universally using Micro-USB (unlike when the first iPhone came out and phones had all kinds of plugs), but "smart wires" are increasingly antiquated: if you want to pipe music from your phone to your car, wouldn't you rather use WiFi than a cord? Sure, you need to plug in to charge much of the time, but now you can hand the thing to someone in the backseat and let them change songs. And any dumb USB port will charge any phone these days, which is good enough.
It clearly didn't? I'd say the iPod dock ecosystem and all the optional cables (like VGA, HDMI, digital out etc) more than justified the cost of the 30 pin connector.
> if you want to pipe music from your phone to your car, wouldn't you rather use WiFi than a cord?
No for several reasons (and since I haven't seen a single car with wifi music built-in, I'm going to assume you're abstracting wifi/bluetooth here into wireless):
1) Wireless drains batteries and I may not have a charger
2) Wired = charge-capable, likely charging.
3) Wireless setup requires authentication and configuration, often from the car itself. My 2012 Sienna is a great vehicle that is comfy on a road trip for 7 people, but its BT implementation is a godawful nightmare. Setup has to be initiated from the car, can only be done through voice, and only when you're not driving (even by passengers). Also, it tends to "lose" configs every once in a while. Meanwhile, the USB port just plays whatever's connected to it (we use mainly iDevices, so not sure about droids) and the in-wheel controls work fine.
No, because my car is going to LONG outlast the proprietary and most likely terrible app that my car's manufacturer would write for the current generation of one particular phone.
Cars should never under any circumstances ever have specific apps like Spotify, Pandora, and Bing. In 15 years, people are going to feel pretty stupid about owning those. The 3.5mm TRS connector, however, is a long term open standard that will still be working long after Pandora folds.
> Cars should never under any circumstances ever have specific apps like Spotify, Pandora, and Bing. In 15 years, people are going to feel pretty stupid about owning those.
Isn't that the point? It's shiny at the time you buy it, but it quickly feeds planned obsolescence, increasing your desire to replace it later. Win-win for the manufacturer.
Really? Could you provide a source for this? I was under the impression that the 30-pin connector was (yet another) proprietary Apple hardware standard.
I'm struggling to find any source that directly confirms it, but it's most certainly the same plug used in this standard, and the Dell Streak which conforms to it.
The standard is for the specification of the pin layout not the plug. As all the links on the wikipedia page are broken and I can't remember the name of the plug itself, we're at a loss here.
PDMI certainly seems to be a copy of the Apple 30 pin connector, but it was introduced long after it. It wasn't electrically compatible, of course. To add confusion, certain Samsung tablets used a physical PDMI interface which was not electrically compatible with either PDMI or the Apple thing.
Overdesigned? Really? I'm not sure; it accomplishes the same thing as MHL, Slimport et al (which also tend to be expensive) and will be able to support USB3 when the time comes. It's far more futureproof than the old 30 pin thing.
It was underdesigned out of the box. From the documents I've read, USB 3.0 requires three differential pairs, whereas Apple only provides two with their proprietary connector. Plus, the built-in DRM means that it's a lot more difficult for hobbyists/experimenters to hack on it.
I'm sure they are doing this to preserve their patents, what is the point of an open standard if no one is allowed to use it?
I would really love to see a magnetic bi-directional cable eventually for phones/tablets. The Surface one is OK, but it doesn't snap as nicely as I think it could.
> I am mad at Apple for not making Lightning an open standard.
Lightning is just Apple's marketing term for Intel's Thunderbolt interface. The standard is available from Intel, and if it were not it would be more appropriate to direct your anger at Intel.
Thunderbolt is Intel's external, hotswappable PCI-e interface found on many Apple laptops. Lightning is Apple's proprietary interface found on iOS devices. Aside from being named after different aspects of the same natural phenomenon, they have nothing in common.
Well, being named after the same natural phenomenon is confusing enough. I often have to stop and think which is which. It's an understandable mistake.
You must have mistaken Light Peak for Lightning, or Thunderbolt for Lightning. Light Peak = Thunderbolt, which competes with USB 3. Lightning is completely different.
Yes, let's just limit ourselves to standardized connectors, innovation sucks and we are tired of change. Now please get off my lawn. /sarcasm
Just let the free market decide things like this, it is extremely efficient at that. Right now, Apple still sells loads of iPhones without using a standard connector. And frankly, I think my wife's iPhone 5 plugs in a much better way than my more standard Nokia 925 does.
> Just let the free market decide things like this, it is extremely efficient at that.
Yes, it absolutely did not take an EU intervention to standardise mobile phone chargers on Micro USB; instead, the manufacturers recognised that it is a good thing to get rid of proprietary chargers, as consumers all only bought the phones with the standardised chargers.
Or are you saying that needing a new, different, charger for your phone is actually a good thing™?
Apple wins, while the rest of us get stuck carrying around a wide variety of cables for no good reason.
The argument that the free market will find a solution works when competition is allowed. It doesn't work when every company who isn't Apple is legally prohibited from adopting Apple's connector.
There is quite a good reason to carry around an apple cable. The connector is just better than micro-USB. The better arguments are that Apple should share their innovations with the rest of the industry so they can all have good connectors, but why should Apple even do that?
Actually, the question isn't why Apple should share their innovations, but why the rest of us should enforce Apple's desire not to share. Would Apple really have refrained from developing Lightning without the ability to control it? Are we better off granting them this power?
Apple would have refrained from developing Lightning if it didn't give them a competitive advantage. I think that is obvious. Why would Apple just donate Lightning to the rest of the industry? How would we as consumers reward them for such altruism?
That I give Apple the power to develop innovative proprietary solutions is a choice I personally make and I want the freedom to make. Why should everyone be forced to buy Android? So the idealists don't feel bad about their standard but hard-to-use connectors?
> How would we as consumers reward them for such altruism?
Even Apple can't invent everything they need themselves. They would benefit from a patent-free ecosystem by being able to adopt other manufacturers' advances, just as the other manufacturers would benefit from being able to adopt Lightning.
It's impossible to prove that patents are a net win for anyone but patent lawyers.
> Anyways, if apple's solution is simply better than the android micro USB solution, who wins?
I win, because I don’t have to buy a new charger for every phone, because I can use standard USB cables to charge my phone and because said USB cables are ridiculously cheap; allowing me to carry one cable to connect my phone, my camera and 90% of my other gadgets to my computer (the rest using eSATA).
Frankly, I don’t care whether Apple’s solution is ‘simply better’ if it is only available for Apple’s products and hence necessarily restricted to a small percentage of the overall market.
Even better, let's create a new proprietary system for every innovation (or arbitrary change), because the market is great at forcing interoperability between proprietary systems. Why wouldn't it be? The alternative would be a monopoly, or a broken, inefficient industry - and avoiding that is always primary in each individual actor's mind.
And frankly, the picture on my eTeeVee(TM) is much better than the picture on my Samsung, even though it isn't compatible with OTA or cable standards.
So you'd rather force the decision on apple rather than let consumers decide? Or are you claiming that apple has a monopoly on smart phone designs that don't suck, and they are abusing that?
Yes, it is a nice plug. No, it should not be proprietary. While Apple are entirely within their rights to keep it proprietary, it is in everyone else's disfavour, particularly their customers.
It is efficient at generating the best outcome for the company in isolation, but that may not be an optimal outcome for society as a whole - externalities are typically not priced into the product.
For example, consider the 51,000 tonnes of redundant chargers that were estimated by the GSMA to have been distributed in 2008 prior to the common charger initiative[1]. One can argue that the costs of landfill, carbon impact (est 13-22 million tonnes/yr), pollution effects etc. should be priced into the product, but the reality is that they are not.
I have no view on the iPhone connector vs micro-USB, but standardization has enormous benefits for consumers and society as a whole, and typically it takes an external agency to "encourage" it.
[1] GSMA analysis from UNEP, Gartner, European Commission Integrated Product Policy Pilot on Mobile Phones, University of Southern Queensland data.
Regarding redundant chargers and their carbon impact: wouldn't it be wonderful if, with every non-Apple device using micro-USB, devices didn't all have to come with a USB to micro-USB cable and a mains-to-USB transformer?
Surely many of us have drawers full of these bits and only really need one or two to keep all our devices topped up.
Unfortunately, they keep increasing the minimum charge amperage on devices, so the micro USB charger I got with my first android phone won't charge my newest one.
Their options were (a) abandon video out and analog audio out or (b) keep the old 30 pin, which was really showing its age or (c) adopt either MHL or Slimport, which are in a format war and have significant downsides (MHL requires external power unless the display provides it, Slimport is niche and adaptors tend to be extremely expensive). And then, in the iPhone 7, when MHL or Slimport or something else has won, change to that again, breaking everything.
Not to mention the upcoming transition to USB3, which lightning will survive but current Micro-USB will not.
Finally, micro-usb ports are pretty delicate, and a common cause of phone breakage. The lightning port is far more robust.
To me, the biggest failing of micro-USB isn't the up/down thing, but its inability to back-power the same port that is being used as a host. That is, if I have a device with one port, I can't plug that device into a hub where it hosts devices and gets recharged at the same time. This is becoming an obvious flaw in the day of tablets-as-laptops - you can't make a "docking station" using a single USB port. Meanwhile, the iPod has been sporting docking stations that provide both audio/video out and charging since before the smartphone war.
Sure you can. USB has a dedicated pin for sending power down and it's been an integral part of the design since the start (USB 1.0). How do you think USB pen drives (et al) work when they have chips which need powering? Or USB "soundcards", wireless dongles, USB keyboards and mice? They all need powering and they all need powering while in operation.
You might not be able to draw higher charging currents while using data - I've not followed the spec that closely - but you can supply at least enough current to keep the phone running even if you cannot recharge it.
The reason you don't get docking stations for other phones is because every sodding phone puts their USB socket in a different place (and often it changes from model to model from the same manufacturer). This means whatever dock you build will automatically be fugly compared to iPhone/iPod docks because you'll need a USB cable. Which is why any such dock will either have a 3.5mm audio jack (TRS) and thus support any portable device, or use bluetooth for a wireless solution.
I think you're missing the issue here. The examples you've given are all with the 'host' powering the 'device'. e.g. Your computer powering your USB keyboard. 'Host' and 'Device' are specific modes of operation in the USB spec, it isn't multi-master.
The problem is when you want the 'device' to power the 'host', as is often the case when the 'host' is something with a battery, such as a tablet. There is no support built into the USB spec for this, it is always assumed that the 'host' provides power to the 'device'. An example of something that isn't supported would be a tablet 'dock' with a bunch of device connectors that could also charge the tablet, all via a single USB link.
> I think you're missing the issue here. The examples you've given are all with the 'host' powering the 'device'. e.g. Your computer powering your USB keyboard. 'Host' and 'Device' are specific modes of operation in the USB spec, it isn't multi-master.
That's the only way you'd ever want to power a USB device though. It makes no sense to power the host from the device. Plus this isn't even the case with iPod docks - they're all powered from the mains.
> The problem is when you want the 'device' to power the 'host', as is often the case when the 'host' is something with a battery, such as a tablet.
The host on an iPod dock isn't the iPod; it's the dock.
> There is no support built into the USB spec for this, it is always assumed that the 'host' provides power to the 'device'. An example of something that isn't supported would be a tablet 'dock' with a bunch of device connectors that could also charge the tablet, all via a single USB link.
In this example, the dock is the host and the tablet is the device. So the tablet would still be charged.
The thing with USB is, it doesn't matter which device is the server and which is the client at the software end, as that's just an arbitrary software paradigm. You can have the USB host act as the client at the software end and the device as the server (e.g. a music dock will be the host, but the phone is the server and the dock the client); all you need is for the two devices to be able to speak to each other. And since USB is a two-way protocol (else USB storage wouldn't work), it's pretty easy to have the host as the power source and the device as the item requiring USB power even when the device is the one sending signals back to the host (e.g. how USB mice and keyboards work).
> That's the only way you'd ever want to power a USB device though. It makes no sense to power the host from the device. Plus this isn't even the case with iPod docks - they're all powered from the mains.
You're still not getting it. I want the dock to power the tablet so it charges; that the dock gets power from the mains does not change that at all.
> The host on an iPod dock isn't the iPod; it's the dock.
You can't make this statement definitively. If one of the features of the dock is a keyboard, it's going to be acting as a HID interface device and will be acting in device mode....
> The thing with USB is, it doesn't matter which device is the server and which is the client at the software end as that's just an arbitrary software paradigm.
This is wrong. A 'device' can never initiate a transfer; all transfers are negotiated by the 'host'. The 'host' is the bus master and manages all transfers regardless of what they are or where they are going. I.e., USB is a 'polling' interface, not a 'pushing' interface. You cannot have more than one host on a USB bus.
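That host-driven discipline can be mimicked with a toy model (illustrative names only, not a real USB stack): a device can only stage data in an endpoint buffer, and nothing moves until the host polls it:

```python
# Toy model of USB's polling discipline (illustrative, not a real USB
# stack): only the host initiates transfers. A device stages data in an
# endpoint buffer and waits to be polled; it has no way to push.

class Device:
    def __init__(self):
        self.in_endpoint = []            # data waiting for the host

    def stage(self, data):
        self.in_endpoint.append(data)    # staged, not delivered
    # Deliberately no Device.send_to_host(): the model has no such path.

class Host:
    def poll(self, device):
        """The host asks; only then does data move."""
        if device.in_endpoint:
            return device.in_endpoint.pop(0)
        return None

mouse = Device()
mouse.stage("button-press")
host = Host()
assert host.poll(mouse) == "button-press"   # delivered only when polled
assert host.poll(mouse) is None             # nothing staged, nothing moves
```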
> You're still not getting it. I want the dock to power the tablet so it charges, the dock gets power from mains does not change that at all.
Actually it's you who's missing the point. I've already told you that that does happen; it's just that some handsets require a higher charging current than others - and docks will only ever output the lowest required voltage/current, because they wouldn't want to damage other hardware that is designed to pull less power. IIRC iPod/iPhone users had similar problems when Apple changed the input power rating for newer devices a few years ago.
> You can't make this statement definitively. If one of the features of the dock is a keyboard, it's going to be acting as a HID interface device and will be acting in device mode....
lol How many music docks act as a keyboard? Plus there's a contingency for that in the spec anyway as you can run more than one USB interface concurrently over a single USB port.
> This is wrong. A 'device' can never initiate a transfer; all transfers are negotiated by the 'host'. The 'host' is the bus master and manages all transfers regardless of what they are or where they are going. I.e., USB is a 'polling' interface, not a 'pushing' interface. You cannot have more than one host on a USB bus.
You're missing the point. It's just a software specification. Once the devices have linked, you can programmatically design the software to work whichever way you want. As I pointed out before, there's plenty of USB devices that behave this way already; devices that -from the user perspective- both initiate and receive data.
Isn't that limited to 0.5 amperes @ 5V? I believe it's not enough for all devices (or to charge them quickly enough).
I believe there's a hack in the apple products that can be used to send more current through USB but it's not standard. I might be wrong though, it's been a while since I looked into that.
Also, it's the reason why certain devices (notably external hard drives powered through USB) come with two USB connectors in a Y configuration, to be able to draw enough current from the host. At least that's how I think it works.
As a baseline it's limited to 100mA @ ~5V (150mA for 3.0). Devices can request additional power from the host, 500mA max for 1.x/2.x or 900mA max for 3.0. The host isn't required to be able to supply this much current; it can say 'no'. There are also some quasi-standards, and I believe now a real standard, for 'USB charging' which allows higher current (and voltage, I believe).
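Those request sizes come from the configuration descriptor's bMaxPower byte, which (to my understanding) is in units of 2 mA for USB 1.x/2.0 and 8 mA for USB 3.0; a quick sketch of the conversion:

```python
# The bMaxPower byte in a USB configuration descriptor encodes the
# current the device requests: units of 2 mA for USB 1.x/2.0 and
# 8 mA for USB 3.0. The host is still free to refuse the request.

def requested_current_ma(b_max_power, usb3=False):
    """Convert a bMaxPower descriptor value to milliamps."""
    unit_ma = 8 if usb3 else 2
    return b_max_power * unit_ma

assert requested_current_ma(50) == 100              # the 100 mA baseline
assert requested_current_ma(250) == 500             # USB 2.0 maximum request
assert requested_current_ma(112, usb3=True) == 896  # near the 900 mA 3.0 cap
```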
Speaking of up/down and Apple, I saw today that Apple's ethernet ports are upside-down. They're supposed to have the contacts at the top, to avoid collection of dust when the port is not in use.
Interesting, I didn't know that! I've always liked the pins-down configuration on my Macbook since in order to release the connector, the clip is on top (that is, I don't have to lift the laptop/etc in order to remove the cable). Connector orientation stuff is fascinating (my favorite being the upside-down configuration of power outlets in new buildings, apparently so that if a plug is partially pulled out, a straight metal object falling atop won't sit on live pins).
Sure, but once you've chosen to orient it a certain way for a large class of devices, you had better be consistent, especially if one needs to remove it without being able to see it. The last thing you want is users complaining about how their MacBook Pro is one way and their iMac is another.
From a user-perspective and not considering cost, Lightning is undeniably superior. With Amazon Basic and Monoprice options getting closer to $10, cost is becoming less of a factor. I would much rather use the superior option. Inserting Micro-USB many times per day is maddening. Who wants that?
I only ever had one mini USB cord go bad, and that was a minor pain. But I have had the micro USB port on a cell phone become unreliable, which is insanely frustrating. The post holding the contacts no longer wants to push them against the cord, so it frequently loses the connection.
I can solve the problem temporarily by pushing the post back with a small screwdriver, but "temporarily" here is measured in minutes or hours.
Exactly the same experience here, except that I never had a mini-USB failure. After roughly 6 months of use, the USB receptacle of my LG smartphone was just not holding on to any cable anymore. I sent it back to LG, and they replaced the part containing the micro-USB port.
On an unrelated note: LG noticed that I was running a custom ROM but didn't make an issue of it, while technically they could claim that such an action voids my warranty.
Actually, if you're in the US the Magnuson-Moss Warranty Act covers you in the case of customizations [1]. Under this law, manufacturers are only able to claim your warranty as void if the modifications are what caused your warranty claim. I doubt they would be able to claim that your custom ROM caused your hardware failure on the USB port, so legally they had to fix it. More info on this FTC law [2].
I'm in Europe, but it may well be that a similar law is in effect here. However, a lot of consumers (me included) wouldn't know about that, and companies try to get out of repairs that they are legally required to perform.
Take a look at Apple's warranty: they claim to provide only 1 year of warranty, while European law requires them to provide two years. They will repair your device in the second year (it's the law!). However, I bet their confusing labeling etc. will cause a lot of people to believe their warranty has expired. Those people will either not ask for a free repair or pay their shop to repair the device.
In the US, Magnuson-Moss is often referred to as "the lemon law" (a defective product that can't be repaired is colloquially called a "lemon"). Specifically, "the lemon law" refers commonly to cars that have issues the dealer didn't specify before the sale and will not/can not fix.
Problem is, the FTC requires you to have a lawyer file a lawsuit and coordinate your legal costs and representation with the FTC, generally requiring a lawyer specifically dealing in lemon law cases. I've been down this path with a defective cell phone (HTC Touch Pro) and trying to get my carrier to replace it, and it's impossible to find a lemon law lawyer that deals in anything besides used car sales.
As always, the law is well intentioned but enforcement is next to impossible, just like you said.
I'm not clear on the specifics of the actual law, but this seems like what the Magnuson-Moss Warranty Act was trying to protect.
There's no link between a physical hardware failure and flashing an aftermarket ROM, and as such they would be really hard pressed to have a legally valid reason to deny you coverage for the micro-USB port being defective.
There's a lot of grey area - if your display goes out, maybe it's because of software - but something purely physical is pretty clear cut.
I couldn't agree more. The mini-USB connector was solid and well guided, while the micro-USB connector's male component is flimsy and the housing (as others have stated) is vague (I have to use a bright light to be able to see which side has bevels).
I'm glad I have a Touchstone (inductive charger), as my TouchPad (with micro-USB) already only charges with a couple of cables (and I use the Touchstone 90% of the time for charging).
My N900 (also micro-USB) is even worse... very flakey trying to get it to actually take a charge, pretty much since day 1. I'm lucky though, as many (MANY) have had their micro-USB port rip right off the board on which its mounted due to the "clips" that are supposed to be such an improvement:
I've had the opposite experience. On three different phones I had with Mini-USB, the port would always break. I'd have to wrap the cord around or rest the phone on the connector to make contact and charge.
All my phones since have had Micro-USB and I haven't had any issues.
Though it really feels like Micro-USB is flimsier, it doesn't seem to be that way in practice.
Probably because by the time it became usual to plug them in often (read: portable computers), they were already commonplace to a degree that made it impractical to replace them.
And nowadays there’s Wifi/UMTS available for most such usecases – I have a docking station at home and use the Thinkpad’s internal Ethernet port maybe twice a year; mostly relying on wireless data transfers when not at home.
gabifrons: I hope you see this; your account has been hellbanned, apparently for the comment you just made in this thread. I agree with you on the N900 USB charging flakiness; it's what made me give up on mine.
Important point - back in 2007, China mandated that all cell phones be able to charge from a standardized connector [1]. The connector they chose was Micro USB. Thus regulatory concerns played no small part in the mini vs micro wars.
Another point: AFAIK the Micro USB connector is designed so that if you apply force from the side to a plugged-in connector, the plug will deliberately break and the socket/PCB will stay functional.
Edit: I can't relate to those who say Mini USB is more reliable than Micro USB. I've had a lot of trouble with Mini USB connectors and especially cables; if I move a connected device I've had lots of short disconnects. This never happened with Micro USB. So.. go Micro USB!!1!
Yep, I have had several mini USB ports quickly destroy themselves, to the point that it affected my buying decisions. The accepted answer in the link perfectly explains why.
I thought that another big advantage of micro USB over mini USB was that it's a lot easier to combine type A & B. The difference between A & B is most obvious in a full size USB cable. The "A" plug is the flat side that goes into your computer, and the "B" plug is the square one that goes into your device.
The differentiation made a lot of sense when USB was exclusively master-slave. However, the addition of the on-the-go standard (OTG) means that things like a cell phone can function as either a master or a slave. Thus the connector on your cell phone is likely a micro-AB connector that can accept either a type-A or type-B plug so you can connect your cell phone to both computers & storage devices without requiring two different ports.
However, a double check before posting indicates that a mini-AB standard does exist. So that's not the reason. I'll post anyways because I think it's useful information on its own...
I still wish they would have standardized on 2.5mm jacks for USB. You can plug it in any direction, and you can pull a cable out of a tangle of wires without snagging, due to the round shape. It is a superior connector.
Never use this style of connector/jack for supplying power. The cable tip will make contact with both the jack ring and sleeve in most jacks, and the cable ring will make contact with the jack sleeve. No matter how you wire the cable/jack, you always have a high probability of a short on insertion/removal.
With electronics on both ends, the devices could easily negotiate in which direction power should flow, how much, and detect open and short conditions.
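Purely as a hypothetical sketch of that negotiation (not any real protocol, though USB Power Delivery does something conceptually similar): both ends advertise what they can source and what they want to sink, and no power flows until a direction is agreed:

```python
# Hypothetical sketch only: each end of the cable advertises whether it
# can source power and how much it wants to sink; power flows only after
# both ends agree on a direction. With no agreement, the line stays
# unpowered, so momentary shorts during insertion are harmless.

def negotiate(end_a, end_b):
    """Return (source, sink, current_ma), or None if no safe agreement.

    end_a / end_b: dicts with 'can_source' (bool) and 'wants_ma' (int).
    """
    if end_a["can_source"] and end_b["wants_ma"] > 0:
        return ("a", "b", end_b["wants_ma"])
    if end_b["can_source"] and end_a["wants_ma"] > 0:
        return ("b", "a", end_a["wants_ma"])
    return None   # no valid pairing: leave the line dead

charger = {"can_source": True,  "wants_ma": 0}
phone   = {"can_source": False, "wants_ma": 500}
assert negotiate(charger, phone) == ("a", "b", 500)
assert negotiate(phone, phone) is None   # two sinks: nothing energizes
```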
So, tell us how you expect this task would be achieved. (I know some possible ways, but I could just as easily say "but none are as effective as using the right cable for the task!")
I've had more micro-USB failures than mini, but good connectors are expensive to make, so part of the problem may be that micro is better in theory but harder to manufacture in practice.
USB 3.0's highest speed (termed "SuperSpeed" to easily separate it from the previous, equally clearly labelled, speed grades "LowSpeed", "FullSpeed" and "HighSpeed") requires more pins than USB 2.0 speeds, so the connectors are different.
You're getting some confusing answers. The simple answer is NO: the USB3 "superspeed" signal is carried on different pins. USB low/full/high speeds carry data in both directions on a single pair of differential wires. SuperSpeed has a separate pair for each direction, and it doesn't share anything with the existing D+/D- lines.
Basically, a "USB 3" connector has two completely distinct signaling environments. It's basically two cables in a single bundle. This is in fact literally true for hubs: a "USB 3 hub" is actually implemented as two distinct electronic devices: one working on the old lines and one on the new.
That said, there is an extension on the Micro B port that allows the extra SuperSpeed pins to be connected in such a way that the port remains compatible with old connectors. I've never seen one in the wild.