Tangent: I know that comment is more self-deprecation than an actual opinion, but the new Brazilian standard plug is not a disaster at all. It's a clever, much safer design, based on an international standard. Every product made in the last half-dozen years fits it, and by now most commercial spaces and every new residence have them. Lately I have more trouble with sockets/extensions that haven't been updated than the opposite...
It was a disaster. The safety concerns it addressed were a statistically insignificant source of danger, especially not one that warranted changing all the plugs across the entire country.
I have no doubt in my mind that corruption was involved. I bet a lot of lobbying money from plug makers went into changing the standard. There aren't many major manufacturers of those plugs, and they all have a huge vested interest in seeing every single plug in the country get switched over.
There was absolutely nothing wrong with using the three-prong plug that is standard in the US.
Anybody happen to know if the lightning port, in theory at least, can support Thunderbolt? In the future, I'd really like to be able to plug in a nice A/D audio converter to the iPad and use it as a DAW.
If it's not theoretically possible, then the lightning port really isn't future proof.
I don't believe that's possible. Looking at my Thunderbolt cable and ports, I notice: 1) The cable is not reversible; it's a shaped barrel that fits only one way. 2) The cable has many connection points (I count about 10) above and below the inside of the barrel at the cable's end. 3) The inside of the cable is an innie and the inside of the port is an outie, the opposite of Lightning. (Lightning doesn't have this shaped-barrel thing going on; the cable is just an outie with connections on both sides, so the port is just an innie.)
And I also know from other reviews that Thunderbolt cables aren't just powered with electricity, but powerized (?!) with processors at each end. My fingers suspect this is true because the ends get damn hot.
That's exactly the kind of thinking that the original article is arguing against! It doesn't matter what shape or size or gender or even the number of pins a connector has!
The biggest restriction on getting Thunderbolt on an iOS device is not going to be the connector or cable. It will be the controller chips necessary to support such a fast protocol, and flash chips fast enough to supply data at that rate.
Lightning can act as either a USB host or a peripheral. It can already provide or receive power. The active elements inside Thunderbolt won't be the limitation.
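To put rough numbers on the flash point (my own back-of-envelope assumptions, not measurements): a single Thunderbolt channel runs at 10 Gbit/s, while the eMMC flash in current mobile devices manages sequential reads on the order of 100 MB/s.

    # Back-of-envelope: Thunderbolt channel bandwidth vs. mobile flash.
    # Both figures are ballpark assumptions, not measurements.
    thunderbolt_MBps = 10e9 / 8 / 1e6   # 10 Gbit/s channel -> 1250 MB/s
    emmc_MBps = 100                     # optimistic eMMC sequential read
    print(thunderbolt_MBps / emmc_MBps) # ~12.5x: the flash is the bottleneck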
You'd essentially have to pack a PCIe controller inside the iPad for it to work, so (for now) not without a few design concessions from Apple. And it's Intel's interface; good luck getting them to support it with ARM devices.
What's the good thinking on Apple's part? Being compatible with USB3 by copying the mechanism that MHL uses to connect microUSB connectors to HDMI and DP?
The good thinking is building a completely generic connector that doesn't adhere to any one standard, so that if <insert new standard> comes on the market, Apple could support it without a change of connector.
Apple's connector isn't going to automatically support any signaling they decide to add in the future either. It has to have the proper transceiver in the device as well. If they do decide to add new signaling formats, they'll end up with the issue of newer devices supporting them and older ones not supporting them, which will be rather confusing. The same problem currently exists with HDMI/DP signaling over MicroUSB, though.
The bottom line is that they could have done this just the same with a USB port. In my opinion, all they really got out of designing their own connector was: a) something they can license, control, and collect royalties on; b) a connector that's slightly easier to use (the ability to plug the cable in either way, etc.).
Honestly, I don't think that's their reason for choosing something proprietary over USB. I doubt their first priority was to create something they can license, control, and collect royalties on -- if that were their motivation, they would be doing it to the audio jack too. After 10 years of their 30-pin connector, I'm sure they've collected a good amount of input from manufacturers & users and engineered something to fulfill those requirements.
I feel like people tend to have these same knee-jerk reactions to anything Apple-designed; they always feel it was designed for the purposes of collecting royalties because Apple is "evil".
I think it's a bit too early to tell if the device connector will lead to confusion over "old" and "new" lightning supported devices considering nobody has released any lightning devices yet. Your FUD is based on existing intermixed standards that have nothing to do with a connector design created by a single company.
Would be interesting if Apple put Lightning ports on their MacBooks, and then sold Lightning to Lightning cables. Then there would be no restrictions from the protocol at the other end of the cable.
It'd be hard to justify making space for an extra port on the MBA that's only useful for one specific device; and the gain over USB would appear minimal. Never mind that Apple is trying to sever the PC dependency in iOS, making a dedicated Lightning jack of even more questionable use.
So if I understand this correctly, this system uses a male connector as a power OUTput? Does that mean every charger must be intelligent enough to properly detect shorting if someone touches the Lightning tip to a steel table? Is that when the lightning happens? What about the immense market for poorly made, non-intelligent third-party devices without proper safety circuitry... doesn't an exposed male power output open up a huge danger door?
> The pins on the plug are deactivated until after the plug is fully inserted, when a wake-up signal on one of the pins cues the chip inside the plug. This avoids any shorting hazard while the plug isn’t inside the connector.
I'm guessing that there's a different mechanism for power plugs, since they have to work even if the device is dead. Yes, there's always a market for poorly made third-party devices, but poorly-made devices have always had the potential to be dangerous -- scroll back to the top, and you'll see a dangerous, poorly-made adapter that manages to be dangerous even though the power is always transmitted from female jack to male plug.
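Nobody outside Apple knows the actual handshake, but the behavior quoted above suggests something roughly like this sketch (every name below is invented, purely to illustrate the idea):

    # Speculative sketch of the insertion handshake described in the quote.
    # The real Lightning protocol is undocumented; all names are invented.
    def on_wake_signal(plug):
        """Runs when the device pulses the wake-up pin."""
        if not plug.fully_inserted():
            return                     # contacts stay inert (high impedance)
        plug.read_id_chip()            # chip in the plug identifies itself
        plug.enable_power_pins()       # only now do the pins go live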
Anyone can create a poorly made adapter or a poorly made plug. My question is why would an industry leader put an elementary risk at the forefront of a new technology? The pins should be shielded.
Third-party garbage is inevitable, and too often very difficult to tell apart from authentic merchandise. Where electrical current is involved, even in low-amp situations, idiot-proofing should be a critical part of the design.
A power adapter which can't handle a short is dangerous, period. Handling a short should be a part of the design of the power supply. Break open a low-quality third-party power supply for the last generation of iPhone and you'll see some downright dangerous, possibly lethal stuff. These dangers aren't really even mitigated by putting a shield around the plug.
Notice that on the shoddy power supply, the isolation between mains voltages and the other side is about 1mm. Manufacturing tolerances and component faults could easily bridge that gap and electrify the entire low-voltage side with lethal 240V mains -- shield and all.
Idiot-proofing low-voltage plug design doesn't help, because you've already lost if you let idiots design anything that plugs into the wall. Okay, shorting the plug might blow out your knockoff power supply and possibly start a fire, if the current-limiting circuitry fails (which it shouldn't). But that knockoff was going to electrocute you anyway; at least if it catches fire you still have the chance to run.
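Current limiting itself is conceptually trivial, which is why there's no excuse for omitting it. A toy software model of the idea (real supplies do this in analog circuitry or the controller IC, not in code):

    # Toy model of fold-back current limiting in a 5 V supply.
    V_SET, I_LIMIT = 5.0, 2.0          # volts, amps

    def output_voltage(load_ohms):
        if V_SET / load_ohms > I_LIMIT:    # load would draw too much?
            return I_LIMIT * load_ohms     # fold the voltage back
        return V_SET

    print(output_voltage(10.0))   # normal load: 5.0 V
    print(output_voltage(0.01))   # near-short: 0.02 V, current held at 2 A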
Micro-USB has one tremendous disadvantage for battery-powered mobile devices: maximum current.
USB 2 has a maximum current of 1.5A while communicating and 5.0A when not. Micro-USB has a maximum current (per spec) of 1.0-1.8A, and the usual rating for most connectors is 1.0A. The result? By switching to Micro-USB, your iDevice will now take significantly longer to charge. (However, keep in mind the iPhone charge adapter is 1.0A, whereas the iPad adapters are 2.1A).
So in exchange for a standardized connector, we would have devices that take twice as long to charge. Honestly, that's not a tradeoff 99% of people would want to make.
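Back-of-envelope, assuming a ~42 Wh battery (roughly the 3rd-gen iPad's), a 5 V charging rail, and ignoring conversion losses and charge taper:

    # Rough charge times at typical micro-USB vs. iPad-adapter currents.
    battery_Wh, volts = 42.5, 5.0      # assumed pack size and rail voltage
    for amps in (1.0, 2.1):
        hours = battery_Wh / (volts * amps)
        print(f"{amps} A -> ~{hours:.1f} h")   # ~8.5 h vs. ~4.0 h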
The new USB 3.0 power delivery spec supports 3A at 20V over micro-USB, for potentially 60 watts, and up to 100W over full-size USB. The idea is to get rid of laptop charger plugs entirely and replace them with USB. A bunch of companies are on this standard, including Dell, Foxconn, HP, Intel, Microsoft, and Nokia; Apple is absent for some reason. This standard will also allow external hard drives, printers, etc. to be powered over USB without a separate power plug.
This will be an interesting change if the standard catches on. I wonder about the failure mode of 60 watts through micro-USB though - I expect dramatic smoke and flame :-)
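The headline figures are just volts times amps, which puts in perspective how big a jump this is from the original spec's 0.5 A at 5 V:

    # USB power ceilings, old vs. new (watts = volts * amps).
    usb2_W = 5 * 0.5       # classic USB 2.0 port: 2.5 W
    micro_pd_W = 20 * 3    # power delivery over micro-USB: 60 W
    full_pd_W = 20 * 5     # power delivery over full-size USB: 100 W
    print(usb2_W, micro_pd_W, full_pd_W)   # 2.5 60 100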
Could the "some reason" be that the USB plug is not a mag-safe plug? I for one would definitely not welcome going back to a different type of power connector.
The original USB spec had a maximum current of 0.5A, which manufacturers like Apple quickly exceeded when it suited them. The spec was then adjusted to follow practice.
The same can and will happen with the microUSB spec.
I quite like Apple's new connector - particularly the new bi-directional insertion. However, I've never had an issue with MicroUSB, inserting or removing. Care to elaborate? I've used them on at least 20 devices, if not more.
I can elaborate -- the microusb plug is nearly identical on top & bottom, yet can only be inserted in one orientation. That means if you're not looking at the plug, you won't be able to insert it. You don't have an issue with microusb because you always look when you plug something in.
When was the last time you looked at your headphone jack while plugging it in? It's probably second nature, and you always get it right on the first try. Most microusb plugs have different "sheaths", so you can't memorize the "feel" of the plug and insert it without looking.
The regular usb plugs are the worst offenders, they look identical on top & bottom. The only way to tell if it's the right orientation or not is to attempt to plug it in. If it doesn't go in, you flip the plug over and try again. It's a minor hassle, but it's something that could have been designed better.
Every microUSB cable I have differs significantly between top and bottom:
- The top has the USB symbol, embossed deeply enough to feel without looking.
- The top of the metal sheath has "hooks" or other indentations that are both plainly visible and can be felt with a fingertip.
- The top of the metal sheath has sharper corners, while the rounded bottom corners are easy to both feel and see.
This is across probably half a dozen different brands of microUSB cables - I have sets of cables permanently in chargers in 4 rooms at home, at work, and a few cables in my bag.
Every cable is different for me. My Evo4G cable has no marking on either side, my jawbone usb cable has an embossed jawbone logo on the top, the kindle cable has no marking whatsoever, my Nexus S cable has nothing on either side, my monoprice microusb cable is also blank, and my external hard drive usb cable has the logo on the bottom.
They're all different for me, so YMMV. I still stand by my argument that bi-directional insertion cables are still easier to use than usb. I don't have to "feel" any part of the cable when I plug in my audio jack.
Anyone you talk to will have attempted to plug in a microusb the wrong way at least once, and had to flip the cable to retry. I've yet to talk to anyone who plugged in their audio cable improperly -- there's just no way to do it improperly! That's the difference between good design and bad design. These are basic elements of design that reduce the user's need to think when using the device. For example, a door with a push bar means that one should push the door to open it -- a user would not examine the door frame to decide whether they should pull on the push bar; it's obvious!
That's great but all the devices that I have with micro USB have a different definition of top and bottom. For example, on my Kindle the "top" side is the same side as the screen while on the Nexus 7 the "bottom" side is the same side as the screen. To top it all off the male plugs seem to be a touch different also. The Kindle plug fits fine in the Nexus 7, but for some reason the Nexus plug doesn't really fit snug into the Kindle.
I despise micro USB and feel sorry for my brethren in the EU. It's great there's a universal charging standard, but it sucks that they chose the crappy micro USB format.
> "The regular usb plugs are the worst offenders, they look identical on top & bottom."
If you go by the spec all USB plugs should have the USB 'trident' logo on the top side of the plug. The overwhelming majority do - not that this will help you, because whilst it sorts out the problem of the plug orientation the socket still remains an unknown.
The logo being on top doesn't help either. The USB ports on the side of my monitor are vertical. I can never remember if I should be plugging in something with the logo facing me or away from me.
My basic argument is that cable connector design hasn't changed in 30 years. Back when people used 9-pin serial cables, they still dealt with this problem the same way. There's room for innovation here, but nobody is taking a bite.
Most of the manufacturers emboss the USB logo so it can be felt in the dark.
Even better is how Logitech did it, with a huge bump on the top. I'm seriously tempted to order a bunch of these[1] and throw out all of my other microUSB cables.
Either most of my chargers lie, or 2A isn't the problem he claims for microUSB.
I'm inclined to think the latter, because if I put my display on full brightness and play any game that uses my phone's GPU, I get warnings that my charger delivers less current than needed to charge when I "only" use my 1A-rated chargers.
The Nook Tablet's included micro-USB charger is 2.0A, and the Nexus 7 also includes a 2.0A charger. Both also charge (slower) on standard lower amperage micro-USB chargers.
The article explains how micro-usb connectors can be used to connect to both HDMI and display-port. And obviously USB2 can be connected to USB3. All the article is saying is that Lightning could be used to transport USB3, even with only 8 wires.
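Eight pins is enough on paper, for what it's worth. A purely hypothetical budget (not Apple's actual pinout, which is undocumented; it just shows the signal count works out):

    # Hypothetical 8-pin budget for carrying USB 3.0 over Lightning.
    # NOT Apple's real assignment -- just showing the signals can fit.
    pin_budget = {
        "power":   1,  # bus power
        "ground":  1,  # common return
        "SSTX+/-": 2,  # SuperSpeed transmit differential pair
        "SSRX+/-": 2,  # SuperSpeed receive differential pair
        "D+/D-":   2,  # legacy USB 2.0 pair
    }
    assert sum(pin_budget.values()) == 8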
So the only advantages for lightning are that it's reversible, and it makes Apple scads of money.
Did you even bother to read the article? The advantage of Lightning is that it is a REAL future-proof connector: completely generic and separate from any current standard. A connector that could still be in use 10-20+ years from now.
And the fact that you brought up the money argument just shows you have no idea what is going on.
I think the money is an important aspect, because Apple uses patents on connectors to extract more money out of their users. The perennial favorite is the MagSafe power adapter connector, but realistically, ever since that connector Apple has gone out of its way to create a patent wall around its cables. I'm sure they justify this by saying it allows them to control the user experience -- after all, a shoddy cable really sucks -- but it's not clear that, given the choice, people would choose $29 "reliable" cables over $2.99 "might break" cables.
Apple has been pretty aggressive about patenting their connector designs [1]. So sarcastically I'd say this could be in use exactly 20 years from now :-)
One problem with a $2.99 cable isn't that it "might break," it's that it might zap you with a "potentially-lethal 340 volts" among other less severe but still significant drawbacks (which most people wouldn't ever consider).
It's one thing with chargers, and another with cables. Apple's cables are actually worse quality than the cheap ones IMHO, since putting a proper strain relief on the cable violates Apple's design sensibilities.
I almost got zapped by one MagSafe connector, and almost had another one cause a fire due to this. Apple of course insisted this was not their fault, until the next replacement charger "magically" had a strain relief 2-3 times as long.
You could just as easily get zapped with a "potentially-lethal 340 volts" with Apple's cables. Notice how that fake charger has a USB port to allow you to connect your existing iPod or iPhone charger cable to it, including a genuine Apple cable.
(I suspect that the chargers with integrated micro-USB cable are, for the most part, quite a bit safer due to not imitating Apple and having plenty of space. I actually took a dirt-cheap Chinese 5V supply apart a while ago because it had died - proper controller chip and all the works, good half-centimeter of isolation between high and low voltage, slot in the PCB along the part of the isolation boundary where they're closest, etc.)
I very much doubt having a "real future proof connector" is going to make any practical difference for most users.
Having to have different cables from pretty much everyone else, on the other hand, will.
In my house there are at least a dozen microUSB cables. There's 2-3 Apple cables. That reflects the relative ratio of devices that use each, and it means charging or connecting anything that requires microUSB is trivial - there's always a cable at hand. There's half a dozen microUSB cables within a one metre radius of where I'm sitting at my desk right now.
Won't matter if you're at home and have a nice collection of Apple specific cables. But it is likely to make far more difference for people when they're out travelling for example.
Having a "future proof" connector? Not going to matter for anything but a tiny sliver of early adopters who might otherwise need adapters now and again (e.g. MHL adapters for micoUSB now, that are quickly becoming obsolete as TV's etc. start getting MHL support built in)
Well, if the 30-pin connector had been as future-proof, then in theory we would not have a problem right now with angry people who own lots of old accessories. And if Lightning is as good as they say, then if tomorrow HDMI is totally doomed and there's a shiny new HDMI2 standard completely different from HDMI, I will not have to change my phone; Apple just needs to release a new cable and possibly a software update.
You mean the article that explains how microUSB has become compatible with new standards or the article that explains how Lightning could become compatible with new standards?
It seems to me that having done something twice in the past is a better proof of being able to do that something than speculation is.
From the article:
"Common to both MHL and MyDP is the need for an additional transmitter (driver) chip as well as a switcher chip that goes back and forth between that and the USB transceivers. This, of course, implies additional space on the device board for these chips, traces and passive components, as well as increased power consumption. You can, of course, put in a micro-HDMI connector and drive that directly, that would save neither space nor power."
That's not microUSB becoming compatible - It's having to add hardware inside the device which makes use of microUSB's pins. The Lightning connector seems to push that to outside of the device i.e. make the right cable/adapter and software and you can adapt to anything without requiring support from specific hardware inside the device. At least that is what he seems to saying and it'll probably take some more analysis and dissection (not to mention actual adapters) to support that assertion.
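Put concretely, the speculation amounts to a negotiation like the sketch below (the protocol is undocumented, so every name here is invented):

    # Speculative model of adapter-driven mode negotiation on Lightning.
    # Every identifier below is invented; the real protocol is unknown.
    def configure_port(device, adapter):
        mode = adapter.identify()           # adapter's chip declares itself
        if mode not in device.supported_modes:
            raise IOError("unsupported accessory")
        device.mux.assign_pins(mode)        # remap the 8 pins for this mode
        device.start_stream(mode)           # e.g. send video to the adapter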
If you're a company working at the scale Apple does, I don't think there's anything stopping you from integrating the driver and switching hardware for MHL onto your main SoC.
microUSB hasn't become compatible with new standards, it has been hacked to allow the pinouts to be repurposed for new standards.
The implication of this article is that the Lightning port can be repurposed, without changes to the device's hardware, simply by providing a new cable and software.
If the iPhone5 doesn't have internal USB3 or HDMI or DP support, it can't magically gain it through the use of a new cable and new software. It's probable the iPhone6 will support all or most of those, but that's because they changed the device's hardware...
Maybe I was the only one in the dark, but if you've not seen it, this cable is damn cool (yes, I'm a geek).
It's not "hollow" like USB cables, where the "male" plug is actually a "female" plug in the sense that there is a negative space in the usb connector that is filled when it's plugged in. With the Lightning port, it's just a male nub with the pins on both sides. Meaning you don't have to worry about orienting it correctly.