Apple probably wouldn't have changed to USB-C for their phones. Lightning came out of the mobile phone / other-devices side, whilst USB-C and Apple's contributions to it came from the Mac department.
They did not like each other's standards. I know Apple engineers working on the phone who dislike the change even to this day…
USB-C is a worse mechanical connector for a device plugged in thousands of times over its lifetime. The female port of a USB-C connector has a relatively fragile center blade. Lightning's layout was the opposite which makes it more robust and easier to clean.
> USB-C is a worse mechanical connector for a device plugged in thousands of times over its lifetime.
USB-C connectors are usually rated for 10k cycles. Do you have any evidence that Lightning connectors are rated for more cycles than that?
> The female port of a USB-C connector has a relatively fragile center blade. Lightning's layout was the opposite which makes it more robust and easier to clean.
This is a very weak a priori argument. I could just as well argue that USB-C has its center blade shielded instead of exposed, and so is more durable.
Unless you have some empirical evidence on this I don't see a strong argument for better durability from either connector.
I did wind up replacing the USB-C ports on a 4-year-old computer recently because one was dodgy as hell. When I got it under the microscope, the longer bus-power pin contacts (and one or two of the others) had been badly worn/squished/stretched in a way that I guess was causing them to bridge to other pins. I assume some USB-C cable had some gunk in the connector that was hard enough to damage the contacts on the center blade, and the user didn't notice (because how often do you look into the end of your USB-C cable?). It probably presented as a cable that wasn't seating right or didn't go all the way in, and whatever was inside probably fell out when it was removed and they tried again.
And for what it's worth, damage to the center blade does seem to be a common failure mode for USB-C and Mini-USB connectors. Less frequent for something like HDMI, but it does seem to happen from time to time. Lightning never felt like it locked in as securely as USB connectors do, but at the same time, every time I saw a damaged Lightning connector it was always on the male (and therefore usually cheaper, accessory) side.
I've had multiple USB-C chargers broken like this.
Now, admittedly, "being yanked by a robot vacuum and falling on the ground" is outside the design parameters for a port; but I absolutely had USB-C ports fail in a way that Lightning would have not.
(Not the person you're replying to, but also a "Lightning was a better physical connector than USB-C" weirdo.)
I have seen multiple USB-C ports break on Lenovo and HP laptops. About 1 in every 50 laptops over the span of 2-3 years. I don't know if it was the users' fault or a manufacturing issue. But the manufacturers fixed these under the extended warranty.
It might be an issue with the USB-C port used in these laptops since the ports on MacBooks feel less wobbly to me. But in the end this is just speculation and anecdotal.
At the same time, if the springs on the iPhone-side connector loosen and can't hold onto the cable, you have to replace the whole phone and not just the cable.
So Apple had to use pretty strong springs, resulting in a lot of friction on the pins. That made them easier to damage, so they had to switch from gold to a crazy super-resistant rhodium-based alloy for contact coating.
My Pixel 8 certainly hasn't gone through 10k cycles and it barely holds on to any USB-C connector I put inside it. They all fall out even when it's lying still on a flat surface.
There are always outliers, of course, but I had this issue with USB Micro-B on at least one other device and never saw it with a Lightning connector.
I find it's often lint in the USB-C port. Cleaning it out with a non-conductive tool like a toothpick or a dry toothbrush usually solves it for me when that happens.
I've had dozens of devices with USB-C. I've yet to have even a single one that had any problems with them. To be fair, I'm using iPhones mostly for app testing, so I also had very few issues with them.
Your Pixel 8 can be at most about two years old. The connector performed way under spec, and you should send it in for repair (assuming you are in a country with a 2-year warranty period).
This is probably lint buildup. You can scrape it out with any thin and stiff object like a safety pin.
A small amount of lint gets into the hole. You pack it in when you plug in the cable. Repeat a thousand times and now you have a stiff “plug” of lint that prevents the connector from fully entering your device.
My experience is that plugs from the same manufacturer as the device tend to keep holding tightly, but mixing makers is unreliable. Apple plugs in particular tend to slide out of my Samsung phone really easily. I guess whoever specced USB-C didn't bother with the details of how it would stay in, and every manufacturer figured out their own solution.
The 10K cycle insertion rating for USB-C is an idealized metric that does not include lateral force, torque, device movement, or real-world wear patterns. These non-axial forces are a known cause of USB-C port failures and are explicitly not accounted for in the standard 10k-cycle durability claim.
USB-C center tongue female design means that the port will break before the cable. With lightning, the cable plug takes all the stress.
Apple doesn’t publish insertion cycles rating for Lightning connectors, so it’s impossible to provide empirical evidence of that.
In my personal experience, I’ve had two USB-C ports go bad on two MacBooks. I’ve yet to own a USB-C-charging phone, but I’ve never had a Lightning port fail.
> These non-axial forces are a known cause of USB-C port failures and are explicitly not accounted for in the standard 10k-cycle durability claim.
I agree, and that's par for the course for any standard; they have to limit the requirements to something that is economically manufacturable and testable.
Meanwhile, Lightning connectors have no public standard to speak of, so this is a moot point.
> USB-C center tongue female design means that the port will break before the cable. With lightning, the cable plug takes all the stress.
This is another a priori armchair expert argument which I just put very little weight on without data to back it up.
> Apple doesn’t publish insertion cycles rating for Lightning connectors, so it’s impossible to provide empirical evidence of that.
That conclusion does not follow. We can still obtain empirical evidence through direct testing without Apple publishing anything.
> In my personal experience, I’ve had two USB-C ports go bad on two MacBooks. I’ve yet to own a USB-C-charging phone, but I’ve never had a Lightning port fail.
That's fair, everyone has different anecdotal experiences as a foundation for their opinion here. The problem is that anecdotal data is just not very informative to others, that's all.
> USB-C center tongue female design means that the port will break before the cable. With lightning, the cable plug takes all the stress.
Are you sure it's the center tongue which takes all the stress, and not the round shell?
AFAIK, USB-C is designed so that the cable breaks before the port, because the parts which wear the most with use (the contact and retention springs) are in the cable, not on the device.
Incorrect. You want the springy bits on the part that is easily replaceable - the cable. USB-C does that: the springy bits are in the connector, not the socket.
My phone is now 6 years old, zero problems with the USB-C connector.
Groan. Come on. Cite one. A single "Apple engineer" to support this ridiculous claim of insider knowledge. What year do you think it is?
You understand that the SoC and I/O blocks are largely shared between the Mac and the iPad / iPhone now, right? This invention of some big bifurcation is not reality based. The A14 SoC (which became the foundation for the Mac's M1) had I/O hardware to support USB-C all the way back to the iPhone 12. Which makes sense, as this chipset was used in iPads that came with USB-C.
Pretty weird for hardware that is largely the same to "not like each other's standards".
Well sure, they're iterating between models. But in many cases they're quite literally copy/pasting designs. Any imagined separation between the hardware teams is fantasy based. The comment I replied to is nonsensical.
"They're different even between A19 Pro in an iPhone Air and the one in 17 Pros"
The SoC and I/O blocks are quite literally identical. An A19 Pro is an A19 Pro, aside from binning for core disables. The difference is in the wiring and the physical connector on the device, which puts a ceiling on the features supported, one of which is 10Gbps. The Air famously includes some new "3D printed" super-thin titanium USB-C port, using the 4 pins rather than the "pro" 9-pin, 10Gbps-capable connector. The SoC is identical; they just only wired it up for USB 2.0.
Truetax | Senior/Staff Software Engineer - Elixir/Phoenix LiveView | San Francisco (Hybrid) or REMOTE (US) | Full-time
Truetax builds software to streamline government tax administration and make society more equitable. We're working with state and local governments to modernize their tax systems using GenAI agents and workflow automation.
We're looking for a Senior/Staff engineer who can ship fast while maintaining quality. You'll work directly with our co-founders (experienced entrepreneurs with multiple venture-backed exits) to design and build cutting-edge GenAI-powered systems.
If you're curious, self-motivated, and energized by complex government workflows, we'd love to hear from you.
To be clear, Electron themselves fixed the bug quite quickly; but many Electron apps haven't pushed a version that vendors in the fixed version of the Electron runtime.
(And shit like this is exactly why runtimes like the JVM or the .NET CLR are designed to install separately from any particular software that uses them. Each of their minor [client-facing-ABI compatible] versions can then be independently updated to their latest OS-facing-bugfix version without waiting for the software itself to ship that update.)
Apple is consistent in their warnings not to use private APIs, and especially not to override them with custom implementations, which is what Electron does here.
The _cornerMask override was a hack that shouldn't ever have existed in the first place, and it's not the only use of private APIs in the electron code base.
Apple is very clear about how they want you to make software for their OSes. It's 100% on Electron that they choose to do it this way regardless.
I'd go as far as to say Electron itself is a hack that shouldn't exist, but sadly everyone has decided it's the only way they are going to make desktop software now.
Broad support for many different chips is precisely why Arduino is so bad. It has to check pin numbers against a gigantic table for every GPIO call.
You want chip-specific libraries. When the software is designed for the hardware everything works better.
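To make the contrast concrete, here's a rough sketch of the two approaches in AVR-flavored C++. The table contents and function names are made up for illustration, and real Arduino cores do more work per call (e.g. turning off PWM on the pin), but the shape is the same:

    #include <avr/io.h>
    #include <stdint.h>

    // Generic, Arduino-style path: the pin number goes through a table
    // lookup at runtime on every single call.
    struct PinInfo { volatile uint8_t *port; uint8_t mask; };

    static const PinInfo pinTable[] = {
        { &PORTD, _BV(0) },   // "pin 0" (illustrative mapping)
        { &PORTD, _BV(1) },   // "pin 1"
        // ... one entry per board pin
    };

    void genericDigitalWrite(uint8_t pin, bool value) {
        const PinInfo &p = pinTable[pin];    // runtime lookup + indirection
        if (value) *p.port |=  p.mask;       // read-modify-write via pointer
        else       *p.port &= ~p.mask;
    }

    // Chip-specific path: port and bit are known at compile time, so on AVR
    // this typically compiles down to a single sbi instruction.
    inline void setPD1High() {
        PORTD |= _BV(PD1);
    }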
The native AVR and ESP-IDF frameworks are very good. There are also MicroPython and CircuitPython; I've heard good things, but I don't partake in Python.
Personally I think attempting to provide a cross-platform library for microcontrollers is an enormous mistake. This is not x86; you can't rely on any CPU feature existing, which results in awful branching code in places where, in a sane framework, it would be a single instruction updating a CPU register.
I feel like this has to be a toolchain issue; there's no reason the pin number -> register table couldn't be resolved at compile time, and similarly for conditionally compiling certain things based on the CPU features.
I'm not saying it isn't a real problem, or that it's an easy one; I just wonder if it truly is the reason Arduino is "bad".
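For what it's worth, here's a minimal sketch of that idea: if the pin is a compile-time constant, the pin -> register mapping can be folded away with template specialization. PinMap and fastWrite are hypothetical names and the mapping is invented; this is roughly what the various digitalWriteFast-style libraries aim for.

    #include <avr/io.h>
    #include <stdint.h>

    // Resolve pin -> port/bit at compile time via specialization.
    // The mapping below is illustrative, not from any real core.
    template <uint8_t Pin> struct PinMap;

    template <> struct PinMap<0> {
        static volatile uint8_t &port() { return PORTD; }
        static constexpr uint8_t mask = _BV(0);
    };
    template <> struct PinMap<1> {
        static volatile uint8_t &port() { return PORTD; }
        static constexpr uint8_t mask = _BV(1);
    };

    // With Pin known at compile time the lookup disappears, and the write
    // can optimize down to a single sbi/cbi instruction.
    template <uint8_t Pin>
    inline void fastWrite(bool value) {
        if (value) PinMap<Pin>::port() |=  PinMap<Pin>::mask;
        else       PinMap<Pin>::port() &= ~PinMap<Pin>::mask;
    }

    // Usage: fastWrite<1>(true);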
It could, and some cores do. Many do not, and you get a runtime lookup unless you explicitly call digitalWriteFast, which is also supposed to resolve to a single inline instruction. It usually does not and instead emits a function call in assembly.
The gpio thing is really just my personal pet peeve. There are a lot of things like this though. For example, the arduino core will consume several milliseconds doing something in between calls to your main function. I2C and similar drivers are typically not well designed and either use too much memory or just operate not-quite-right.
Which brings up another point, the Arduino ecosystem is not at all unified. If you use a chip that is not popular enough to be mainlined, you have to go out and find an Arduino core that supports it and try to plug that into your compiler. Such cores frequently are not API compatible and have slightly different behaviors. It's all a big mess.
There are a lot of features that are compile time conditional based on CPU, but the actual implementation of this is horrible. I once had to modify someone else's custom Arduino core to tweak some low level behavior and despite the change being very minimal, it took three days to find all the places and all the conditionals that needed tweaking.
But really my main complaint is that Arduino is incredibly slow and hides far too much from you. Firmware developers should know about CPU registers and hardware features. This is very important for understanding the machine! A lack of awareness of the machine and what it's doing is (IMO) one of the major factors in how awful modern programs are.
I agree with you, with the caveat that the awful software that's written by an inexperienced programmer ends up getting used, and the perfect efficient well-tuned software I want to write never gets finished (or even started, usually). It's so much more work.
If you want a "framework", Zephyr is the only thing I can think of that is somewhat hardware agnostic, has great software packages, and is fairly widely used.
Both indeed. I'm older, I do consulting, often for the new-school AI CEOs, and they keep thinking I'm nuts for saying we should bring in this person to talk to about this thing... I've tried to explain to a few folks now that a human would be much better in this loop, but I have no good way to prove it, as it's just experience.
I've noticed across the board that they also spend A LOT of time getting all the data into LLMs so they can talk to them instead of just reading reports. Like, bro, you don't fundamentally understand churn, why are you looking at these numbers??
I talked to a friend recently who is a plastic surgeon. He told me about a young, pretty girl who came in recently with super clear ideas about what she wanted fixed.
Turns out she uploaded her pictures to an LLM and it gave her recommendations.
My friend told her she didn’t need any treatment at this stage but she kept insisting that the LLM had told her this and that.
I'm worried that these young folks just trust that whatever these things tell them to do is right.
I've been using Elixir / Phoenix / LiveView for a year now, basically since LLM coding has been a thing, and it's been transformative. The usual "getting started" problems were so diminished that I feel like I hardly missed a beat. The usual "this won't compile / how do I do this in a new unknown language" issues that previously could have taken hours to resolve were basically gone. My LLM pair programmer just took care of it.
Coming from Python / Django / cue, it's a breath of fresh air. It's so much easier as all the paradigms come built in with the stack (async workers, etc.). The Elixir / Erlang standard library is surprisingly complete.
With regards to producing code, it seems to be doing very well. The most impressive thing it did for me was a PDF OCR from scratch using Google Cloud. All I had to do was plug in my credentials, hook up the code, and it just worked. Magic.
No reference here, but I found this out the hard way too. Google's search AI is utterly useless, in fact, and gives entirely different search results vs using the web. Bing is better. Haven't tried Kagi yet.
It's actually much more that the ozone layer, which filters UV, is much thinner than it was even 60-70 years ago. The ozone layer might be growing again, but at a very slow pace.
Simple fact is, we're much more exposed to UV than prior generations.