
> Agree. People are looking at the past with rose tinted glasses or haven't lived through the '90s - '00 tech. Every phone and electronic gadget had their own proprietary dock, cable and connector for data and charging. Even many phones had proprietary connectors for headphones instead of the 3.5mm jack. It was a nightmare.

You've missed the problematic point of where we are now. In the mid 2010s, Apple had Lightning and Android was almost certainly micro USB. If I had a USB port and a cable that fit, it would charge my phone effectively. Somewhere in the transition to USB-C, we lost that.

> What we have today certainly has teething issues, but we're on the right path.

I disagree - we've missed the forest for the trees. I have 4 mains-to-USB adapters in my home, and 2 USB-A plug sockets. I also have 4 USB C-to-C cables, and 2 A-to-C cables (which stay in the wall sockets). I use these to power 2 phones, 2 pairs of earbuds, an M1 MacBook and an iPad. If you pick an arbitrary cable and an arbitrary power adapter and plug it into a device that fits, it will do anything ranging from not working at all (my Anker wall charger and any cable into either set of earbuds), up to charging so quickly the device overheats (140W USB-C charger into either phone). I've got 6 devices, 4 cables, and 4 plugs that all have the same connection points and just don't work properly together. Meanwhile, if you go back to the mid 2010s as per earlier, we had a Lightning cable and a micro USB cable - you knew it worked if it fit.

That's before you get into the nonsense around Android Auto. I have a car with AA in its head unit, and a USB port for connectivity. I must have tried 5 different branded cables before I found a Reddit post that linked a specific Anker cable that works for my very specific combination of car and device - I _never_ would have figured that out on my own.




> up to charging so quickly the device overheats (140W USB-C charger into either phone)

That isn't how USB-PD works, and it isn't a problem with USB itself. The device being charged controls the rate of charge: sinks (in this case, your phone) request a voltage and current from the source, based on the list of capabilities the source reports it supports. If the phone can only support, say, 10W charging, it's going to request 10W of power regardless of how oversized the charger is.
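
To make the negotiation concrete, here's a rough Python sketch of the idea (not real PD firmware; the profile values and names are made up): the source advertises its voltage/current profiles, and the sink's request is capped by its own limit, so an oversized charger changes nothing for a low-power phone.

    # Hypothetical sketch of USB-PD-style negotiation (illustrative only).
    # The source advertises capabilities; the sink requests only what it can use.

    source_capabilities = [  # (volts, max amps) the charger advertises
        (5.0, 3.0),   # 15 W
        (9.0, 3.0),   # 27 W
        (20.0, 5.0),  # 100 W
        (28.0, 5.0),  # 140 W
    ]

    def negotiate(capabilities, sink_max_watts):
        """Pick the highest-power request the sink is willing to make."""
        best = None
        for volts, max_amps in capabilities:
            amps = min(max_amps, sink_max_watts / volts)  # capped by the sink
            watts = volts * amps
            if best is None or watts > best[2]:
                best = (volts, amps, watts)
        return best

    # A phone limited to ~10 W requests ~10 W no matter how big the charger is.
    print(negotiate(source_capabilities, sink_max_watts=10))   # ~10 W at 5 V
    print(negotiate(source_capabilities, sink_max_watts=140))  # 140 W at 28 V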


Those sorts of fast-charge speeds are incredibly bad for the battery regardless of whether the device will allow it - I think this is something the EU should step in and regulate, tbh, because it's a huge vector for e-waste. At best you're changing the battery much more frequently, which is still e-waste, and often those devices end up in the trash because Apple is the only vendor with a serious battery-replacement programme. Third-party batteries are uniformly trash.

Set a maximum charge rate equivalent to roughly a one-hour full charge, and pop up a notification that lets the user manually elect to charge the battery faster, imo. It shouldn't be an automatic "the charger supports it, imma nuke the battery"; that just sounds like vendors speeding up the pace of planned obsolescence.
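
A rough sketch of what that default could look like, in Python (the names and numbers are mine, purely illustrative):

    # Hypothetical default-slow charging policy with an explicit user override.

    def charge_rate_watts(battery_capacity_wh, charger_max_watts, user_override=False):
        one_hour_rate = battery_capacity_wh  # rate that fills the battery in ~1 hour
        if user_override:
            return charger_max_watts         # user explicitly opted into fast charging
        return min(one_hour_rate, charger_max_watts)

    # A ~15 Wh phone battery on a 140 W charger defaults to ~15 W.
    print(charge_rate_watts(15, 140))                      # 15
    print(charge_rate_watts(15, 140, user_override=True))  # 140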

People always complain about this with wireless charging - that the heat from a 5W wireless charger is somehow damaging the battery and causing e-waste - and yet you've got vendors bragging about how they're zapping a phone battery with a 140W charger to get a third of a charge in 5 minutes or whatever. That's terrible as a general practice.


Apple gets this right. When I plug in my iPhone at night, regardless of the PD capability of the source it's connected to, it informs me that it's doing "Optimized Battery Charging" and will be fully charged by my wake-up time. It slurps up whatever it can get if I plug in my phone at any other time of the day - if I plug in my phone at 11% at 1pm, it probably means I missed my nightly charging cycle and really do need the fast charge.
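
Presumably the decision behind that behaviour is something along these lines (a guess in Python, not Apple's actual logic; the thresholds and names are invented):

    # Hypothetical sketch of an "optimized charging" decision - not Apple's code.

    def pick_charge_mode(plugged_in_hour, usual_wake_hour, battery_percent):
        overnight = plugged_in_hour >= 22 or plugged_in_hour < usual_wake_hour
        if overnight and battery_percent > 20:
            return "trickle, finish by wake-up"  # slow, battery-friendly
        return "fast charge"  # daytime top-up: the user probably needs power now

    print(pick_charge_mode(plugged_in_hour=23, usual_wake_hour=7, battery_percent=60))
    print(pick_charge_mode(plugged_in_hour=13, usual_wake_hour=7, battery_percent=11))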


[flagged]


Would you please stop breaking the site guidelines? You've done it at least twice in this thread (https://news.ycombinator.com/item?id=32717674 is the other one I saw). We ban accounts that post like this, and we've had to warn you about this repeatedly already:

https://news.ycombinator.com/item?id=32280825 (July 2022)

https://news.ycombinator.com/item?id=21875253 (Dec 2019)

https://news.ycombinator.com/item?id=21143693 (Oct 2019)

Continuing like this will get you banned. I don't want to ban you, because you've also posted good things. Therefore if you'd please review https://news.ycombinator.com/newsguidelines.html and stick to the rules when posting here, we'd appreciate it.


>You've done it at least twice in this thread (https://news.ycombinator.com/item?id=32717674

Sorry, but what guideline did that comment break?


Surely you can phrase that a little more civilly?

I've worked with batteries for decades across various chemistries and yes, fast charging definitely does decrease the battery life. Maybe it's changed in the meantime, but your post isn't appropriate for this community.


Specifically, process improvements mean that modern batteries aren't prone to dendrite formation under high rates of charge. Good thing too, because electric cars wouldn't be practical otherwise.


>You've missed the problematic point of where we are now.

I don't think I missed anything on the phone charging part; I think you're looking at other issues.

>If I had a USB port and a cable that fit, it would charge my phone effectively. Somewhere in the transition to USB-C, we lost that.

I don't know what you think we lost, but every type-C plug I could find in any household or office always managed to charge my Android phone, just like micro-B before that.


> I don't know what you think we lost, but every type-C plug I could find in any household or office always managed to charge my Android phone, just like micro-B before that.

If I show you a photo of two USB chargers, can you tell me which one will provide 5W to my phone and which will provide 15W?

Why do the two ports on my 45W USB adapter provide different wattages with no visible difference between them?

Why does my MacBook have the same ports as my phone and iPad but not charge from them when using an almost identical charger to the one I'm supposed to use with it?

Why does my PS5 controller only charge to 2 out of 3 bars when using any/all of the adapters above?

> I think you're looking at other issues.

There are enough issues with where we've gotten to that didn't exist 5 years ago for me to be confident in saying we're going in the wrong direction. We've standardised _cables_, but the important part is the protocols, not the cable. I _want_ different cables for different protocols.


The output tends to be written on the USB charger, and beyond that lots of places label "power delivery" on special ports as well. You might have bought a badly designed USB adapter that doesn't label that, but that at least is fixable (hell, by you with a sticker!)

And of course you can choose to charge your MacBook with an iPad charger, but (surprise, I guess?) you won't get a lot of power, because the thing is rated for a lower output.

But if we compare to the past... everything USB-like was charging at 5W. I don't think you will find any USB-C cable + charger that doesn't do that. So things are _by definition_ not worse on that front.

For your laptop... you used to have a proprietary connector or a barrel connector. Now you have a USB-C connector. Granted, the confusion around the cables and chargers is real! So... put a sticker on it. You now have a proprietary connector again, and things are the same as before.

You can pretend to be back in the good old days, and have the nice surprise of added benefits when you need them. This is a _mathematical proof_ that things are better if you use some colored tape.

PS: since you're only mentioning charging... we did standardise the charging protocols. The cables are the same and the charging protocols are the same. It's just that if you buy a weaker charger, it will output less.


>If I show you a photo of two USB chargers, can you tell me which one will provide 5W to my phone and which will provide 15W

Why? Did you also care about that with micro-B on Android? Or when using different-wattage Lightning chargers from Apple?

It's up to you to read the wattage on them and decide which one you want to use. Mobile tech has gotten more powerful, and so have the chargers.

It's your responsibility to keep track of the chargers in your household, but the great part is, even if you don't and you mix them up, they'll both charge your phone either way, just at different speeds, and most likely any other low-power type-C gadget in your household, like your earbuds. You can't expect us to go back to having different plugs for different wattages just because you can't keep track of the different chargers you own. Devices and chargers are smart, and they'll negotiate the quickest and safest charging wattage regardless.

Since you're being obviously obtuse just to be snarky, I'll stop answering your questions, as I think I've provided enough arguments so far.


> If I had a USB port and a cable that fit, it would charge my phone effectively. Somewhere in the transition to USB-C, we lost that.

Nonsense. Your phone is not trying to pull 50W; any correctly implemented Type-C cable will do.

The complexity of Type-C is for things you were not able to do at all before: high-power applications (>40W) and/or high data rates over a single cable.

I've a single cable which charges my laptops, connects all the devices plugged into the display, and carries video to two different displays.

That does require a cable with somewhat high specs, and it's unfortunate that labelling isn't the clearest and unsuitable cables are difficult to diagnose, but before, this was only possible via bespoke proprietary docks. Now it's just a standard cable.


I regularly charge my headphones off my laptop's 90W USB-C adapter. If your device overheats while charging, it's not the fault of the charger; it's a fault of the device. Don't buy crappy devices that will immolate themselves.


> Don't buy crappy devices that will immolate themselves.

It's the consumer's fault that manufacturers don't follow the spec. It's my fault that Sony PS5 controllers only charge 2/3 of the way when not using the console as a power source, or that the Nintendo Switch doesn't conform to the spec despite using the same adapters, or that my Samsung buds aren't charging when I use an Anker cable.


I have no idea about the PS5 controllers; I've never owned a PlayStation. I'd love to have more info on that - a quick search didn't yield much information other than generic hardware issues people had. My Switch Pro controller charges fine with third-party USB chargers; I've used several different ones and different cables without issue.

As for the Switch, it's somewhat sensitive to out-of-spec chargers and cables, but a proper cable or charger shouldn't damage it. Most of the concerns were from third-party docks when it first came out, which weren't really to spec. It would be nice if it wasn't so sensitive, but if you use good hardware you won't have issues.

I've got a regular Switch which has all the original hardware. I often use my laptop's 90W adapter to charge it when traveling. I have not experienced any issues in multiple years of doing so.

I guess I should add to that and say: don't buy crappy devices that will immolate themselves or that are so far out of spec that they damage other components.



