
I think his point is that slavery is not outlawed by the 13th amendment as most people assume (even the Google AI summary reads: "The 13th Amendment to the United States Constitution, ratified in 1865, officially abolished slavery and involuntary servitude in the United States.").

However, if you actually read it, the 13th Amendment makes an explicit allowance for slavery:

"Neither slavery nor involuntary servitude, *except as a punishment for crime whereof the party shall have been duly convicted*" (emphasis mine obviously since Markdown didn't exist in 1865)


402 Payment Required

https://developer.mozilla.org/en-US/docs/Web/HTTP/Reference/...

Sadly, development along these lines has not progressed. Yes, Google Cloud and other services may return it and require some manual human intervention, but I'd love to see _automatic payment negotiation_.

I'm hopeful that instant-settlement options like Bitcoin Lightning payments could move us past this.

https://docs.lightning.engineering/the-lightning-network/l40...

https://hackernoon.com/the-resurgence-of-http-402-in-the-age...
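
To make _automatic payment negotiation_ concrete, here is a minimal sketch of what a client-side flow could look like in Python. No such standard exists yet, so the header names and the wallet call below are entirely hypothetical:

    import requests  # third-party HTTP client

    def pay_lightning_invoice(invoice: str) -> str:
        # Stub: a real implementation would hand the invoice to a Lightning
        # wallet and return the payment preimage as proof of payment.
        raise NotImplementedError

    def fetch_with_payment(url: str) -> requests.Response:
        resp = requests.get(url)
        if resp.status_code == 402:
            # Hypothetical: server advertises an invoice in a response header
            invoice = resp.headers["X-Payment-Request"]
            proof = pay_lightning_invoice(invoice)
            # Hypothetical: client retries with proof of payment attached
            resp = requests.get(url, headers={"X-Payment-Proof": proof})
        return resp

The appeal of instant settlement is that the pay-and-retry loop could complete within a single request cycle, with no human in the loop.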


The author makes the following assertion:

    Let me illustrate a common code organization issue some programmers run into on their first day. The novice writes

    terminal.print("Hello world")

    Then they decide they want to make the text red, so they edit their program to

    terminal.print("Hello world")
    terminal.setPrintColor("red")

    And then they're confused that it didn't come out red. They haven't internalized that the first line of code happens before the second. They just get a soup of code on the screen that kind of has the ingredients for a program, and expect the computer to do what they want.

I find this _extremely_ surprising???


What do you find surprising---the fact that it doesn't come out red, or the fact that people are confused by this behaviour?


I actually want lightning for my USB-C phone :/


For what purpose? There are USB-C to Lightning adapters if you were using some specialized Lightning device you couldn't or didn't want to replace.


It is more satisfying to plug in a lightning cable. I know it sounds crazy. I can’t explain it.

I don’t care about charging speeds or data transfer speeds. When it is done, it’s done. Until then I will find something else to do or use it while charging.


That's not crazy at all. If you look at a male Lightning connector, you can see detents at the sides that snap into a (spring-loaded) retaining mechanism on the female side. USB-C doesn't have anything like that, which results in less tactile satisfaction.


USB-C uses an identical spring mechanism, actually, except the cord is the female side, and the port is male.


I don’t think this is correct? GP is talking about the indentations on the side of the male lightning connector, which get gripped from the side by the device. I don’t think the center tab on USB-C has those same features, nor do the cables have the grippy things.

I’d welcome correction. Certainly if those features are there they don’t feel the same. Lightning has a very satisfying snick.


If you look closely, you can see the springy retention clips in the plug. Below is a Super User answer with more details; it includes a link to the USB-C spec, where you can also see the corresponding notches in the male part of the receptacle.

https://superuser.com/questions/1577898/how-does-the-retaini...

Also, some good USB-C cables have a very similar click to Lightning, including Apple's own USB-C cables. Lightning and USB-C are essentially the same design, except USB-C adds an automotive-style shroud around the male side.


Interesting. I stand corrected and thanks for the links! Seems I didn't look hard enough at my USB-C cables. Now I'm curious - like the person I responded to, I always felt that Lightning is more tactile and more consistent than USB-C (including Apple's own USB-C cables) and I wonder why that is. Maybe the Lightning spring is beefier or something?


Lightning plugs in with a pretty hefty thunk, while USB-C gives a light click.


That depends on the cable. Apple's USB-C cables in particular have a very tactile non-squidgy click.


You just want what, magnetic connectorized charging?


I have magnetized adapters for most of my USB C devices. I've had a USB C port fail on a phone in the past.

They are very easy to use and have a satisfying snap when the cord connects.

My only issue with them is that we were recently at Great Sand Dunes National Park and my phone fell into the sand. The magnetic adapter was covered with sand (which wasn't too hard to clean) and very small metallic bits that stick to the magnet. They were difficult and annoying to remove and prevented the adapter from connecting.

I guess on the plus side they protected the original port. I was able to remove the magnetic adapters and charge the phone with classic USB C.

I guess I do have two issues. The adapter on my MBP is very particular about the cable I use. And the adapter that supports high-speed data transfer and charging appears to be directional. Although the plug seems to be symmetric, in practice it doesn't work on both sides.


Actually if they’d put a little magnet at the back of a USB (whatever type) port, that would be satisfying as heck. Like the computer is actively grabbing whatever you plugged in.


I fear magnets inside the connector would draw ferrous debris into the connector. I'd really rather that not happen!


Oh, that is a good point. Springs it is, then.


I love my USB-C iPhone but Lightning was smaller and easier to plug in.


From my experience using various (work-provided) devices in outdoor agricultural use, I consider the Lightning connector/port less prone to failure as well. If something were to break (from torque), it seems like the tab on the cable should snap, or the cable just pull out, before catastrophic damage to the port can occur.

Though I still had to replace cables because the cable itself developed a break somewhere, even with one that had proper stress relief at the ends.

Meanwhile most of the USB C ports on my Lenovo laptop from 2022 are barely working because somewhere along the line either the soldering broke or the port got too loose. Possibly from too much torque but I’m not sure. So the cable has to be at just the right angle. I’ve also done some android phone battery/screen replacement for friends, and had to do a few USB-C ports when it was possible due to the same sort of thing.

However all that is pretty much moot now, thanks to wireless charging and magnetic attachment docks. As such the only time I connect a cable anymore is monthly for cleaning out photos and other data. Previously I’d be connecting cables several times a day to charge in between fields as the battery went to shit. Honestly the “MagSafe” concept is the only change I’ve seen to smart phones in the past decade that I actually really like.


Lightning had small pins inside the port that could be caught by debris and pulled out of alignment (or in worst cases, broken off altogether). USB-C has no moving parts on the device side. Apple was reportedly behind that design, since Lightning was nearing release when design work on USB-C started (and Apple is/was a member of the USB-IF).


> Lightning had small pins inside the port that could be caught by debris and pulled out of alignment (or in worst cases, broken off altogether).

Lightning has 1.5mm of height in the slot; debris has to be pretty large to get stuck, and usually it's enough to just blow some compressed air into the slot to get the dirt to release.

In contrast, USB-C has only 0.7mm between the tab and the respective "other" side, so debris can get trapped much much more easily, and the tab is often very flimsy, in addition to virtually everyone sans Apple not supporting the connector housing properly with the main device housing.


Does anyone have reliability data for USB-C ports? It seems to me like Lightning is more robust to repeated plug/unplug cycles. But this is only on my limited sample size of one laptop with a failed USB-C port and some vague hand waving.


It shouldn't be; my understanding is that the springy bits (the most likely wear part) in Lightning are in the port, whereas in USB-C they're intentionally in the cable so you can replace it. I'm surprised you have a failed USB port; fortunately, I've never experienced one.

I see Lightning as fragile on both sides of the connection, since the port has springy bits that can wear, and the cables also die, either due to the DRM chips Apple involves in the mix for profit reasons, or due to the pins becoming damaged (perhaps this? https://ioshacker.com/iphone/why-the-fourth-pin-on-your-ligh... ).


USB-C has an unsupported tab in the middle of the port. It's pretty easy for that tab to bend or break, especially if the plug is inserted at an angle.

Lightning doesn't have that failure mode. Also Lightning ports only use 8 pins (except on the early iPad Pros), so reversing the cable can often overcome issues with corroded contacts. That workaround isn't possible with USB-C.


I've never seen a device with a broken tab. One thing people seem to grossly misunderstand, which keeps these claims circulating, is that there are thousands of USB-C ports from different manufacturers and price points. The Lightning connector is strictly quality-controlled by Apple. The USB-C port in your Juul isn't the same as the one in a high-end device.

The tab in the USB-C port makes the port more durable since it moves the sensitive springy parts to the cable(s) which are easily replaced.

Quality control matters, Apple is arguably quite good at it. USB-C is more wild-west so if you're prone to buying cheap crap you'll be worse off.


Reversing works around some broken conditions for USB-C; power and USB 2.0 data are on both sides. Depending on how bad the corrosion is, reversing may help.

USB 3 might be trickier, but then iPhone Lightning doesn't have that anyway.


Baseline USB 3 is also single sided. Only some of the extra fast modes use both sides.


The springy bits never wear out anyway. I've never once seen an iPhone that couldn't grip the cable unless the port was full of pocket lint. The main problem I see is that USB-C has both a cable end and a port that are hard to clean.


The springy bits get torqued weirdly by debris and can be bent out of alignment and/or into contact with each other. It’s rare, but it happens. And the whole port needs to be replaced, which usually means the whole device.


The white plastic toothpick found on most Swiss Army knives is perfect for cleaning USB-C ports.


The Lightning port itself might be more reliable, problem is Apple Lightning cables always break, and all third-party ones (even MFi) are prone to randomly not working after a while. I'd be perfectly fine with Lightning if it were an open spec, instead it singlehandedly created the meme of iPhones always being on 1% battery.


The Lightning connector is superior for everyday use. It's exceptionally reliable, tolerant of debris, and difficult to damage. It was designed to last, unlike every single USB device port ever made, which was designed to fail so you'd need to replace the cable and device eventually. MiniUSB, MicroUSB, and USB-C. It's all trash.

Lightning has a perfect mechanical design. The pins phone-side are nearly impossible to damage because they're well supported and only poke out in a bump shape that can't hook on anything. The cable side is the same way: no pins to catch on anything. The port is easy to clean out. The cable end is trivial to clean. The retention mechanism doesn't rely on anything that can wear out or break.

Meanwhile the USB-C connector puts a fucking thin wedge of plastic in the middle of the connector, and even worse, there are pins around that thin center wedge which are easily broken/damaged because they have no protection whatsoever and poor mechanical support. Oh, and the retention mechanism sucks, just like it has in every USB connector before it.

The USB-C port on my AirPods is constantly getting fucked up, while once in a blue moon I need to stick a toothpick in and rummage around a little to get some lint out of my phone's Lightning port, and it's good for a couple more months... and that thing lives in my pocket, whereas the AirPods case spends most of its life sitting around on tables.

It's also a substantial plus that Apple tightly controls the cable spec. Just go look at the pages where people document USB-C cables that are so shitty they'll destroy the electronics in one or both devices.


because you still need a cable with a lightning end in your spaghetti of cables in a drawer somewhere. if all of your devices had USBC on both ends, then you don't need the one cable with the special adapter. you just need USBC cables. this isn't rocket science, and it's not a hard position to be sympathetic with either.


Having everything be USB-C makes sense.

Having everything be Lightning would make sense too, but is infeasible. Lightning was never going to reach near-universal adoption the way USB Mini-B, Micro-B, and now USB-C have.


An adapter might not fit, or is annoying even if it does.


Could be interesting, but I didn't see any /PICTURES OR VIDEOS/ of what I could create with this product.


Do these exercises go with a book?

The only (obvious) option is to begin solving problems.

If someone does not already _know_ OCaml, I fail to see how this is a way to learn.

A better title might be "Practice OCaml"


"When a measure becomes a target, it ceases to be a good measure".

https://en.wikipedia.org/wiki/Goodhart%27s_law

BRB, changing the simulated latency in my bot.


Agreed. Section 3 takes the idea to the extreme -- can a bot replicate human cognition? Traditional OCR CAPTCHAs were a good 'measure' that couldn't be fully gamed. That is, while the rise of computer vision eventually made them ineffective, the gains in computer vision did not come from bot farms.


The article massively undersells the information content of the genome in several key ways. A non-comprehensive list of these (before my morning coffee forgive me) includes:

- DNA methylation (https://en.wikipedia.org/wiki/DNA_methylation)

- Interactions of alleles (what article refers to as the "two versions of each base pair")

- Duplications, deletions, inversions, and other structural variations (https://www.genome.gov/genetics-glossary/Structural-Variatio...)

- Physical proximity interactions in 3-dimensional space (https://cmbl.biomedcentral.com/articles/10.1186/s11658-023-0...)

- Combinatorial effect (massive) of different alleles in complex systems

Overall, it's not sensible to compare a linear sequence of bits, like a CD (sibling comment) or DVD (the article), to the linear sequence of the genome and conclude that their information content, based on length alone, is in any way comparable.
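
For context, the naive figure such comparisons rest on is easy to reproduce; a back-of-envelope sketch in Python, assuming the usual ~3.1 billion base pairs and 2 bits per base:

    haploid_bp = 3.1e9           # approximate length of the human genome
    raw_bits = haploid_bp * 2    # 4 bases (A/C/G/T) = 2 bits each
    print(raw_bits / 8 / 1e6)    # ~775 MB, i.e. roughly CD/DVD scale

The points above are about everything that number leaves out.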


Exactly. The compression level of DNA is orders of magnitude better than anything we can even come close to. DNA usually doesn't even contain specific counts (like 5 fingers on a hand) or sizes of organs and so on; these are given by the processes that run in parallel and cause the cells to hit spatial/chemical/electrical or other limits. It's like putting lots of house builders at specific places where the house should be, and each one would just keep building a wall until he hits another one. There is no compressed house plan; it's a compressed "engine" that builds the result.


Comparing it to machine code on CD/DVD might make more sense then. Machine code where every line has been hand-optimised by nature's hackers over 500 million years.

And in that context, hundreds of MBs is a heck of a lot of complexity.


You put my reaction to this in much more educated terms. I’ve always felt that thinking of DNA as bits was a bit simplistic. Just because we store information as bits it doesn’t mean that nature does.

Not that it means they can’t be right, but the author also doesn’t seem to have any particular expertise in genetics. Their ideas need to survive a lot more criticism by people who know what they’re talking about before you could start to see them as convincing.


The raw bits of the base pairs are just one component of the information, but it's like a maximally compressed version of the info.

The laws of physics are another component.

From there you would need to simulate nature to be able to decompress all the data, like how computer programs can use procedural generation.

Imagine a game like Minecraft. You can generate practically infinitely many screenshots of Minecraft worlds, but all of that data can be derived from the game code, the JVM, and a world seed.
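
A minimal sketch of that idea in Python (illustrative only; this is not Minecraft's actual algorithm): a single small seed deterministically expands into as much world data as you care to generate.

    import random

    def terrain_height(seed: int, x: int) -> int:
        # Derive a height for column x purely from the seed,
        # so the same seed always reproduces the same world.
        rng = random.Random(f"{seed}:{x}")
        return 60 + rng.randint(-5, 5)

    seed = 42                                                  # the "compressed" form: one integer
    heights = [terrain_height(seed, x) for x in range(1000)]   # the "decompressed" form, as large as you like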


> The raw bits of the base pairs are just one component of the information, but it's like a maximally compressed version of the info.

This sounds a bit suspect. A maximally compressed version would be very sensitive to mutations, which wouldn't be great for adaptation via mutations. My understanding is that only a small fraction of mutations lead to unviable phenotypes.

Also, AFAIK the current understanding is that the majority of DNA is "junk", i.e. it doesn't seem to affect the phenotype. That would be a partial explanation for the above.

The process of genetic expression is indeed something like procedural generation, but if maximal compression is about something like Kolmogorov complexity, the produced phenotype doesn't contain more information than the genetic information.


He does mention structural interactions as well as duplications/deletions/inversions. I would argue methylation is more like an annotation of DNA and not part of the DNA itself, but that's a matter of opinion.

In the end, the author literally says: "nobody knows". Yes, you cannot compare a linear sequence of bits to a macromolecule that interacts structurally with its environment, and the author does not make that claim. The question he tries to answer is: how much data is needed to re-create a similar macromolecule that interacts in a similar way. His main point, in which you both agree: only the exons are surely not enough because the encoded proteins are just a (small?) part of how DNA interacts.


Exons are almost like functions, whereas a gene is almost like a class definition. In different tissues in the body, a gene might be alternatively spliced, leading to different protein isoforms. In effect, this makes use of only a subset of the available functions in the class, depending on certain input parameters or how the class is called.
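
As a loose code sketch of that metaphor (the exon names and peptide fragments below are made up for illustration):

    # One gene, four exons; each exon contributes a peptide fragment.
    EXONS = {"e1": "MKT", "e2": "LLV", "e3": "GGA", "e4": "STP"}

    # Different tissues splice different subsets of the same exons,
    # yielding different protein isoforms from one underlying gene.
    SPLICE_PROGRAMS = {
        "muscle": ["e1", "e2", "e4"],
        "neuron": ["e1", "e3", "e4"],
    }

    def isoform(tissue: str) -> str:
        return "".join(EXONS[e] for e in SPLICE_PROGRAMS[tissue])

    print(isoform("muscle"))  # MKTLLVSTP
    print(isoform("neuron"))  # MKTGGASTP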


This is a Star Trek version of the subject, in that it is pure technobabble which happens to mention a few real terms.


As someone with both a biochem and CS background, I found the comment insightful and clear. Zero technobabble to my ears.


What does casting biochem in the metaphor of CS abstractions, in this example, clarify? What does it elucidate? What further predictions does it allow us to make about either subject of the metaphor? Can those predictions be tested? Do they make sense enough for that to be a meaningful question?

Show me how this isn't a more confusing than useful explanation, even for the bright ten-year-old or so at whose level it appears to be pitched, and I'll grant it may have some value.


I find that even if this just provides a lower bound it is still an interesting piece of information.


Yeah...

We know now that environmental factors change how DNA is expressed as well through epigenetics.

I don't know how any of it works. Something to do with the shape of the DNA when it is wound up and how it changes the output when RNA produces proteins.

This is how parents can do things like pass some of the athleticism they earn through training to their children. It is possible for athletic parents to pass genes in such a way that it produces children even more athletic than they were.

All of this means that DNA has the ability to encode information and produce proteins in different ways using the same sequences.

So I am guessing that a lot of the DNA that is considered "junk" may not actually be. They are just missing a piece of the puzzle in how it gets read in.


But all of those emergent effects are accounted for in the DNA sequence [1], so the estimate is fine.

1. Maaaaybe you could make a case for DNA methylation, but that still requires some DNA signatures so ...


> There's less than 80 minutes worth of music's worth of information in our genomes

What an insanely bad take.

Not only did you not read and/or comprehend the article, but the article itself undersells the information content of the genome (I'll post on this at the top level).

> You are not predisposed to be anything.

This does not logically follow from your preceding statement, even if we were to accept the foregoing limited information content as factual.


No matter how much information you think is stored in DNA, the information stored in your brain is at least 5 orders of magnitude more. In comparison with what you have learned, your DNA is a rounding error.

What you can learn will just swamp any predispositions you have.


Looks incredible (Firefox)

