This seems... not terrible? The typical counter-argument to any "think of the children!" hand-wringing is that parents should instead install parental controls or generally monitor what their own kids are up to. Having a standardized way to actually do that, without getting into the weirdness of third-party content controls (which are themselves a privacy/security nightmare), is not an awful idea. It's also limited to installed applications, so doesn't break the web.
This is basically just going to require all smartphones to have a "don't let this device download rated-M apps" mode. There's no actual data being provided - and the bill explicitly says so; it just wants a box to enter a birth date or age, not link it to an actual ID. I'm not clear on how you stop the kid from just flipping the switch back to the other mode; maybe the big manufacturers would have a lock such that changing the user's birthdate when they're a minor requires approval from a parent's linked account?
That said, on things like this I'm never certain whether to consider it a win that a reasonable step was taken instead of an extreme step, or to be worried that it's the first toe in the door that will lead to insanity.
The language suggests to me that GitHub would be a covered app store and that a FOSS Linux distribution without an age-gate API would be illegal in California (along with all programs that don't check the age API, e.g. `grep`), so to me it seems quite a bit worse in terms of killing free speech and culture than requiring adult sites to check ID.
Notably, a "covered app store" doesn't seem to need to be... a store. Any website or application that allows users to download software is covered. There's no exemption for non-commercial activity. So every FOSS repo and programs like apt are covered? The requirement is also that developers will request the signal. No scoping to developers that have a reason to care? So vim is covered? Sort? Uniq?
Honestly I can't believe big tech would go along with it. Most of their infrastructure seems like it would clearly be illegal under this bill. Either there's something extremely obvious I'm missing or every lawyer looking at this bill is completely asleep at the wheel.
I hadn't thought about GitHub - I'm guessing the authors of the bill didn't either - but you're right, that is somewhat concerning. Still, I don't think it's the end of the world...
> The requirement is also that developers will request the signal. No scoping to developers that have a reason to care?
I don't see that requirement. Here's the sum total of the developer's responsibilities (emphasis added):
> A developer with actual knowledge that a user is a child via receipt of a signal regarding a user’s age shall, to the extent technically feasible, provide readily available features for parents to support a child user with respect to the child user’s use of the service and as appropriate given the risks that arise from use of the application, including features to do all of the following:
> (A) Help manage which accounts are affirmatively linked to the user under 18 years of age.
> (B) Manage the delivery of age-appropriate content.
> (C) Limit the amount of time that the user who is under 18 years of age spends daily on the application.
It would be nice if it had specific carve outs for things that aren't expected to interact with this system, but it seems like they're leaving it up to court judgment instead, with just enough wiggle room in the phrasing to make that possible.
If your application doesn't have a concept of "accounts", then A is obviously moot. If you don't deliver age-inappropriate content, then B is moot. The only thing that can matter is C, but I'd expect that (a) nobody is going to complain about the amount of time their kids are spending on Vim and (b) the OS would just provide that control at a higher level.
> (b) (1) A developer shall request a signal with respect to a particular user from an operating system provider or a covered application store when the application is downloaded and launched.
> (b) If an application last updated with updates on or after January 1, 2026, was downloaded to a device before January 1, 2027, and the developer has not requested a signal with respect to the user of the device on which the application was downloaded, the developer shall request a signal from a covered application store with respect to that user before July 1, 2027.
Application developers are required to request an age signal from the operating system.
> (c) “Application” means a software application that may be run or directed by a user on a computer, a mobile device, or any other general purpose computing device that can access a covered application store or download an application.
So applications are any program that runs on any computer with the capability to install software from an online source. Ergo, a program like `sort` must request an age signal when it runs.
The bill is clearly thinking in terms of the two big phone monopolists while ignoring computers that are meant to act as useful tools (which contain thousands of programs from tens of thousands of authors and which have no business caring about what "store" they came from or what date they were installed on or anything about the user running them), but it explicitly says it applies to general purpose computers.
A massive improvement would be to say this only applies when there is an actual commercial store involved, and only place requirements on developers to do something if they would have some other requirement they need to comply with. And also realize that lots of applications are not meant to have the user there the whole time. How are batch jobs, or interpreters like `python` that you might leave running overnight on a job, supposed to deal with the time limit? This bill is entirely focused on toys at the expense of computers. It should just place requirements on the actual companies that are causing the issues (social media/adtech, porn, gambling games, etc.).
Your store doesn't distribute social media apps, gambling, porn, etc. and just has things like text editors, music players, PDF readers, etc? No requirements should be needed. You develop a workout tracker? No requirement should be needed.
It's always possible that they'll say it, but it would be a lie based on my reading of this bill. Sideloaded apps can choose whether or not to respect the OS's advice about the age of the user, it's not on the OS or device to enforce them being honest.
Yeah... I'm far from an expert on state-of-the-art ML, but it feels like a new embedding would invalidate any of the layers you keep. Taking off a late layer makes sense to me, like in cases where you want to use an LLM with a different kind of output head for scoring or something like that, because the basic "understanding" layers are still happening in the same numerical space - they're still producing the same "concepts", that are just used in a different way, like applying a different algorithm to the same data structure. But if you have a brand new embedding, then you're taking the bottom layer off. Everything else is based on those dimensions. I suppose it's possible that this "just works", in that there's enough language-agnostic structure in the intermediate layers that the model can sort of self-heal over the initial embeddings... but that intuitively seems kind of incredible to me. A transformation over vectors from a completely different basis space feels vanishingly unlikely to do anything useful. And doubly so given that we're talking about a low-resource language, which might be more likely to have unusual grammatical or linguistic quirks which self-attention may not know how to handle.
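If it helps make that concrete, here's a minimal PyTorch-style sketch of the setup as I understand it - the class, the toy dimensions, and the stand-in blocks are all mine, not taken from whatever the article actually does - where a fresh embedding for the new vocabulary feeds frozen pretrained layers. Whether those frozen layers can do anything sensible with vectors from an unrelated basis is exactly the open question.

```python
import torch
import torch.nn as nn

class SwappedEmbeddingLM(nn.Module):
    """Hypothetical sketch: keep pretrained transformer blocks, but replace the
    bottom (embedding) layer with a fresh one for a new vocabulary."""

    def __init__(self, pretrained_blocks: nn.ModuleList, d_model: int, new_vocab: int):
        super().__init__()
        # Brand-new embedding: its vectors live in whatever basis training gives
        # them, not the basis the frozen blocks were trained against.
        self.new_embed = nn.Embedding(new_vocab, d_model)
        self.blocks = pretrained_blocks
        for p in self.blocks.parameters():
            p.requires_grad = False          # keep the "understanding" layers fixed
        self.lm_head = nn.Linear(d_model, new_vocab)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.new_embed(token_ids)        # (batch, seq, d_model)
        for block in self.blocks:
            x = block(x)                     # frozen layers see unfamiliar inputs
        return self.lm_head(x)               # logits over the new vocabulary

# Toy usage with stand-in blocks (a real setup would load pretrained weights):
blocks = nn.ModuleList([
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
    for _ in range(2)
])
model = SwappedEmbeddingLM(blocks, d_model=64, new_vocab=1000)
logits = model(torch.randint(0, 1000, (2, 16)))  # shape: (2, 16, 1000)
```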
The current holder of that domain is using it to host a single page that pushes anti-vax nonsense under the guise of fighting censorship... but also links to the actual PuTTY site. Very weird mix of maybe-well-meaning and nonsense.
The guy behind that page and bitvise appears to have gone totally crazy during the pandemic. On his blog, he said in 2021 "I forecast that 2/3 of those who accept Covid vaccines are going to die by January 1, 2025."
And in 2022, he wrote "Covid-19 is mostly snake venom added to drinking water in selected locations. There may also be a virus, but the main vehicle of hospitalizations is boatloads of powder, mixed in during 'water treatment.' Remdesivir, the main treatment for Covid, is injected snake venom. mRNA vaccines hijack your body to make more snake venom."
> mixed in during 'water treatment.' Remdesivir, the main treatment for Covid, is injected snake venom. mRNA vaccines hijack your body to make more snake venom
Whaaaaat the fuuuuuuck
Can anyone debug this statement?? I’m not looped into this weird realm of paranoid delusion enough to recognize what they’re referring to here.
> MCP promises to standardize AI-tool interactions as the “USB-C for AI.”
Ironically, it's achieved this - but that's an indictment of USB-C, not an accomplishment of MCP. Just like USB-C, MCP is a nigh-universal connector with very poorly enforced standards for what actually goes across it. MCP's inconsistent JSON parsing and lack of protocol standardization is closely analogous to USB-C's proliferation of cable types (https://en.wikipedia.org/wiki/USB-C#Cable_types); the superficial interoperability is a very leaky abstraction over a much more complicated reality, which IMO is worse than just having explicitly different APIs/protocols.
I'd like to add that the culmination of USB-C failure was Apple's removal of USB-A ports from the latest M4 Mac mini, where identical-looking ports on the exact same device now have vastly different capabilities, opaque to the final user of the system months past the initial hype of the release date.
Previously, you could reasonably expect a USB-C port on an Apple Silicon desktop/laptop to be USB4 40Gbps Thunderbolt, capable of anything and everything you might want to use it for.
Now, some of them are USB3 10Gbps. Which ones? Gotta look at the specs or tiny icons, I guess?
Apple could have chosen to use the self-documenting USB-A port to signify the 10Gbps limitation of some of these ports (conveniently, USB-A is limited to exactly 10Gbps, making it perfect for the use case of having a few extra "low-speed" ports at very little manufacturing cost), but instead, they've decided to further dilute the USB-C brand. Pure innovation!
Meanwhile, the end user likely still has to use USB-C to USB-A adapters anyway, because the majority of thumb drives, keyboards, and mice still require a USB-A port — even ones that use USB-C on the keyboard/mouse itself. (But, of course, that's all irrelevant, because you can always spend 2x+ as much for a USB-C version of any of these devices, and the fact that the USB-C variants are less common than or inferior to the USB-A ones doesn't matter when hype and fanaticism are more important than utility and usability.)
As far as I know, please correct me if I'm wrong, the USB spec does not allow USB-C to C cables at all. The host side must always be type A. This avoids issues like your cellphone supplying power not just to your headphones but also to your laptop.
No, you're thinking about USB-A to USB-A, which is definitely prohibited by the spec. (Whereas USB-C to USB-C cables are most certainly not disallowed.)
What's disallowed is for a non-host to have USB-A, hence, USB-A to USB-A is impossible, because one side of the cable has to be connected to a "device" that's not acting in host mode.
Only the host is allowed to have USB-A.
This is exactly why USB-A is superior to USB-C for host-only ports on embedded devices like routers (as well as auxiliary USB ports on your desktop or monitor).
Generally, many modern travel routers have one USB-C and one USB-A port. Without any documentation or pictograms, you can be relatively sure that the USB-A port is used for data and the USB-C port is for power (hopefully through USB-PD), since USB-A couldn't possibly be used to power up the router: USB-A is a host-only port.
USB-C is great for USB-OTG and the bidirectional modes, where the same port can be used for both the host and the peripheral device functions, like on smartphones. https://en.wikipedia.org/wiki/USB_On-The-Go
If the port can ONLY be used in host-mode, and does NOT support Alt Mode, Thunderbolt, or bidirectional USB-PD, then USB-A is a far more fitting connector, to signify all of the above.
Uptime and reliability are not the same thing. Designing a bridge doesn't require that the engineer be working 99.9% of minutes in a day, but it does require that they be right in 99.9% of the decisions they make.
Your first example has to do with the fact that tuples are copied by value, whereas lists are "copied" by reference. This is a special case of an even larger (IMO) misfeature, which is that the language tries very, very hard to hide the concept of a pointer from you. This is a rampant problem in memory-managed languages; Java has similar weirdness (although it's at least a bit more consistent since there are fewer primitives), and Go is doubly odd because it does have a user-controllable value vs. pointer distinction but then hides it in a lot of cases (with the . operator working through pointers, and anything to do with interfaces).
I think the whole thing does a disservice to novice or unwary programmers. It's supposed to be easier to use because you "don't have to worry about it" - but you really, really do. If you're not familiar with most of these details, it's way too easy to wander into code that behaves incorrectly.
> This is a special case of an even larger (IMO) misfeature, which is that the language tries very, very hard to hide the concept of a pointer from you.
When I came to Python from Perl, it only took me about one day of Python programming to realize that Python does not have references the same way that Perl does. This is not flame bait. Example early questions that I had: (1) How do I create a reference to a string to pass to a function? (2) How do I create a reference to a reference? In the end, I settled on using a list of size one to accomplish the same thing. I use a similar trick in Java, but with an array of size one. In hindsight, it is probably much easier for junior programmers to understand the value and type system in Python compared to Perl. (Don't even get me started about the readability of Perl.) Does anyone still remember the 'bless' keyword in Perl to create a class? That was utterly bizarre to me coming from C++!
> Your first example has to do with the fact that tuples are copied by value, whereas lists are "copied" by reference.
My mental model for Python is that everything is '"copied" by reference', but that some things are immutable and others are mutable.
I believe that's equivalent to immutable objects being 'copied by value' and mutable ones being '"copied" by reference', but "everything is by reference" more accurately reflects the language's implementation.
Yeah, I know that's how it works under the hood - and why you have things like all integers with values in [-5, 256] being assigned to the pre-allocated objects - but I don't think it's a particularly useful model for actually programming. "Pass-by-reference with copy-on-write" is semantically indistinguishable from "pass-by-value".
There is no copy on write and no pass by reference.
Python is "pass by value", according to the original, pedantic sense of the term, but the values themselves have reference semantics (something that was apparently not contemplated by the people coming up with such terminology — even though Lisp also works that way). Every kind of object in Python is passed the same way. But a better term is "pass by assignment": passing a parameter to an argument works the same way as assigning a value to a variable. And the semantic distinctions you describe as nonexistent are in fact easy to demonstrate.
The model is easy to explain, and common in modern programming languages. It is the same as non-primitive types in Java (Java arrays also have these reference semantics, even for primitive element types, but they also have other oddities that arguably put them in a third category), or class instances (as opposed to struct instances, which have value semantics) in C# (although C# also allows both of these things to be passed by reference).
The pre-allocated integer objects are a performance optimization, nothing to do with semantics.
The model is useful for programming, because it's correct. We know that Python does not pass by reference because you cannot affect a caller's local variable, and thus cannot write a "swap" function. We know that Python copies the references around, rather than cloning objects, because you still can modify the object named by a caller's local variable. We know that no copy-on-write occurs because we can trivially set up examples that share objects (including common gotchas like https://stackoverflow.com/questions/240178).
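Concretely (a minimal sketch; all the names are mine):

```python
def swap(a, b):
    a, b = b, a              # rebinds the local names only

x, y = 1, 2
swap(x, y)
print(x, y)                  # 1 2 -> not pass-by-reference, so no swap()

def append_three(lst):
    lst.append(3)            # mutates the object the caller also references

nums = [1, 2]
append_three(nums)
print(nums)                  # [1, 2, 3] -> the reference was copied, not the list

def remember(item, seen=[]): # the shared-default gotcha from the SO link above
    seen.append(item)
    return seen

print(remember('a'))         # ['a']
print(remember('b'))         # ['a', 'b'] -> no copy-on-write; same list both times
```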
> I don't think it's a particularly useful model for actually programming
I think “everything is by reference” is a better model for programming than “you need to learn which objects are by reference and which are by value”. As you say, the latter is the case in Go, and it’s one of the few ways the language is more complex than Python.
You could argue that in Python you still have to learn which objects are mutable and which are immutable - but if it weren’t for bad design like `+=` that wouldn’t be necessary. An object would be mutable if and only if it supported mutating methods.
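For anyone who hasn't been bitten by it, the `+=` wart in a small sketch (nothing here beyond stock CPython behavior):

```python
a = [1, 2]
b = a
a += [3]      # lists define __iadd__, so this mutates the shared object in place
print(b)      # [1, 2, 3]

c = (1, 2)
d = c
c += (3,)     # tuples don't, so this falls back to c = c + (3,) and rebinds c
print(d)      # (1, 2)

# Same spelling, two behaviors depending on mutability - which is why you end
# up needing to know which types are which after all.
```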
> I think “everything is by reference” is a better model for programming than “you need to learn which objects are by reference and which are by value”.
The model is: everything is a reference (more accurately, has reference semantics); and is passed by assignment ("value", in older, cruder terms).
> but if it weren’t for bad design like `+=` that wouldn’t be necessary. An object would be mutable if and only if it supported mutating methods.
SF city and county are actually the same legal entity, not just the same land. It's officially called the City and County of San Francisco, and it's just as unusual as it sounds. The mayor also has the powers of a county executive with both a sheriff's department (county police to run the jails) and police department (city law enforcement) reporting to him; the city government runs elections like other counties; the Board of Supervisors - which is the typical county legislative structure - also serves as city council. (Denver, Colorado works the same way, I think.)
I don't think that's the point. If non-technical people are able to make a product happen by asking a machine to do it for them, that's fine. But they're not engineering. It simply means that engineering is no longer required to make such a product. Engineering is the act of solving problems. If there are no problems to solve, then maybe you've brought about the product, but you haven't "engineered" it.
I don't think that memorizing arcane Linux CLI invocations is "engineering" either, to be clear.
If you hired people to build that product, you never wrote a line of code. No, you didn’t build it. Your team did. You’re not magically a software engineer; you hired someone else to do it.
Is there a product? Yep. Do you own it? Maybe. But again, you’re not suddenly the engineer. A project manager? Maybe.
That's why I used the word create. I would be responsible for the creation of the product, so imo I created it. I'm the creator. It wouldn't exist without my vision, direction, and investment (of time and/or money).
Like a movie producer: they don't actually "build" the movie. They use their money to pay people to manifest a movie, and at the end of it they have created a movie and get a share of the profits (or losses) that come with it.
No, they shouldn't call themselves cinematographers, but they can say that they "produced" the movie and nobody takes issue with that.
> Do you own it? Maybe.
If I paid for it then absolutely I own it. I get to keep the future profits because I took the risk. The people that "built" it get nothing more than what I paid them for their labor (unless I offered them ownership shares).
i think people are trying to make this difficult when it’s honestly super simple.
yes, you can make a product. no, it does not suddenly magically make you a musician.
you did the equivalent of hiring someone else to do it. you did not do it.
if you claim you wrote the novel, you’re lying. someone else did. if someone takes credit for work someone else did, they’re lying. it’s honestly not complicated. at all.
you're not countering what i'm saying, so i think we agree.
i'm just adding that (as an "engineer") i don't care what you call me, or what i call myself, because nobody cares and it doesn't matter. i'm commodified labor. replaceable. with no claim on anything. and nobody will ever agree on the correct title anyway.
yeah, it sounds like we both agree with the original post.
it literally doesn’t make someone an engineer.
it's not difficult to understand, but for some reason when it's said it pisses certain people off.
i suspect many of the people upset want to convince themselves they’re suddenly magically a musician, architect, engineer, novelist, programmer, etc… when it just couldn’t be further from the truth. they’re just doing the equivalent of sending a dm to a coder friend and the friend is the actual programmer.
i think some people don’t appreciate being told the truth.
It's not just time-value. It's also not just tying/advertising (although it is some of that - if I'm getting a ton of "free" points to American, I'm more likely to fly with them). It's both of those, and so much more.
Loyalty points work like gift cards in that huge numbers of them go unredeemed for any value, so selling them is just printing money. And unlike gift cards, which are typically denominated in currency, airline points don't have a fixed exchange rate to USD, so the airline can sell them to Chase or whatever for $0.01, and then if it needs to rebalance the books to shed the outstanding liability it can easily adjust the point costs of flights to make them only worth $0.009 - it's the same as a price hike, but in a way that's less noticeable to most customers most of the time. And that's assuming they don't just sell the points at an outright profit to begin with.
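To put rough numbers on that (the cent-per-point figures are the ones above; the volume is invented purely to show the shape of the math):

```python
# Illustrative only: per-point prices from the comment above, volume made up.
points_sold = 1_000_000_000             # points sold to a card issuer
cash_in = points_sold * 0.010           # sold at 1.0 cent/point -> $10,000,000

liability_before = points_sold * 0.010  # awards priced so a point redeems at ~1.0c
liability_after = points_sold * 0.009   # quietly reprice awards to ~0.9c/point

print(f"cash received up front:   ${cash_in:,.0f}")
print(f"award obligation shed:    ${liability_before - liability_after:,.0f}")
```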
You can find a number of analyses showing that airlines operate at a loss if you set aside the miles-economy revenue streams. United famously got a line of credit secured against their loyalty program in 2020, in which they and their creditors valued the loyalty program at more than the value of the entire company of United Airlines - which would naively imply that the actual airline, the part of the company that owns large expensive machines and actually sells a product to consumers, had negative value.
The tradeoff on short domestic flights is that it encourages more - and larger - carry-ons, which slows down boarding/deplaning and therefore adds to turnaround time. If I didn't have to pay for checked bags, I'd often prefer to have mine checked, especially if I have a connection - but since I do, I'll squeeze everything into a carry-on roller bag instead. Personally, it only takes me an extra second or two, but when you have a whole family doing this and only one parent who can actually reach the overhead bins, it bogs down the whole aisle.
This is why I love it when airlines charge for carry-on bags, like Spirit does. Everyone just has a teeny little backpack. Getting on and off is a breeze.