I suspect Five Eyes already has backdoors into Apple and Google and can remotely push compromised updates to specific targeted devices. How do you know the version of Signal you get from the App Store is the same as everyone else's? Australia can compel engineers to implement backdoors, and they can't refuse or even talk about it due to gag orders. Australia can then share that information with the rest of Five Eyes to bypass those countries' own laws, as with ANOM, where the operation was run through Australia specifically to get around the stronger privacy laws of the other countries. The Snowden leaks revealed PRISM, which showed governments getting data straight from Apple and Google servers, and Australia's latest encryption bill lets the government force companies to provide access to encrypted communications. Seems pretty plausible to me.
Don’t agree with this take. Sacrificing performance for all my users so a minuscule percentage of them can poke around a little easier? All the JavaScript is likely transpiled anyway, and they can use dev tools to unminify most of it. Deploying source maps to production might be a better ask?
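For what it's worth, if the build happens to use webpack (an assumption on my part; esbuild, Vite, and Rollup all have equivalent switches), production source maps are roughly one line of config:

```javascript
// webpack.config.js — minimal sketch, assuming a webpack build.
// Other bundlers have equivalents (esbuild's --sourcemap flag,
// Vite's build.sourcemap option).
module.exports = {
  mode: 'production',     // minification stays on for regular users
  devtool: 'source-map',  // emits .map files so curious users can un-minify
};
```

The `.map` files are only fetched when dev tools are open, so regular users pay nothing for them.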
How big is your JS that minification has a measurable performance impact? In my experience, minification doesn't add much (if anything) on top of gzip + caching for most applications.
Yeah, given how compression works, I wouldn't expect minification to reduce transfer sizes to any significant degree; it's basically layering a good compression algorithm on top of a bad one. In general, compression-algorithm matryoshka dolls tend to be inferior in both size and speed compared to just using one good algorithm.
All of the comments on this post that include numbers show that using both is better than either one alone, at least once the code is sufficiently large.
I agree with the recommendation of one good compression algorithm over combining several in general, but isn’t this a case where a very good lossless compressor (gzip) still gains from being combined with a lossy one (minification)? As wonderful as Huffman coding can be, it’s hard to compete with outright throwing away data that’s “unnecessary” (to most users, maybe not to OP and some of us).
It does still sort of come off as majoring in the minors.
As illustrated by the examples in this thread, the actual benefit isn't particularly large, and it comes at the expense of a worse developer experience. At the point where minification would have a real, measurable performance impact, your application is so hideously bloated that there are, without a shred of doubt, other things you could do that would improve its performance far more.
Yep. This is basically like saying "don't distribute compiled binaries"
I'm sure some people here will agree with that sentiment, but for the majority of users it's the better option. And if the goal of the project is for the source to be open then they can host that source somewhere for those who want to tinker, inspect, or build their own binaries.
At some point most web apps will probably be blobs of WASM anyway.
"Sacrificing the performance so a minuscule percentage of them can poke around a little easier?"
Wondering how to reconcile this with the never-ending didactic missives from software developers about writing code that others can easily understand. Generally, the affection for verbose languages and disdain for terse ones.
There is also a trust problem: how can we be sure that minification is only for performance and not also for obfuscation? From the user's perspective, performance is routinely sacrificed to allow for advertising. Users must wait while their computer's resources and network bandwidth are usurped for telemetry, ad auctions, and ads they really do not want to look at, and this is almost always orchestrated using JavaScript.
My HR web service, used to check whether colleagues are on paid leave, has a paginated view of an HTML table of paid leaves, 30 people per page. A single page contains about 2 million space characters, which in the vast majority of HTML rendering are completely ignored (the exceptions being when they actually separate tokens and when they appear in raw text rendered as-is).
There are so many more ways to improve performance without minifying code.
Do you have any strong example websites? The website you linked is very simple, I can’t see why most pages on it would benefit from using JS at all except for analytics or similar. It’s a good website, I don’t mean to make it sound otherwise, but it’s not the type that should consider minifying code in the first place.
Minification is more useful for web applications along the lines of MS Office (e.g. Excel), Zapier, UberEats, Photoshop, or the reporting side of analytics tools (e.g. Google Analytics), where you get all the interactive graphs and maps.
I think many of the disagreements on here would dissolve if people realized they were only considering blogs and similarly small websites.
Self-reply: I see that the website links to tldraw under Job, which might be a good example to weigh the shipped size vs the same thing minified, except that it's already minified.
I actually really enjoy React Native with Expo. The over-the-air updates that let me bypass the app store approval process for minor changes are a game changer. React Native renders to Apple's UIKit, which looks and feels much better than Flutter's attempt to emulate it. It's also much easier to make Android apps look like Android apps with Material Design, while iOS apps look like they're made for iOS. Flutter for web just draws everything into a canvas, so it's not accessible and the resulting file size is massive, while with React Native I can leverage Solito to run the same codebase as a Next.js application on the web.
Don't forget to keep an eye out for drop bears too! They're a particularly aggressive species of koala that are known to jump out of trees and attack unsuspecting humans.
it's definitely a big change in attitude from previous governments (conservative and otherwise) - i still remember Labor trying to introduce mandatory internet censorship a decade ago.
>The draft was open to public comment and 343 submissions were made.
>Of these, only one was in favor of the regulations. The rest either demanded revisions, or were completely against the bill. Despite the cascade of disapproval, the final draft of the law was barely subject to any scrutiny.
>On the last day of sitting before the Christmas period, a revised version of the bill was presented to the Parliament. It featured 173 amendments, but members were barely given any time to review them.
>The official reason was that the laws needed to be rushed through to prevent potential terrorist attacks from happening over the holidays. This was an especially dubious claim, since Australia already has a host of anti-terrorism laws.
>If the authorities required the new capabilities to force companies into building backdoors, then this also should be viewed with skepticism. Since the laws were passed on December 6, it is unlikely that any company would have been able to provide the necessary tools ahead of Christmas.
>On top of this, ASIO, the Australian spy agency, acknowledged there was no specific threat over the period, and it did not raise its warning level.
Absolute bullshit and our spineless "representatives" should be ashamed.
What kind of society are they supposedly defending?
This is really good; I wish I'd seen it before I built something similar. Mine is server-side rendered at the edge and allows any value, but it has nowhere near the functionality of yours. https://github.com/jmcmullen/qr-kit
I'd rather not have to provide my ID and verify my identity, like on certain crypto exchanges, just to watch a YouTube video or like a friend's post on Instagram.
I can't see any other implementation of this working; instead we'd have two popups every time we visit a new site: one for cookies, the other to say "yes, I'm over 18".
No, instead we could firewall the internet off behind a single-digit number of gatekeepers. Think Google and Facebook, with the engineers to build, execute, and secure a system like this. Think Apple and Amazon, who have credit card data to match a name to a digital identity.
This doesn't mean two popups, this means we forfeit the open and distributed nature of the internet in a final "Think of the children" moment.
Ungoogled Chromium. Chromium is open source but still has most of the Google stuff baked in; this project strips that out, and that's about it. I've been using the build available on Homebrew for a few years with no complaints.