The flag is -Wformat-nonliteral or -Wformat=2. -Wformat-security only includes a weaker variant that will warn if you pass a variable and no arguments to printf.
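For a concrete feel of the difference, here's a minimal sketch (the function and variable names are just illustrative):

```c
#include <stdio.h>

void log_message(const char *user_supplied, int value)
{
    /* Non-literal format string, no arguments: both -Wformat-security
       and -Wformat-nonliteral / -Wformat=2 warn here. */
    printf(user_supplied);

    /* Non-literal format string, but arguments are passed: only
       -Wformat-nonliteral / -Wformat=2 catches this one. */
    printf(user_supplied, value);
}
```

Compile with `gcc -c -Wformat -Wformat-security file.c` vs. `gcc -c -Wformat=2 file.c` to see which call each flag complains about.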
5 years seems low for Apple device longevity. I think there's an enormous difference between today's Macs and Macs from 5 years ago, and maybe that leap is accelerating deprecation, but there was a long period of stagnating macOS requirements. Hope we return to that for Apple Silicon Macs.
It depends on when you buy vs. when it was first released. The 2014 Mac Mini, for example, was sold until late 2018, so it's perfectly possible for someone to have bought it in Sept. 2018, which means only 3 years of new macOS versions, as it didn't get Ventura. I think the model with the best supported-OS track record from Apple is still the Mac Plus from 1986, which was supported all the way through 7.5.5 in 1996.
> We could stop right here but this suffers from overflow limitations if translated into C.
FWIW, the final version also suffers from integer overflow limitations. If the difference between a and INT_MIN/INT_MAX (depending on whether you floor or ceil) is <= b/2, you will have integer overflow.
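To illustrate (assuming the final version is along the lines of the usual "add half the divisor before dividing" trick, which is where this failure mode shows up):

```c
#include <limits.h>
#include <stdio.h>

/* Round-to-nearest division via the common add-half-then-divide pattern.
   Simplified to a >= 0 and b > 0. */
static int div_round_nearest(int a, int b)
{
    return (a + b / 2) / b;  /* a + b/2 overflows once a > INT_MAX - b/2 */
}

int main(void)
{
    /* a is within b/2 of INT_MAX, so the addition is signed overflow (UB),
       even though the mathematically correct quotient fits easily in an int. */
    printf("%d\n", div_round_nearest(INT_MAX - 100, 1000));
    return 0;
}
```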
In context, it sounds like they relied on simulations that don’t use exact numbers. I’m guessing that they saw an IEEE-754 floating-point infinity and then had to determine whether they got it because the accurate result was infinity or if the infinity they saw was the result of floating-point calculation artifacts.
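As a toy illustration of that ambiguity (nothing to do with their actual code, just the IEEE-754 behaviour):

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* "Genuine" infinity: under IEEE-754 semantics, dividing a nonzero
       finite value by zero yields an infinity by definition. */
    double zero = 0.0;
    double diverges = 1.0 / zero;

    /* Artifact: exp(1000) is a perfectly finite number (about 2e434),
       but it exceeds DBL_MAX, so the computed result overflows to inf. */
    double overflows = exp(1000.0);

    printf("%f %f\n", diverges, overflows);  /* both print "inf" */
    return 0;
}
```

Both look identical after the fact, so you have to reason about where each inf actually came from.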
I feel that it’s so simple that many people will overlook that it even exists. In languages that have both, it’s hard for functions to compete with operators. I don’t think that this is the best design to promote correctness.
Maybe, but providing simple functions for "obvious" operations, whether to promote correctness, make things easier for the compiler, or just for convenience, is not uncommon at all. Most languages have a min/max function somewhere, sometimes built-in, sometimes in the standard library, even though it is trivial to implement. C is a notable exception, and it is a problem because you have a lot of ad-hoc solutions, all with their own issues.
If you look at GLSL, it has many functions that do obvious things, like exp2(x), which does the same thing as pow(2,x), and I don't think anyone has any issue with that. It even has a dedicated "fma" operation (fma(a,b,c) = a*b+c, precisely) that solves a similar kind of problem as the overflowing average.
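For the integer case, the classic overflow-free average looks something like this (a sketch using the well-known bit trick; it's roughly the problem C++20 added std::midpoint to solve):

```c
#include <limits.h>
#include <stdio.h>

/* floor((a + b) / 2) without the intermediate a + b ever overflowing.
   Identity: a + b == ((a & b) << 1) + (a ^ b), so halving gives
   (a & b) + ((a ^ b) >> 1). For signed ints this relies on >> being an
   arithmetic shift for negative values, which is implementation-defined
   in C but what mainstream compilers do. */
static int average_floor(int a, int b)
{
    return (a & b) + ((a ^ b) >> 1);
}

int main(void)
{
    printf("%d\n", average_floor(INT_MAX, INT_MAX - 2));  /* INT_MAX - 1 */
    return 0;
}
```

Having one blessed function for this would kill off a lot of subtly wrong ad-hoc versions.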
The “battle of preparedness” for grandmasters looks miserable. My experience as someone bad at chess is pretty great, though.
If anyone’s looking to pick up chess, it’s a pretty good time to do it, even if the people making a living out of it aren’t enjoying it anymore. It used to be that computers would only crush you without helping, but now they’re able to point at your mistakes and show you where the game shifted from one player to the other.
Yes, but your 1995 Windows NT 4.0 PC ran a 640x480 display at 60Hz and graphics compositing had, at best, one-bit transparency. It took 3 minutes to boot. Websites could bluescreen it with `<img src="con">`. A malicious attachment could trick your email client into deleting your whole hard drive.
I think the interesting question is how much we're "paying" computationally for each of those things.
The "img src=con", and to a lesser extent, the "malicious attachment" thing could be solved on the same PC by running something not-Windows. 1995 might have been a bit in the teething era for Linux and BSDs but maybe some commercial Unix would have been viable?
The "3 minutes to boot" would be largely ameliorated by using an SSD and by the fact that the phone is largely a fixed hardware tree that doesn't require significant probing and dynamically selecting drivers at boot time.
Getting to a higher resolution and colour depth: okay, maybe you need to advance to, say, the specs of a decent 2005 PC (1 GB memory, early x86-64 CPU, DirectX 9-class GPU) to get there.
But beyond that, I think we're paying mostly for poor software design. How many apps are loading big full-featured browser engines when all they need is libcurl and some minimal optimized rich-text system? How many apps are relying on dynamically loading content that could have been permanently baked into the bundle (e.g. a shopping app's category tree)?
Yes, we're all collectively paying for poor software design by having to compensate with excellent hardware. That should tell you that "good" software is really hard to make, but "good enough" software is not if you have sufficient hardware.
If you think that's bad, or that it means most developers can't write good software, that's the price we have to pay for innovation. As an analogy, it'd be great if manufacturing were always priced at mass-production injection-molding costs, but someone has to 3D print and hand-assemble prototypes to prove that the idea is possible and worth making. In software, we just stop at the prototyping stage and make that the product, hoping that eventually we'll have enough time and money to redesign for mass-manufacturing scale.
My NT machine ran 1280x1024, which is about 2/3 of what my phone does. More Hz is the GPU's job and doesn't take more memory. And I'm pretty sure it took under a minute to boot; the boot speed would compete very well with my phone if the drive were migrated to a $25 SSD.
The security was different, but I don't think that's one of the major factors here.
This is fairly old. When it came up last time, there were robust arguments that xz was characterized unfairly, and that the author’s format wasn’t very good at recovering in most cases either.