
This is so incredibly Apple :)

The breakage, I mean. To clarify a bit, for better or for worse, this is what Microsoft does, totally different psychology: https://blogs.msdn.microsoft.com/oldnewthing/20031223-00/?p=...




So Unix people had this function called `gets` that was defined like this:

    char *gets(char *at);
In the early days, if you wanted a string you could do:

    x = gets(sbrk(0));
    brk(x + strlen(x) + 1);
And this is perfectly safe, but it is perhaps the only safe way to use `gets`. See, most people wanted to write:

    char buf[99];
    x = gets(buf);
And this is not safe, because `gets` doesn't know that `buf` only has room for 99 bytes.

The API provider has a choice:

a) They can make it harder to do the wrong thing: make `gets` generate errors, warnings, crash if you use it, etc. This requires people to fix their programs. That's what GNU did for `gets` and it's what Apple is doing here (see the fgets sketch below for what that fix typically looks like).

b) They can change the rules: It's possible to modify the ABI and require markers or a mechanism to detect where the edge of the buffer is. This requires people recompile their programs. I think Zeta-C had "fat pointers" so gets should be safe there[1]

c) They can work around it: If you have enough resources, you can look at the programs using this API, figure out what their buffer sizes are, and hardcode that into the API provider. Microsoft has famously done this for SimCity.[2]

That's it. They can't really do anything else: the function is difficult to use right, and programmers do the easiest, most obvious thing they can: "oh, I need to get a string, so I'll use gets... but I need two strings, so..."
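
For what it's worth, here's a minimal sketch of what option (a) pushes you toward on the program side: replace gets with fgets, which takes the buffer size. (The 99-byte buffer just mirrors the example above; error handling kept to a minimum.)

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char buf[99];
        if (fgets(buf, sizeof buf, stdin) != NULL) {  /* reads at most 98 chars + NUL */
            buf[strcspn(buf, "\n")] = '\0';           /* drop the newline fgets keeps */
            printf("got: %s\n", buf);
        }
        return 0;
    }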

Anyway, I'm not aware of any good rules for choosing which way to go: Less total effort seems like a good metric, but this is very hard to estimate when you live on an island and don't know how other people use your software.

Memory corruption is serious though: It's often very easy to turn into a security hole, so I generally advocate doing something. All of the people just disabling this security feature make me nervous. I wonder how many of them run websites that I use...

[1]: http://www.bitsavers.org/bits/TI/Explorer/zeta-c/

[2]: https://news.ycombinator.com/item?id=2281932


Wow, somebody remembers Zeta-C! Yes, you're right, 'gets' was safe in Zeta-C. (I'm the author.)


Modern brk() can return -1 to indicate there's insufficient space above the current break address to allocate the requested additional space. I assume that "in the early days" brk() could also fail to allocate space (after all, the space available is not infinite), so "perfectly safe" comes with caveats.
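
(For reference, since the failure modes are the point here: sbrk reports failure by returning (void *) -1 and brk by returning -1, so "check the return value" looks roughly like this sketch; the one-page increment is just an example.)

    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        void *old_brk = sbrk(4096);        /* try to grow the heap by one page */
        if (old_brk == (void *) -1) {      /* sbrk's failure convention */
            perror("sbrk");
            return 1;
        }
        if (brk(old_brk) == -1) {          /* brk's failure convention; shrink back */
            perror("brk");
            return 1;
        }
        return 0;
    }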


Surely that gets(sbrk(0)) will not work since sbrk(0) returns a pointer to unmapped memory. Maybe you wanted sbrk(BUFSIZ)?


Well, no actually: That still limits you to a gets of BUFSIZ.

Just as we often grow the stack with page faults, you could grow the heap with page faults: Modern UNIX doesn't, because this is another one of those "almost certainly a mistake" cases, but:

    void grow(int sig) { sbrk(PAGESZ); }
    /* ... then, during setup: */
    signal(SIGSEGV, grow);
should work.
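
If anyone wants to poke at it, here's a rough self-contained toy of that idea. Strictly speaking, returning from a SIGSEGV handler after a real fault is undefined behavior and sbrk isn't async-signal-safe, so this is a demonstration of the concept, not something to ship; PAGESZ is just a stand-in for the system page size.

    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>

    #define PAGESZ 4096                    /* stand-in for the real page size */

    static void grow(int sig) {
        (void) sig;
        sbrk(PAGESZ);                      /* push the break out one more page */
    }

    int main(void) {
        signal(SIGSEGV, grow);             /* a fault past the break grows the heap */
        char *p = sbrk(0);                 /* start writing at the current break */
        for (long i = 0; i < 3 * PAGESZ; i++)
            p[i] = 'x';                    /* each fault maps another page */
        printf("wrote %d bytes past the old break\n", 3 * PAGESZ);
        return 0;
    }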


fork() is not gets, and you don't f*ck with process creation. If a programmer commits an error that deadlocks or corrupts data by using fork() unsafely, they really should have known what they were doing in the first place.


I honestly prefer an early crash to random memory corruption


Apple isn’t breaking old/existing binaries; the new behavior only applies to binaries compiled against the 10.13 SDK.


Is that so much better? I dread every new release of macOS because it always breaks some part of my toolchain (either gcc and friends, or valgrind, or whatever). It wasn't so bad when a new OS came every few years. Now they break stuff every year. Six months since I bought a new machine and my gdb still isn't working quite right.


Apple is not responsible for every piece of third party software, especially niche tools like gcc, gdb and valgrind. It is up to the maintainers of those tools to keep up with new macOS releases. Beta releases of macOS are made available early to developers for exactly this reason.


> It is up to the maintainers of those tools to keep up with new macOS releases.

Why should stable tools that have been working for decades constantly have to change their source code to keep working on Apple's OS, a piece of software whose primary task is running software the user wants to run, no less?

Apple should make sure the user's software runs on its OS, not the other way around. Stop apologizing for what is essentially the world's wealthiest corporation being lazy and putting the maintenance cost of its own OS on everyone else.


Backwards compatibility has a high cost over the long term - if you're not supporting it, you're free to make your software better at your own whim. That's a strong position to be in.

I would say the primary purpose of macOS isn't to cater to everyone; it's to make using a computer a pleasurable experience for the non-technical. It's about bringing computing to the masses.

If you're in a masochist niche, good for you, but you're better off on Linux.

For me, macOS is pleasurable for the most part, and I'm technical enough to work around any problems I face. On Windows/Linux I have to do that for features that aren't even nearly as advanced.


EDIT: It would be great if downvoters took the time to indicate which part specifically they think is wrong. I've edited the post to remove the unnecessarily antagonizing parts. Remember that this is merely my opinion.

> features that aren't even nearly as advanced

Windows, while not UNIX-based (which can be an actual problem for technically inclined users, e.g. developers), is vastly more advanced than macOS, especially for business users. Generally, for mainstream users, Windows and macOS are both intuitive after enough time spent using them; what's off-putting is switching from one to the other (e.g. shortcuts, etc.). Window management (I mean apps' windows, not the OS) is notoriously bad on OSX these days compared to most other desktop environments.

Likewise for Linux: certainly the least user-friendly, but second to none for advanced stuff (e.g. deep learning, or 'edgy' virtualization involving, for instance, GPU pass-through).

Don't get this the wrong way: macOS is OK for the masses, but it's on par with Windows 10 nowadays.

This is a personal opinion based on using all three OSes daily, and on observing friends/family using each of them. As long as you make the right choice, there's no 'better' OS, just different pros and cons that suit each user more or less.

Most notably, I now have to troubleshoot my mother's workflow on OSX/iOS (simple stuff, mostly related to printing and sharing pictures/scans); it wasn't so a few years ago. As of 2017, I personally have a much simpler experience out of the box on Android+Windows.


You’re getting downvotes because you’re making statements that

a) show complete ignorance about modern OS design & technology and

b) conflate how “advanced” an OS is with your & your mother’s UI preferences.


I am sorry but I fail to understand your arguments. I assure you that I am humbly trying to understand and would like this digression to move past sarcasm. I'm guilty of the first strike in that regard, but I feel it's getting in the way of the discussion at this point.

a) I am willing to accept your statement (I am no OS expert and have no formal CS training; I'm just a developer of rather high-level software, and I've only been tinkering at home with computers for a short couple of decades); however, please note that I was merely opposing the parent post's implication that macOS is the most advanced OS.

I suppose this is perhaps a matter of perspective: I define and judge "advanced" here not from a CS standpoint but rather from a real-world pragmatic standpoint: does it do the job, for whom, and how well? I observe that macOS isn't dominant in business nor in server rooms of any kind, and that Linux is pretty much the only relevant solution for most cutting-edge computing projects. Please help me understand how macOS has more "advanced features", as stated by the poster I was replying to. I sincerely fail to see what macOS has on Linux or Windows nowadays. I, for one, can't do anything better on it.

b) I see your argument as slightly derogatory, but let's move past that. Surely you understood that using anecdotal arguments, invoking my mother of all users (!), had the evident purpose of downplaying my opinion to just that: an opinion, not a scientific judgment about the advancement of an OS; thereby implying that the parent post I was replying to had no more grounds than mere subjective opinions for its statements. At least, none that I could find. There is no conflating of anything, but perhaps that was due to bad wording on my part, in which case I understand the negative reaction (but I stand by my opinions: I vastly prefer the Windows 10 UX to macOS as of 2017, and I should perhaps add that I was a 100% Mac user from 2008 to 2016, with the notable exception of casual gaming, which I've since quit and which doesn't even factor into my current opinion).

I'd gladly hear answers about the respective advancement of each major desktop OS because I'm truly interested in the matter, if only from a dev perspective (and obviously as a consumer/user).


> I define and judge "advanced" here not from a CS standpoint but rather from a real-world pragmatic standpoint

Which is completely subjective and also not what “advanced” is usually used in reference to when it comes to OSes. Perhaps you meant “intuitive”?

Either way it’s subjective so you could have just distilled both your posts (and points) to this:

“Personally, I don’t like it.”

That’s fine. You do you. No harm no foul. Would have saved everyone the essays & you typing them.

Related: you... overwrite. You’re incredibly verbose for the amount of data you’re delivering. That can come across as patronizing or condescending. To use a $5 word: you bloviate.

I don’t say this to belittle; it’s merely feedback & trying to help. Tone can be hard in text.


I upvoted you for pointing out a relevant problem in my communication. I agree and do/will try to improve. Thanks for the feedback.

(In this case I wanted to be as formal as possible to convey respect. English isn't my mother tongue so I may tend to overdo it).


c) complaining about downvotes


Ha, obviously. : ) I know that. But I think I was careful enough to word my edit as "please explain"; nowhere do I complain about the downvoting itself. I accept it, period. I am genuinely interested to know which part of my opinion is flawed.


It never helps, does it?


When shit breaks in the new version, people should stop upgrading to the latest macOS!

Apple has found that they don't need to put effort into API backwards compatibility because people are still freaking upgrading. It costs money to ensure backwards compatibility, so not doing it is desirable. The end user has to put pressure on Apple to do it.


You missed the fact that fixing problems and not stacking technical debt for the sake of "backwards compatibility" is also desirable.


Any particular reason you decided to quote backwards compatibility, as if it's not a real thing?


Quotes are not only used to denote non-real things -- if anything that's a quite recent (couple decades) trend.


"recent"


Well, with the advent of irony it really took off.


Developers need to support Apple only as much as Apple deserves, and really, does Apple deserve much?

On the hardware side, there's the Mac Pro (not innovative? how about just making one workstation for video production that's actually valuable) and the Touch Bar. On the software side, Final Cut Pro over the past 4 years, and the crashes I see in my apps. Then macOS's move to iOS touch gestures on the trackpad drives me crazy, since I use it in docked mode. And the macOS and OS X thing that has always been just incredible is the OPTIONAL case sensitivity. If you haven't had to work with Adobe products on a Mac this might not seem like such a big deal, but ugh.

It blows my mind every time I have to use macOS for some job. Sure, they have an optionally case-sensitive command line that works almost like a Unix (see: optional case sensitivity, in 2017), but it really is just a mess unless your workflow is like everyone else's. No flexibility if you're working with a team.


You have to break things occasionally or you end up in backwards-compatibility hell where every bug is a feature that is absolutely essential to somebody's workflow.

https://xkcd.com/1172/


Well, it depends on your point of view.

Microsoft, and Linux (just the kernel I'm talking about here), have both decided the point of an OS is to run programs, so with each update they both make heroic efforts to not break userspace, often adding code just to make sure old programs that did weird things don't break.

Apple have decided to go a different route, and leave a trail of programs just a few years old that are forever unrunnable, as they won't even distribute old copies of their OSes. However, it seems many users are willing to accept that trade-off, as we can see from Apple's success.


> they won't even distribute old copies of their OSes

They do, actually; if you bought a copy it should be available in the App Store. They even sell physical installers for old operating systems: https://www.apple.com/shop/product/MC573Z/A/mac-os-x-106-sno...


That was the policy for several years, but this year they changed it - the Sierra installer is no longer in the App Store after you upgrade.


gcc isn't exactly a "niche tool".


It isn't the official compiler on Apple platforms either; that ship sailed a long time ago.

Google has also shown it the door on their OSes.

Lovers of BSD licenses will miss FSF software in a couple of years.


The really sad part is that as their freedom decreases they seem to just accept it as a new normal rather than as a cost that they chose to pay, and the people who know or even sometimes just remember things differently end up ignored as if they were ranting about some crazy impossibility.


Let's take a look at an alternate reality where Clang didn't exist, and GCC was the only compiler worth using–wait, we don't have to do that. Just look at Bash. It's neglected and stuck on 3.2.57 forever because Apple doesn't want to deal with GPLv3. Do you prefer that to the BSD-licensed solution that LLVM is, where it's still open source and actively maintained, rather than left by the wayside because Apple just refused to play nice?


Bash is doing just fine; it's not stuck anywhere (4.4.12 on my machine). Mac users are neglected and stuck, but Bash isn't.

> Do you prefer that to the BSD-licensed solution that LLVM is, where it's still open source and actively maintained, rather than left by the wayside because Apple just refused to play nice?

I'd rather have the time, money and energy go into a free software commons. Anyone contributing to LLVM is enabling proprietary software. No thanks.


> I'd rather have the time, money and energy go into a free software commons. Anyone contributing to LLVM is enabling proprietary software.

Am I clear in understanding that your statement is "people contributing to this piece of free software are not contributing to a free software commons"?

Because that makes no sense to me, and it's hard to avoid interpreting the dismissal as pure zealotry.


They are also contributing to proprietary software that uses their software.

Being against that is a dogmatic approach that accepts no contributions to the cause that also help other causes, because only the cause may win.

I'm not really for it, but I understand where RMS and his allies are coming from. Proprietary software makers tend to view their competitors, free or otherwise, as an existential threat, so it is "fair" for the free software folk to view them via the same lens.


The part you are overlooking is the contributions that OEMs stop making, because thanks to clang they no longer have to.

Not everyone contributes back to LLVM, thanks to its license.

Just like the *BSDs hardly see the same contributions from all those embedded devices that Linux enjoys.


People using BSD licensed code have an incentive to contribute changes that aren't "secret sauce" to make maintaining their forks easier. Sony has pushed code they wrote for the PS4 back upstream to make keeping current with upstream easier, for example. Juniper still pushes patches upstream, and they've got a much larger team working on Junos than Sony does on Orbis.


FOSS religion is only going to fade as we get further from 90s Microsoft.


obvious troll is obvious.

Especially with that username. lol. Thanks for the laugh.


How many people who buy an Apple computer need to install and run a C compiler?

EDIT: BTW, it is trivial to install clang without Xcode on a pristine Mac. Just typing cc in Terminal will pop up an installer that will download and install clang (and possibly other tools)


Not everything needs to be used by the end user directly. I bet gcc is used countless times just to build OSX anyway.


Clang is used to build macOS and 99% of macOS/iOS applications, not gcc, and Apple maintains clang for you.


Ah my bad, I did not know that.


OS X uses clang.


[flagged]


This crosses into personal attack, which is a bannable offense on HN. Would you please read https://news.ycombinator.com/newsguidelines.html and post only civil, substantive comments from now on?


If I ruled MS or were Raymond's big boss, I would ask him to show a big screen-dimming warning before running these programs, like “This application uses unsafe methods of interacting with your computer and works only because we decided to support it, spending two weeks to make it work. Still no safety guarantees, btw. Please ask Vendor, Inc. to read http://doclink on how to do his shit the right way.” [Okay] [Damn, FTS] [Send email for me]

No chances I make it to the top of MS though.


As he nicely explains in some other blog posts, sometimes Vendor, Inc. went out of business in the 80's but the apps are still in use. So you'd just annoy the poor dweeb from Human Resources that has to use the OS and app provided by his employer, the only job he could find in Podunk :)


It is very different, but I wouldn’t argue it's better. It’s led to a lot of their problems with Windows APIs.


That is exactly what I dislike about Microsoft. The right answer is to detect those dodgy applications and disable them, and notify the user that the program is "bad", thus providing a severe disincentive for software authors to ever do such things again.


That mechanism also exists: notably, in Vista and 7 there were a number of older applications for which Windows would display a warning prior to running them, saying that the application has known incompatibilities and may not run correctly.

However, a user who upgrades Windows and finds half their programs no longer work will simply stop upgrading. In many cases such dodgy applications are also never fixed because the vendor doesn't exist anymore, doesn't care, or the flaw is in some library they use and they don't really want to invest that much time to change it.

Besides, detecting such behaviour is pretty much the same as fixing it with a compat shim, since you have to do more or less the same work. So from MS's perspective the benefit of working around the buggy behaviour is much more prominent than trying to discipline developers by making users suffer.


Why should users pay for developers' poor choices? This is a great way to piss off people who would blame Microsoft.

There isn't a great answer to this that satisfies everyone.


Huh? That is kind of the deal, isn't it? You pick your developer, and if that developer makes bad choices, you get a bad product.

If we don't push back against bad practices by developers we never improve the quality of the ecosystem.

In this specific instance, the user has a choice - don't upgrade.


> the user has a choice - don't upgrade

That's how you get everyone stuck on Windows XP. There's no good choice for the user here–either you're on an old OS that's missing features or security updates, or you can't run software that used to work.


And thus we end up, over time, with an OS that is a pile of hacks upon hacks to work around broken apps, severely limiting its ability to provide a clean, sensible, performant set of OS services.

I regularly read The Old New Thing and just shake my head in wonder at how much more MS could have accomplished without that baggage.

I would much rather users were occasionally forced to upgrade their buggy software.


> I would much rather users were occasionally forced to upgrade their buggy software.

Yes, that's the ideal solution, but backwards compatibility overall is a huge can of worms that doesn't really have a good solution. In this case, it might not be possible to upgrade the software (e.g. it's unmaintained, or was contracted out, etc.)


Note that nowadays there aren't many such compat hacks in the actual Windows codebase. Most applications can be coaxed to work by just shimming a few API calls, and those shims can be kept separately and don't impact other software that doesn't need them. The simplest one of those would be the one that simply pretends to be an older Windows version so that broken version checks still work. There are others where HeapAlloc simply allocates a little more to prevent a buffer overflow, etc.
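
To give a concrete feel for what the simplest of those shims does, here's a purely illustrative sketch (this is not Microsoft's actual shim code; the real shim engine works by redirecting the application's API imports to replacement functions roughly in this spirit):

    #include <windows.h>

    /* A "version lie" in spirit: report an older Windows so a broken
       "refuse to run on anything newer than XP" check still passes. */
    BOOL WINAPI Lie_GetVersionExA(LPOSVERSIONINFOA info)
    {
        BOOL ok = GetVersionExA(info);   /* ask the real OS first */
        if (ok) {
            info->dwMajorVersion = 5;    /* pretend to be Windows XP... */
            info->dwMinorVersion = 1;    /* ...for this one application */
        }
        return ok;
    }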

The OS architecture no longer suffers from such things.


That’s a very rosy outlook that doesn’t match with the few Windows devs I’ve spoken to.

It’s better now - because they moved to abstracting a lot of that stuff, as you said - but the number of man-hours spent on it and the knock-on effects on the overall system design are non-trivial.

Plus, if they’d not had that policy they could have been where they are now in the early 2000s

Kind of like OSX was.


I think you fail to understand just how involved, expensive, and business critical some of this dodgy software actually is. Some of this crap software costs millions of dollars and takes years to upgrade.

Ignoring even the security implications, you can't simply not upgrade either as software doesn't exist in a vacuum. All the other software/hardware involved might need newer versions or bug fixes. Microsoft doesn't even support newer Intel chipsets in versions of Windows older than 10 -- that's only even possible because 10 is highly compatible with older versions of their OS.


Apple does this too. Run "strings" on various libraries in Apple's OSes and you will find plenty of instances of bundle identifiers of non-Apple apps.



