NeXTstep Manual, Systems Programming with Objective-C and Driver Kit (1995) (nextop.de)
120 points by pjmlp on June 5, 2016 | 78 comments



Relevant video from 4 years earlier: https://youtube.com/watch?v=j02b8Fuz73A

As someone who has developed for Apple platforms for the better part of a decade now, it's pretty amazing just how little Cocoa/AppKit have changed over the years. Only in the past couple of OS releases are we starting to see some larger shifts. The crazy part is that this hasn't really hurt the APIs at all; sure, some bits are starting to look a little dusty and could use some attention, but even so Cocoa (both mobile and desktop variants) remain the best UI toolkit for consumer-facing application development. Qt, GTK, etc get the job done but aren't nearly as pleasant to use.


I got my first Mac laptop a few months ago and dabbled a bit in OS X development with Swift. Outside of raywenderlich.com I had a lot of trouble finding up-to-date resources/tutorials. I realize part of this is because Swift is moving so quickly, but it seems to me that desktop application development is eons behind web and mobile development in terms of educational material, and that this is probably holding the ecosystem back.

Is it the case that developers prefer to write API documentation over the tutorial/implementation-based documentation that is common on other platforms (mainly the web)? That approach can be debated on its merits, but it definitely raises the barrier to entry for newcomers.


TL;DR - the web is just easier because it's so highly abstracted, and thus attracts a larger audience of new developers - and materials aimed at them. Educational material for desktop developers is plentiful but often at an advanced level only, because...

I think part of it is that the barrier to entry on the web is much lower. I've developed in Objective-C forever, along with any number of other languages and platforms, and the more power available to you, the more you need to understand the underlying architecture.

The web is about as abstracted from the machine as it gets. Add to that the fact that web rendering engines and HTML, CSS, and even JavaScript are very, very forgiving, and it's just easier for someone new to get started.

I was actually excited about the resource scarcity the first couple of iPhones imposed on programmers. Tons of people jumped into iOS programming and for the first time in a while (or for many, the first time ever) were forced to consider what a lack of RAM and an underpowered CPU meant, and I helped coach many people through the finer points of memory management and allocations (even with ARC) and Big-O algorithm considerations.

Finally, native desktop software is just more powerful, and there is so much more you can do with it (and hence much larger standard libraries, SDKs, etc.). File IO isn't something you consider much in JavaScript on the web. Neither are parallelism, race conditions, IPC, networking, OS resource allocation, long-running processes, native drawing and rendering, and so on.

Desktop programmers need to consider all of this a lot, and so generally tend to have some experience before getting into it (or work with a mentor/co-worker with a high degree of knowledge at first, as happened with me even post-CS degree). So materials written for desktop development tend to assume a certain pre-existing level of software development experience, and aren't aimed at new developers.

Edit: as someone else said, Big Nerd Ranch stuff is pretty good.


Yeah, at least in the time I've been doing Cocoa development it's kinda always been that way. OS X Cocoa tutorials have never been much of a thing, meaning that if one wants to learn it they're going to have to dive into the documentation and scrape bits and pieces from random blogs. It's just something one gets used to… The surge of Cocoa Touch tutorials in the past few years felt almost strange from my perspective.

On the bright side, much of what one learns in iOS tutorials transfers to OS X, even if it doesn't seem that way. The biggest difference IMO is the swapped priorities of windows and view controllers. On OS X, view controllers did very little until quite recently, and NSWindow is something of a cornerstone, since most applications use more than a single window. On iOS, it's all about view controllers, and most apps use only a single UIWindow.
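
To make the contrast concrete, here is a minimal sketch of the window-first shape of an AppKit program (hand-rolled delegate, no nib; AppDelegate and the window geometry are invented for illustration, and the style-mask constants use their pre-10.12 names):

    // Sketch: on OS X the NSWindow is the anchor object; no view
    // controller is required at all.
    #import <Cocoa/Cocoa.h>

    @interface AppDelegate : NSObject <NSApplicationDelegate>
    @property (strong) NSWindow *window;
    @end

    @implementation AppDelegate
    - (void)applicationDidFinishLaunching:(NSNotification *)note {
        self.window = [[NSWindow alloc]
            initWithContentRect:NSMakeRect(200, 200, 480, 320)
                      styleMask:(NSTitledWindowMask | NSClosableWindowMask |
                                 NSResizableWindowMask)
                        backing:NSBackingStoreBuffered
                          defer:NO];
        self.window.title = @"Windows first";
        [self.window makeKeyAndOrderFront:nil];
    }
    @end

    int main(int argc, const char *argv[]) {
        @autoreleasepool {
            NSApplication *app = [NSApplication sharedApplication];
            [app setActivationPolicy:NSApplicationActivationPolicyRegular];
            // NSApplication holds its delegate weakly, so keep a
            // strong local reference alive for the life of the run loop.
            AppDelegate *delegate = [AppDelegate new];
            app.delegate = delegate;
            [app run];
        }
        return 0;
    }

On iOS the shape inverts: a single UIWindow whose rootViewController drives the whole app.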

This makes me curious about what kind of demand there is for high-quality desktop Cocoa tutorials and instructional material…


As a beginning iOS developer, I found the Big Nerd Ranch iOS book very helpful. It leaned more towards a tutorial style. I see that they also have a book called "Cocoa Programming for OS X". If I were getting into programming for OS X, that's the first resource I would check out.


> ...it seems to me that desktop application development is eons behind web and mobile development in terms of educational material, and that this is probably holding the ecosystem back.

Yes! I have the same experience. I am not an OS X developer, but a few times I wanted to make a simple app, and on OS X it was not as easy as grabbing something like Python (+Qt, Tk, wx, etc.) or Electron.js.

Personally I think that OS X development needs a lot of introduction just to get started, and some APIs are tricky and "enterprisey" (though I believe that makes them easier to maintain in the long run). With iOS it seems to be much easier, and there are way more resources.


> Cocoa (both mobile and desktop variants) remain the best UI toolkit for consumer-facing application development.

I can only assume you never heard of WPF.


They might have if Microsoft wanted anyone to hear of it.

It was/is a great tech, but has MS used it in any of their own stuff besides Visual Studio and the discontinued Expression suite?

Cocoa/AppKit is used throughout macOS and all of Apple's own apps.


I must add I really loved WPF and really wanted to see an OS with an entirely WPF-based GUI (which, I think, was one of Blackcomb's original goals back when WPF was still "Avalon").

The last time I was in the Microsoft galaxy, WPF had been dogfooded in barely two of MS's own apps, Silverlight had been deprecated, and "Metro"/WinRT, or whatever the new platform for tablet-like apps in Windows 8 and 10 was called, had all but killed WPF. Maybe someone who used to actively develop WPF apps can shed better light on its fate – I'd almost love to be wrong on this.

On the other hand, apart from having a vector-based UI (though I suppose they don't need one, having solved high-resolution displays better than Windows' scaling does), Apple's development technologies were already everything Microsoft wanted theirs to be, and Microsoft apparently admitted as much in their own internal emails. [1]

Indeed, to my knowledge pretty much the entire userland of macOS/OS X is in Cocoa, from the Dock to the Finder to the Automator, and even their Big Apps like Final Cut and Logic have been made with Cocoa.

[1] https://news.ycombinator.com/item?id=11828498


I worked at a job that had NeXT machines on every desk. Using this OS was the best computing experience I have ever had. For 8 months I used Mail.app, the VarioData-based customer database, etc. Everything worked great and the "flow" was unparalleled.

Concerning CPUs etc., a 33 MHz 68040 and 32 MB of RAM worked pretty well. Grayscale was adequate, though I did appreciate having the upgrade to color. More RAM was always better.


A place I worked at did a trial of NeXTSTEP back in the 90's, when it was ported to other architectures. I got to play with it on a high-end HP PA-RISC workstation... it was amazing.


Exactly how advanced was NeXTstep in its time, compared to, say, Windows 95?

The object oriented API alone seems to me like a huge step ahead. Imagine if Steve Jobs never left Apple and got to ship these things under the better-known Mac brand, possibly stealing a lot of Microsoft's thunder.

EDIT: It's also interesting, and maybe ironic, that the legendary Doom, which went on to make PC/DOS a relevant force in gaming, was developed on NeXTSTEP, with some parts coded in Objective-C. [1]

[1] https://en.wikipedia.org/wiki/Development_of_Doom#Programmin...


>> Exactly how advanced was NeXTstep in its time, compared to, say, Windows 95?

Keep in mind that the NeXT computers came out in 1988 and 1990, five to seven years ahead of Win95. They were powered by Motorola 68030/68040 CPUs, which were competitors to the Intel 80386/80486, and they included 8-32 MB of RAM. When Win95 came out, a lot of machines still had only 4 MB. They also had a 'megapixel' display with 1120 × 832 resolution, while most PCs were shipping with 640x480 VGA graphics at the time.

Price-wise, a NeXT machine had a base price of $4,995, while you could get a Win95-capable PC for $995, though that comparison, of course, is from years later.


A NeXT in 1988 was like getting a PC from 1995. The developer tools, OS and network stack were like getting something from 2005. Aside from the size of the box and monitor, it was a preview of the future. It took the PC industry seven years to come close... and still not quite match the NeXT. Display PostScript and a general-purpose DSP (digital signal processor) made for some amazing capabilities (e.g. a software-driven modem, oscilloscope apps, etc.).


It was a great experience. I only got to see a Cube for the first time in 1999, when I had to port a particle engine from Objective-C/RenderMan to Windows with Visual C++ and OpenGL.

But Windows 95 was way ahead of the UNIXes of the day in terms of IDE and GUI development experience. Most had nothing more than Motif + C to offer.

To this day, Mac OS X is the only UNIX with SDK tools that can rival the likes of VB and Delphi, the kind of native development Windows already offered back in the 3.x days.


> To this day, Mac OS X is the only UNIX with SDK tools that can rival the likes of VB and Delphi, the kind of native development Windows already offered back in the 3.x days.

If the GNOME foundation would focus on GTK+Vala+Glade, it could be a fantastic development platform. I don't know much about the state of Vala today, though; too bad, because it's blazing fast and can use any GObject library. Unfortunately the docs are just bad compared to Apple's or MSDN; they need to find a better way to organise the information on their site or move to a new CMS...


I honestly still don't understand why MS didn't push the 'interface builder' they had with VB6 after they went .NET. In my view .NET was such a step backward: a huge library to load even though the OS comes from the same vendor, and way more boilerplate than VB. Why not give VB6 decent networking and database capabilities instead?


The problem with free UNIX clones is that everyone does their own thing, so you don't have a nice SDK that the community at large can enjoy, and efforts get scattered all over the place.

Picking GNOME as an example, not only do you have what you mentioned, but you also have Anjuta and Builder.

Although Ubuntu now kind of has an SDK based on top of Qt/C++.

However, none of them come close to the developer experience that Delphi and C++ Builder used to have, or that the .NET and UWP stacks have nowadays.


>To this day, Mac OS X is the only UNIX with SDK tools that can rival the likes of VB and Delphi, the kind of native development Windows already offered back in the 3.x days.

cough Lazarus cough.


Does Lazarus enjoy some kind of component ecosystem like Delphi used to have?


Most definitely - admittedly not to the extent of the Delphi halcyon days, but nevertheless still extant.


I like to think about that possibility sometimes .. in my alternative universe, Steve stayed at Apple, Microsoft became more like Sun (fizzled out in the late 90's), and SGI ended up being the one to make the tiBook in all its lovely glory, ushering in an era where the sgiPhone is what we all have in our pockets ..


Things like InfiniteReality architecture and visual-area networks were pretty wild. A nice pdf for you:

https://diglib.eg.org/bitstream/handle/10.2312/EGGH.EGGH97.0...

I'd have loved to have cheaper and cheaper versions of NUMAlink in my desktop with stuff like their FPGA boards and graphics pipes. NUMAlink cards instead of PCI cards. Microsecond latency with almost 80GB/s speeds. Cellular IRIX-style splitting of system along multiple boards for max reliability and efficiency. Nice, alternate reality. :)


That was a nice PDF, thank you. It hurts to see SGI go the way of the Dodo, but it enlightens me to realize that this is The Way in computers. Not all who excel, persist.

(BTW, I'd have been quite happy with an SGI Indy laptop, y'know .. like the one we saw in the movies, a year or so before Steve did the tiBook..)


Multiuser Unix with full preemptive multitasking, memory protection, and a security model, vs. a single-user DOS shell with only partial preemptive multitasking, no memory protection and no security model.


Most of that stuff didn't matter for the single user use case of Windows back then.

Also, unless you are speaking about Windows 3.x, your statement about preemptive multitasking and memory protection is wrong.

I have developed software for Windows since version 3.0.


> Most of that stuff didn't matter for the single user use case of Windows back then.

It mattered just as much as it matters now, and it will always continue to matter. People didn't know that it mattered because people didn't know anything better. After all, for the vast majority of people, crappy DOS-based Windows from Microsoft was the only thing they could afford to use (in every sense of the word, not just the financial one).

> Also, unless you are speaking about Windows 3.x, your statement about preemptive multitasking and memory protection is wrong.

No. 16-bit software was cooperatively scheduled, and there was a lot of 16-bit software in the Windows 95 era. Even significant components of the first version of Windows 95 were 16-bit. Hell, lots of drivers were real-mode drivers!

Software used virtual memory, but there was zero, nada, nil memory protection. In fact it was pretty standard for programs to stomp over each other's memory as a "feature".

> I have developed software for Windows since version 3.0.

Who cares.


> Who cares.

Whoa. This breaks the HN guidelines. Please edit personal nastiness out of your comments here.

Your comment would be so much better without that last bit.


No, his mention of his experience working with Windows 3.0 is completely irrelevant in this discussion. It's just a very weak attempt at appeal to authority. And if you look in his message history, it's something he does consistently.


Even assuming all that is true, it doesn't change the requirement that you (i.e. everyone) be civil when posting here.


Ah, the Win16Mutex. My personal favorite, however (especially when discussing the OS/2 2.0 fiasco), is how the continued dependence on DOS allowed the Caldera lawsuit to continue.


I'd love to hear more about that!


Not everyone was running old 16-bit software.

As for who cares: the customers that value my detailed knowledge of how the Windows stack works, since I don't mix technology with OS religion.


> Who cares.

Looks like someone hit a nerve there. Downvoted.


It depends on what "advanced" means. Windows 95's ability to provide backwards compatibility was almost certainly a significantly greater technical challenge than any problem that NeXTstep solved all on its own.

NeXTstep is interesting, but it was largely an integration of existing systems in a greenfield landscape. For example, CUPS remains a useful design today; back in the days of NeXTstep, it required a high-end printer. Windows 95 would [for practical purposes] print to any device, or from any application, that DOS or Windows 3.x would.

Which is to say that "advanced" depends on whether being able to use an existing impact printer with a new operating system is advanced or if not being able to is.


> CUPS

Hmm... CUPS came way after NeXTStep, way after NeXT's acquisition of Apple, in fact (Wikipedia says 1999). NeXT itself just used lpr/lpd; that was it.

Of course, being Unix, you could hook into the lpd system and add your own filters. NeXT itself did this, using the on-board DPS to RIP for the NeXT Laser Printer (a Canon laser engine) and the NeXT Color Printer (a branded version of the Canon 360 dpi bubblejet). There was quite a bit of trickiness involved, contacting the WindowServer from a background process and using high-resolution/high-bit-depth rasterising, but it was doable.

Third parties could also do this; for example, I made drivers for everything from low-end Epson, HP and Canon ink jet printers (including a more advanced driver for the NCP) all the way to high-end proof printers, film recorders and color laser copiers ($50-90K items at the time). Others hooked up phototypesetters. So you definitely could print to anything as well, and everything would then come out in beautiful PostScript Level 2, with amazing WYSIWYG fidelity.

While the straightforward approach did require a printer with a raster interface, most low-end printers had one, and when there was a print head, a line of buffering was sufficient. It was more the mid-level laser printers that were a problem, because lasers required a full-page buffer and some just didn't have that much memory.

One time there was a Mission Critical Custom App (MCCA) deployment at a large pharmaceutical company that was in peril because they had several hundred (or thousand? I don't remember) IBM laser printers that did not have enough memory for a raster bitmap... and no one had thought of that beforehand. Oops. No worries: I created a driver that extracted the text (they were only interested in the text) and precise positioning information, and then sent commands in the printer's command language (not PCL or anything else that was well known) to position and output that text.
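
In spirit such a filter is tiny. A toy sketch, with hypothetical MOVETO/TEXT commands standing in for the real (and obscure) printer language, of the kind of filter lpd would invoke, reading print data on stdin and emitting device commands on stdout:

    // Toy lpd-style filter: reads text on stdin, writes positioning
    // commands on stdout. "MOVETO"/"TEXT" are invented stand-ins for
    // whatever the target printer's command language actually used.
    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            NSData *input = [[NSFileHandle fileHandleWithStandardInput]
                                readDataToEndOfFile];
            NSString *text = [[NSString alloc]
                initWithData:input encoding:NSUTF8StringEncoding];
            int line = 0;
            for (NSString *row in [text componentsSeparatedByString:@"\n"]) {
                printf("MOVETO 0 %d\n", 72 + 12 * line++); // 12pt leading
                printf("TEXT %s\n", row.UTF8String);
            }
        }
        return 0;
    }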

Of course, Adobe didn't like all of that printing, and so they continued to tighten the license agreement and hike the fees for print-use of DPS.


OS X didn't start using CUPS until 10.2. I don't remember if the 10.0/10.1 printing architecture was a carryover of the NeXT stuff or something new, but I don't remember it being well liked.


I'm a bit dismayed to hear backwards compatibility - solving problems of an organization's own making - compared favourably with new functionality.


Consumers like being able to run their old software, and in the nineties (and more recently) that meant running software dating back to the 16-bit era of computing. "Solving problems of an organizations own making" is a very goofy way of characterizing that.


I was commenting specifically on Windows 95, and the idea that its ability to run its predecessor's software was an achievement rather than something one should just naturally expect. Especially considering that many of the successive difficulties in this respect were due to short-sightedness in the design of the original APIs, and then to successive generations of API changes, each set meant to replace the last.


I've used Quickbooks for my business accounting since 1998. Double-entry accounting hasn't changed, but I upgraded to Quickbooks 2004 when TurboTax stopped importing older versions with the 2003 federal tax season [but that's not where I am going with the API thing]. I still run Quickbooks 2004 [see the aforementioned nature of double-entry accounting][1].

Anyway, where I was going with this is that, as a user, I don't give a shit about APIs and the quality of their design. I care about working software. The agile manifesto didn't invent the idea; it just pointed it out, in order to change a world where "there are two types of people: programmers and their victims."

But as Faulkner might say, the past isn't even past.

[1]: It gets its own Win2k VM since it has some 16-bit code for backward compatibility. The VM doesn't connect to the internet, because the world has changed.


I see your point about API design, but it's worth pointing out that everybody else was fighting a new revolution every five minutes and leaving behind heaps of compatibility dead during that era. I imagine NeXT was probably the best of the lot when it came to making a lasting API and not breaking backwards compatibility, but perhaps someone who used it (and OS X) more could comment on that.


From the standpoint of business logic, an existing working system tends not to register as a problem. Conversely, spending money to replace one system with a more expensive system that provides equivalent value often does.


Well, the Internet was invented on a NeXT machine.


The web, not the internet.


Exactly; the old internet was already available at Xerox PARC.


No, Xerox PARC introduced Ethernet, not the Internet.


Man, so glad we have historians.

Slightly off topic, but this made me look at the efforts to preserve video game history in a whole new light.

Somebody has to become the historians of the early era of video games before it all bit rots away.


I was thinking about the concept of workstations connected over network protocols, with email, remote printers and file servers, when I wrote it.


Advanced in what respect? Since I was an undergrad when Windows 95 came out, the thing that leaps to mind is that you could put together a Windows 95 computer for what you'd pay for some little doodad or minor software program for the NeXT platform. And if you spent a little more you'd have room for a Linux partition...


> Advanced in what respect?

As a platform to use and develop for, basically.


Well, it's funny. NeXT was obviously a pretty great platform to develop for (others can speak to that more than I can) but the tools on the PC side at the time were pretty good. Borland and Microsoft were still in a legitimate competition to deliver good tools at a low cost. Delphi wasn't terrible.

The big problem was that Windows 95 was such a crippled platform. In a lot of cases you'd want to do your real development and debugging on an NT machine and simply target Windows 95. (It was not in any way, as someone else stated, a "DOS shell", but it was still pretty bad.)

From a user point of view, well, a lot of people thought the NeXT environment was pretty great. Windows 95 was easy to use but it was just MSFT's first step out of the primordial ooze. I'm not sure how much difference it truly made if you spent your whole day in Photoshop or PageMaker or Mathematica or whatever, except you'd expect the Windows 95 machine to crash a lot more often.


Lack of user accounts meant we could kill user32.exe, which had to be followed up by a reboot.


It'd be interesting to go back and run some of the software from the Windows 95 era (and the OS itself) on a higher-quality machine. I and most of my friends were using low-quality 486s and Pentium clones, and it's certainly possible a lot of the problems had to do with hardware quality.

I honestly remember the later Windows 98 experience as being almost crash-free, and since the architecture was basically the same, a lot of that probably had to do with higher-quality hardware (and better drivers, of course). That was the golden age of the Intel 440LX and 440BX and of ECC RAM support on desktop motherboards, when just about every blue screen could be attributed to those newfangled AGP graphics accelerators...


If your PC is faster than a certain speed, Win95 won't boot.

This bit us at VMware, since it was common to want to run old Win95-only software in a VM on fast machines. We ended up having the VMM (Virtual Machine Monitor) patch up the bug when it did binary translation.


> If your PC is faster than a certain speed, Win95 won't boot.

That's interesting. What's the threshold there?



The difference in price between the hardware that ran NeXTstep and the hardware that ran Windows 95 was still so large that I don't think having the Mac brand would have made a huge difference.

And given what Jobs said about his experience leaving Apple, I don't think a lot of NeXT would have even been made at the time had he stayed. The focus probably would have stayed on Mac and Mac OS. And unless Jobs was somehow able to dramatically change the outcome of that platform, that would have actually resulted in Microsoft being ahead, as Apple wouldn't have OS X, but Microsoft would have Windows NT, which could have ended up being the most sophisticated mainstream desktop OS.

We might have been living in a world where something resembling Vista and Windows Mobile 6 dominated the market...


By the time Windows 95 was released, Nextstep ran on PC hardware.


While this is technically true, it's practically false. I got NeXTstep 3.3 running on a machine in the office by building it from the parts of no less than 4 other machines to get the system to meet requirements. And then it was only in (2-bit) grayscale mode.

Other than hardware compatibility, the OS and Objective-C + InterfaceBuilder development environment was far ahead of the mainstream.


There were a few vendors who sold "certified" NEXTSTEP compatible PCs. They were more expensive than a generic clone PC, but far cheaper than the NeXT hardware. This eliminated the guessing game on device support and typically the component quality was better than generic.


Had NeXT's software pivot been more successful, there would have been more incentive for hardware vendors to pick up DriverKit and write a driver for it.

They were still aiming for high quality, when there should have been an emphasis on quantity.


The implication seemed to be that Nextstep required some inherently expensive hardware to run which it didn't. If in some parallel universe Microsoft had come up with Nextstep, they could have launched it as 'Windows 95'.


The original version of NS for PC (3.1) required 16 MB of RAM and recommended 24 (32 in 3.2). The software itself cost $800. In 1993-94 there was a massive RAM shortage, caused by a resin factory fire in Japan, and RAM spiked to over $50/meg. The extra RAM to run NS alone (call it 20 megs beyond a typical PC, at $50/meg) would have been worth on the order of $1000, even in 1995, by which time prices had gone down.

All told, a NS system would cost around 3x the cost of a Windows system.


Yes, this is all true if I'd said 'NS 3.x as it was then, developed by Next, could have been a drop-in replacement for Win 95'. Which I didn't.


No way. Even if Microsoft had been prepared to throw out Windows 3.1 and DOS compatibility (or engineer it in at substantial expense), NeXTStep would never have run acceptably on the low-end target machines.

It was a decade after the '95 release before optimization and hardware advancement had a NeXTStep-derived OS running 'snappily' on consumer desktops.


Nextstep ran on 68030s. Had it been a Microsoft product (since the late 80s) there's nothing inherent in it that couldn't have made it a consumer OS by 1995.


I think one of the problems was RAM prices back then, which also affected Windows NT sales. 4 Mbit DRAM chips, for example, cost more than $10 apiece.


That's true, but the original NeXT machine shipped with 8 megs, which by 1995 was the recommended amount for Windows 95. There are lots of reasons Nextstep was a commercial failure; I just don't think 'hardware requirements' is fundamentally very high on the list. 1995 was just at the beginning of a rapid decline in DRAM prices, to boot.


MS spent a lot of effort trying to fit Win95 into 4 megs for a reason. I don't think NeXTStep with only 8 megs was a popular configuration; many were upgraded to 16 megs.


Win95 didn't fit in 4 megs all that well. But that's neither here nor there, since we're in a parallel universe in which MS started working on Nextstep in 1985 and released a consumer OS based on it in 1995. This seems totally plausible. Next Inc. was never really making a consumer OS. In 2007, Apple shipped a Nextstep derivative that ran in 128 megs, while its desktop (and consumer) counterpart wasn't really happy with anything less than a couple of gigs.


I miss named parameters and categories the most from Objective-C.
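
For anyone who hasn't written Objective-C, a minimal sketch of both features; the reversedString category method is invented for illustration, not a Foundation API:

    #import <Foundation/Foundation.h>

    // A category grafts methods onto an existing class, no subclassing
    // needed. "Reversing" is a hypothetical category for illustration.
    @interface NSString (Reversing)
    - (NSString *)reversedString;
    @end

    @implementation NSString (Reversing)
    - (NSString *)reversedString {
        NSMutableString *out =
            [NSMutableString stringWithCapacity:self.length];
        for (NSInteger i = (NSInteger)self.length - 1; i >= 0; i--) {
            [out appendFormat:@"%C", [self characterAtIndex:(NSUInteger)i]];
        }
        return out;
    }
    @end

    int main(void) {
        @autoreleasepool {
            // "Named parameters": the selector interleaves its arguments,
            // so the call site documents itself.
            NSString *s =
                [@"NeXTSTEP" stringByReplacingOccurrencesOfString:@"NeXT"
                                                       withString:@"Open"];
            NSLog(@"%@ / %@", s, [@"petsTXeN" reversedString]);
        }
        return 0;
    }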

What features do those of you who no longer work with Objective-C miss the most?


I learned to program from these manuals. It's wonderful to know they're available online. Wonderfully clearly written.


Ah, the good old days. I built a mini DB with the Indexing Kit. What I discovered was that the kit itself wasn't 100% functional and some of the API/function calls didn't work. But I had an amazing time programming on a NeXT computer. I was young (early 20s) and did lots of trial and error with the IX kit.


Interestingly, I was just in my storage garage moving stuff, and I have the full NeXTSTEP 3.3 manual set, with discs for 3.3 and 4.x, in a plastic box. It weighs a ton.

It is still my favorite platform, both from a programming and a user point of view. I miss the menu system and still curse the Mac toolbar (particularly on my 34" 21:9 monitor). It was a joy to program, and the amazing frameworks were a revelation compared to Win32 and its follow-ons. It is amazing what something designed for Smalltalk-like message passing and a runtime can get you. Windows 95 and NT were pains in the butt to program in C++, while VB at least was closer to NeXTSTEP in productivity. I generally prototyped my interfaces in Interface Builder and then redid them in VB if needed. It's sad how much was lost on the way to OS X.


The sheer purity of that UI still gives me the chills.


Indeed, the good old days. I had a slab for a while in '95 (actually, I still have it); it was pretty slow for development, though. Running on HP PA-RISC was much better. Many fond memories.


My NeXT Cube still works great, though I need to try to patch it for Y2K sometime. :-)



