How Apple Programmer Sal Soghoian Got Apps Talking to Each Other (wired.com)
155 points by shawndumas on June 3, 2018 | 77 comments



A couple of weeks ago I wrote an Automator folder action that faxes any PDF dropped into it (https://github.com/EricWVGG/birchfax, if anyone is interested). It uses a bit of Automator, a bit of AppleScript, a bit of Node. A fun little exercise, and it has been enormously useful to a friend whose life now depends on faxing forms to hospitals and insurance companies every week.
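
In case it helps anyone picture the AppleScript half: a folder action is just a script with a special handler attached to a folder. A minimal sketch (the Node script path and filename here are my own placeholders, not birchfax's actual layout):

    -- Runs whenever items land in the attached folder.
    on adding folder items to thisFolder after receiving theseItems
        repeat with anItem in theseItems
            -- only hand PDFs off to the (hypothetical) Node fax script
            if name extension of (info for anItem) is "pdf" then
                do shell script "/usr/local/bin/node ~/birchfax/fax.js " & ¬
                    quoted form of POSIX path of anItem
            end if
        end repeat
    end adding folder items to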

But the whole experience also left me sad about the Macintosh platform. AppleScript is a hot mess of a language; scripting can be done in JavaScript now, but that's largely undocumented. Because the technology is so inaccessible, support for scripting in apps is waning.

Automator, goodness, what a wild tool… but again, the documentation is horrible. Apple made two phenomenal technologies and then completely fumbled on the last mile.

The future of initiatives like Siri depends on robust inter-app scripting. My only wish for the future of macOS and iOS is a renewed push for this.

(okay, that and lock screen "complications")


AppleScript is an incredible tool that has many warts, perhaps the largest being how idiosyncratic its implementations are from application to application (each app defines its own AppleScript dictionary which is, essentially, a framework library).
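
To make the point concrete: the "same" kind of question reads completely differently from app to app, because each dictionary invents its own nouns and verbs. A sketch (app terminology as of circa-2018 macOS):

    tell application "Finder" to get name of every item of desktop
    tell application "Safari" to get URL of current tab of front window
    tell application "iTunes" to get name of current track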

There are also many idiosyncrasies in how scripts are run by the AppleScript component manager: state may or may not be maintained between executions depending on whether the script has been compiled on the local machine, whether it is run inside AppleScript Editor, which version of AppleScript Editor it is run in, which version of AppleScript is on the local machine, etc. etc. [0]
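
The classic demonstration is a top-level property, which persists between runs only under some of those conditions. A minimal sketch:

    -- Saved as a compiled script and re-run from the same file, runCount
    -- keeps climbing; run via osascript (which doesn't write the script
    -- back to disk) or freshly recompiled, it resets to 0.
    property runCount : 0

    set runCount to runCount + 1
    display dialog "Run #" & runCount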

Apple's official documentation is helpful, but the real documentation for AppleScript developers (I feel AS is feature complete enough that its practitioners can be regarded as developers) is Matt Neuburg's AppleScript: The Definitive Guide (from which the earlier citation comes). [1]

If you've puzzled over how to get something working properly in AppleScript, and have been amazed at how indispensable AppleScript can be for generating a custom workflow using macOS applications, the command line, and your favorite programming language behind the scenes (Perl, JavaScript, Objective-C, etc.), you really cannot do yourself a bigger favor than getting hold of Neuburg's book.
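
As a tiny taste of that glue role, here's a hedged sketch of pulling something out of a GUI app and handing it to the shell (and from there, to whatever language you like):

    -- Count the words in the file currently selected in the Finder.
    tell application "Finder" to set theFile to item 1 of (get selection) as alias
    set wordCount to do shell script "wc -w < " & quoted form of POSIX path of theFile
    display dialog "Words: " & wordCount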

[0] https://books.google.com/books?id=x5AE8cxocdAC&pg=PA76&lpg=P...

[1] https://www.amazon.com/AppleScript-Definitive-Guide-Matt-Neu...

EDIT: remove extraneous word. Add period. Change “Applescript” to “for Applescript developers”


The "JavaScript for Automation" stuff is extremely incomplete, and to make matters worse, whatever dingus in XCode generates the documentation only knows about AppleScript, so sometimes you get object methods called properties and in any event the return type is wrong, etc.

Basically, they made it work well enough for a few things that ship on a default OS X install and stopped.


It also has the problem that there's no straightforward way to use external packages at all, which is a killer for efficient JavaScript use.


This. This. And Absolutely This.

npm integration was an absolute must-have to ensure popular adoption; even I could see this straightaway. Alas, as the last Apple bastion of Not Invented Here, the Mac Automation team insisted on rolling everything themselves.

All problems I told Sal about before JXA ever shipped. I told him about lots and lots of problems, which he pretty much ignored. I even wrote and sent him a quick-n-dirty JavaScript OSA implementation (https://sourceforge.net/projects/appscript/files/), and by that point the man didn't even have the common manners to say "thanks but no thanks". A charming chap who talks a great game, but a thin-skinned martinet.

There was no technical reason JXA could not have been 99.9% as good as AppleScript at application automation, and a million times better at everything else. 100% PEBKAC.


> The future of initiatives like Siri needs robust inter-app scripting.

That's a really good point. Siri would have benefited enormously from a robust, powerful automation ecosystem. Even if they didn't use it directly, it could have been the base for all Siri actions, and the distance from automation to Siri would be much smaller than going all the way to Siri from where we are now. Apple missed a big opportunity.


+1 That Apple still can’t put 1 and 1 together shows just how utterly ignorant they are of the mind-bogglingly powerful product technology they’ve been sitting on for the last 25 years.

The great irony is that unlike painfully tedious low-level object-oriented API/IPC systems like DOM, CORBA, and SOAP, the Apple Event Object Model (the IPC architecture used in “AppleScriptable” applications) is driven via RPC using simple, high-level relational QUERIES. VERY powerful, amazingly (when it works right) elegant.
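
A concrete taste of the difference: instead of fetching object references and looping over them, you describe the whole set in one relational expression, and the app resolves it in a single Apple event. A sketch:

    tell application "Finder"
        -- one event, no loops, no object handles
        get name of every file of desktop whose name extension is "pdf" and size > 100000
    end tell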

Apple event IPC has its issues—inconsistent implementations, notoriously inadequate interface descriptions, and a security model rocking it like it’s 1995—but it’s still remarkably close to being Siri-drivable, and has the massive benefit that it’s already implemented in hundreds of apps and is free for the taking.

(Or would be, if Apple hadn’t left Sal to run it into the ground.)

..

Not coincidentally, I put together a proof of concept for a hybrid Logo/AppleScript replacement a couple years back, designed from the start not just to enable traditional text entry, but also to be skinnable in a Workflow-like GUI or even driven by voice. Other commitments got in the way, but POC files are here:

https://bitbucket.org/hhas/entoli/src/master/README.txt

Key USP: true whitespace support in command names, making them very easy for non-programmers to read and write, particularly with modern autosuggest+correct+complete assistance (command interfaces are fully introspectable too).

Add Apple event capabilities (https://bitbucket.org/hhas/swiftae/src/master/) and wrap some friendly end-user tools around it, and it'd be very powerful indeed.

Classic turtle example:

  Turn right 135°, pen down, move forward 2.5mm, turn left 90°, move forward 20cm, pen up.

Named “entoli” (phonetic Greek for “instruct” or “command”) in HT to Seymour’s Logo, but also nicknamed “SiriScript” for fairly obvious reasons. :)


There might be a great opportunity there to start a GitHub repo with open-source documentation for both.

It'd be an awesome way to contribute to the community, and I'm sure it would look great on a resume.


Nope. Already been tried (https://github.com/JXA-Cookbook/JXA-Cookbook) and failed. I can count on the fingers of one hand all the people on the planet who could've delivered the user documentation essential to driving JXA's early adoption and success, and there's a reason none would touch it with a 20ft pole.

The JXA that shipped in 10.10 was a defective half-baked POS that was immediately abandoned to rot. Even if contributors knew enough about JXA and application scripting to produce quality documentation, who’s going to read all these new JXA docs when it’s already failed to win any users after 4 years on the market?

I lead-authored Apress's Learn AppleScript 3rd ed, so I know just what a hell it is to get right. And I desperately wanted to do THE book on Automating your Mac with JavaScript—cos I knew JXA's success or failure would decide the future of Mac Automation. (Also, MILLIONS of JavaScript users == 100,000s of book sales!:) But I simply couldn't write a usable book about unusable shit.

So don’t waste your time.

Mac Automation as we know it is already dead; it just hasn’t stopped twitching yet.


Since Apple acquired Workflow I still have some small hope they'll create an unholy fusion of AppleScript, Automator and Workflow for both iOS and macOS..


Well, we just came a step closer :) #wwdc2018


I'm quite surprised they fired Soghoian in 2016 rather than assigning him elsewhere within Apple when they decided Automator wasn't worth it anymore (though they still have a page for it here: https://support.apple.com/guide/automator/welcome/mac).

He was obviously a brilliant guy, so I find it strange he couldn't have had a good fit anywhere in the company.


I wish they had moved Sal to the Siri team. Imagine the possibilities.


Agreed. That's the next step in making automation more accessible to most people. A personal assistant that automatically suggests workflows you can download would be a game changer.


Millions more sales for Siri’s competitors.


I would love easy app automation, but AppleScript was and is the worst coding experience I've ever encountered. It basically boils down to dictionary and syntax guessing with no clear semantics.


Both the syntax and the built-in Script Editor application are unhelpful and confusing. I find even C++ more approachable.

In recent years they've added JavaScript bindings, which solve the syntax and editor problems. But most documentation is still written for AppleScript coders, so you have to mentally translate what you want into AppleScript first and then into JavaScript; it's still basically useless without extreme mental effort. And good luck debugging.


Exactly. It was like the uncanny valley of programming. It was so close to English that things you would think should work didn't. I never had that issue with HyperTalk back in the day.

But AppleScript was always just an implementation detail. The underlying Open Scripting Architecture has always worked with multiple languages.
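
You can see that language-neutrality directly from AppleScript itself: Standard Additions' run script command hands a snippet to any installed OSA component by name. A sketch (assuming 10.10+, where the JavaScript component ships):

    -- AppleScript asking the OSA to evaluate a JXA snippet
    run script "Application('Finder').name()" in "JavaScript"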


The automation side of Apple has been moribund for many a year. It's almost like they can't bring themselves to make the hard decisions they need to make: kill off AppleScript, burn out the dead wood, and standardize on one system-wide scripting language that is well documented and supported by a first-class toolchain.

Although in this case I think the rot started much earlier, with the abandonment of HyperCard under Jobs 2.0. I remember in the 90s many small businesses built their own HyperCard stacks to run the business, and it made a lot of people Mac owners even in the dark days of 92-97. Imagine having the same system today available across the platform, especially on the iPad. You can see the green shoots of the idea there in the form of Playgrounds, Workflow, etc., but they need to think much bigger.


Yeah, AppleScript was good when it was invented in 1993 with System 7 Pro. One early problem with it was the CFM port of ObjectSupportLib (68K used a static library); it took until the release of version 1.2 in 1997 before most of the major problems were solved.


Title is wrong.

William Cook and Warren Harris were the Apple programmers who created AppleScript:

www.cs.utexas.edu/~wcook/Drafts/2006/ashopl.pdf

Sal Soghoian was the Product Manager hired years after they left who ran AppleScript into the ground.


Sal is a world-class jazz musician, which is his real love.

Great man.


I thought he looked like a musician rather than a programmer. (Even if many programmers are also musicians)


Speaking of Sal Soghoian, I've always loved this little party-trick AppleScript that sings Happy Birthday. I can't remember the old Mac site I got it from, but I remember him being credited for it.

https://hastebin.com/becorikaja.sql
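
The link may rot, but the heart of the trick is tiny: macOS ships novelty voices that pitch their output to a fixed tune instead of speaking flatly. Something along these lines, though which singing voice and lyrics the original used is a guess on my part:

    -- "Good News" is one of the singing novelty voices; the exact voice
    -- and wording of the original script are my guess, not a quote.
    say "Happy happy birthday to you!" using "Good News"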


“Saul whom you all know” https://youtu.be/8V67pNDRrDM

Happens at 9:55



I’m wondering if Swift Playgrounds (both the app and the Xcode pane) and the purchase of Workflow could be leading to an updated automation strategy. If they want students to learn Swift on an iPad, they must think it isn’t too complex for other stuff.

AppleScript has always been powerful at the great expense of being a pain to implement. And it has a sort of Visual Basic-like dependency among longtime Apple customers (probably why it never dies entirely).

When I abandoned AppleScript and switched to Python bindings, one of my reasons was that I wanted a more standard language with access to useful libraries. Swift could be that now...


Nah, Apple’s selling Swift to kids cos Tim Cook is a dingus who learned nothing of marketing while under Steve Jobs. Ol’ Steve would decide what consumer market he wanted to create, then beat all his engineers mercilessly to make the products that would fit that market. Tim just lifts whatever sack of crap they’ve got and tries to find suckers to buy it.

Swift is rubberized C++: an Apple engineer's vanity project made good, a second-rate language for Cocoa development, and a pedagogical joke as far as kids are concerned. If Tim thinks Swift Playgrounds is going to save Apple's ed markets from being eaten by Chromebooks, he's even less fit as a CEO than Stephen Elop.

..

As to AppleScript, the pain for App developers was always in how to design and implement a robust, powerful, and user-friendly query-driven UI on top of their app, without Apple even telling them it was query-driven (not OO); never mind giving them a robust, powerful, and developer-friendly framework in which to build it.

Still a non-trivial technical challenge, mind (just ask SQL/RDBMS engineers), but Apple could've still made it a lot less painful (i.e. better docs and support) and a lot more worthwhile (i.e. tons more users). Make it worth programmers' time financially and/or personally, and they'll crawl over broken glass to do it. But Apple mismanaged AppleScript almost from the start: pissing off its key designers so badly they walked out, taking all their knowledge with them, and it went downhill from there.

And as for all the popular languages that Sal _could_ have won over to Mac Automation, along with their millions of users for whom it should’ve been 100% pure Catnip for Geeks:

* Python (appscript, 2005—hey! have we met?)

* Ruby (rb-appscript, 2006)

* Objective-C (objc-appscript, 2007)

* JavaScript (“JavaScript OSA”, 2014; NodeAutomation, 2017)

* Swift (SwiftAutomation, 2015)

I wrote production-quality Apple event bridges for every one of the above, and tested, used, and supported most of them too. (We still use Python3+appscript ourselves—the stuff it can do is phenomenal.) And in the end I buried them all too, after Sal's team, thinking they knew everything, stuck their own broken-ass garbage in Mac OS X and stunk those markets out. Part of which is my bad for being an utterly useless salesman; but mostly it's on Sal for being such an arrogant, ignorant, and/or downright incompetent Product Manager for 20 years; and on Apple for not sacking his ass years sooner, before he ran Mac Automation into the ground for good.

Good times.


I have always hated the Automator experience. I'm never sure about the syntax... it looks like English, but never the English I come up with. Most apps that I use don't publish their dictionaries.


Fired without any notice. Simply told his position didn't exist anymore.

Since corporations are people, this one is a cold-hearted bitch.


That's not what the article says. It says there were no warning signs they were going to fire him. You don't know if security just walked him from his desk to the exit one day or he was sent home with a goodbye lunch and a fat severance check.


> In October of 2016, he was let go from Apple after a nearly twenty-year stint at the company. No warning, no early signs. Apple just said his position didn't exist anymore.

Corporations expect unwavering loyalty from employees to the point of sacrificing their personal lives, only to be cut off from the company, with perhaps a 'fat severance' check.

To me that seems unfair. If the company has the right to abruptly cut off the worker, the worker should have the right to abruptly cut off his work at the end of the workday.

Somehow, in the US, such workers demanding fairness from companies are ostracized.


Perhaps, but you have no idea whether anything of the sort happened with this guy at his particular job. You're just taking this one sentence and using it to recite a generic indictment of US labour practices which isn't really related to anything in the article.


My point is that a 'fat severance cheque' does not right a fundamental wrong: the erosion of worker rights.


I understand you want to talk about fundamental wrongs and the erosion of worker rights. It's just that there's nothing in the article that's really about that.


The US needs a law standardizing a minimum notice/vacation/sick-day policy across the board.


Workers do have the right to abruptly cut off work, don't they?


I’m sure he got his contracted notice. Suppose Sal had got another job offer, or just decided to move on, handed in his notice and left after his contracted notice period. Would you be just as appalled at the way he treated Apple?


...No? Why would I? Corporations exist to be leashed so society can derive benefit from them; people do not. To compare the two is to commit a category error.


So it would be ok to enslave a group of people, as long as you call that group a corporation first? A corporation is a group of people, and people can certainly be rude to a group that they are in just as much as a group can be rude to its members.


> A corporation is a group of people

No, it is not, any more than a person is a group of cells.

A corporation is a completely different creature from a person, even though it's composed of many people (and only partially so, as a corporation is also composed of procedures and processes, and probably other elements).


The procedures and processes don't come from the outside of the group; they're decided on by the members of the group. This is true for all groups, even book clubs and people standing around on the sidewalk outside of a restaurant waiting for their reservation.


That's at-will employment. It would never happen in France.


What alarms me is the casual way we now talk about people's 'positions not existing anymore'.

It no longer alarms anyone, no longer seems wrong to anyone, that worker rights, which are the bedrock of a fair and just society, are being eroded right before our eyes.

I don't advocate for socialism, but unchecked capitalism will lead to a fragile society, rife with periodic revolutions and collapses.

Lack of empathy for workers and the masses is the true cancer of democracy.


In what sense are workers' rights being eroded?


Despite this sounding like some kind of demagogic propaganda, I kinda was with you until

> Lack of empathy for workers and the masses is the true cancer of democracy.

...what? This is such an incredibly weird statement. Like, dictators are more empathetic toward workers? You kinda propose socialism (while sounding like nobody should touch it with a ten-foot pole), but not in a democratic manner? Might I point towards the EU, where the problem we're talking about is non-existent, and which is also a democracy with socialist elements...?


I think what he meant was that the lack of empathy is the cancer that can kill democracy in the long run.


Not the original poster, but I don't think he is decrying democracy. When he says cancer, it is not democracy he is talking about; he is talking about the lack of worker rights, which should be rectified in the context of democracy. I believe he is agreeing with his parent post (democracy with socialist aspects, similar to France) and in turn agreeing with you (similar to the EU).


This sounds like what Amiga people were doing with ARexx from 1987.


Here’s a little screed I tried posting in Disqust. Not great, but I don’t have time to rewrite it right now.

------------------------------------------

Evening Wired,

Well, that story was a bit of a hot mess. Wrong title, buried lede (Workflow! WWDC!), unfocused and confusing. At least until compared to one of my screeds—enjoy!

has

--

1. Who’s the “One Apple Programmer [who] Got Apps Talking to Each Other” of your title? ’Cos it’s definitely not Sal Soghoian. Sal’s not, and never was, an “Apple Programmer”. He was an amateur AppleScript user and early evangelist, who lucked his way into his management job at Apple several years after Apple broke up the original AppleScript team. (More about Sal later.)

The Apple programmers who led the creation of AppleScript were William Cook and Warren Harris, who in turn were building on the HyperTalk language Bill Atkinson previously created for HyperCard and the Apple event IAC (Inter-Application, or Inter-Process, Communication) architecture then being developed for Apple’s first true multitasking Mac OS, System 7. Conceived around 1989, development started in ’91 and shipped in ’93, initially as a Pro extra, then as a standard feature from 7.1 on. Then Apple management gutted their department, reassigning half the team “to get OpenDoc ready to throw away” [as one internet wag drolly put it], and Cook and Harris quit in response. That the suddenly-orphaned infant AppleScript survived at all is a testament to its remarkable virtues; that using it causes as much hair-pulling as joy may be largely due to the subsequent lack of toilet training.

Dr Cook documented the full early history of AppleScript’s creation in the following paper:

www.cs.utexas.edu/~wcook/Drafts/2006/ashopl.pdf

Invaluable insights for anyone who wants to understand why AppleScript was created and how it interacts with Mac desktop applications. (Hint: Unlike traditional object-oriented messaging systems—CORBA, DCOM, Distributed Objects, SOAP, etc—Apple event IPC uses RPC plus simple relational queries, more akin to SQL. VERY powerful, very user-oriented. Ubiquitously misunderstood.)

..

2. I’m going to speculate here that Apple will announce a new cross-platform automation strategy at WWDC next week, built on the Workflow app and team they bought in a couple years ago; hence the timing of this article. In which case, you really should’ve made the hot new Workflow App and its ambitious team of indie creators the centerpiece of your article; not some old failed automation product manager whose old failed automation products are about to be ousted.

As to Workflow itself, if/when it ships, no doubt there will be much fanfare and many claims that it’s the greatest innovation since sliced bread, but the truth is: it’s weak sauce at best; a gilded cell at worst. There is a vast history of end-user computing and computer education for explorers and innovators to draw on; far too long to get into here, but TL;DR: Papert was Right. Workflow, Automator, Scratch, Windows Workflow Foundation, et al are works of evangelical techies who think the way to evangelize their idea of “programming” to #OrdinaryPeople is to slather some lickable lipstick on their beloved pig and present it as “Programming for the Rest of [You]”. Whereas ordinary people would think, “WTF would I ever want to lick a pig?”, and work from there.

Papert demonstrated decades ago(!) that the right way to make computing accessible is to shape the language to fit its users; and the right way to do that is to give users the power to grow and shape that language by and for themselves, by growing its basic vocabulary with their own words and phrases for expressing the ideas and behaviors that interest them.

In other words: don’t hand the user a predetermined set of tools to use, but rather give her the tools she needs to make her own tools to use. Workflow, Scratch, &co don’t merely fail to empower users this way; they don’t even try at all. The resulting software may be a glitzy, seductive App that provides a lower barrier to entry than “Real Programmer” languages, but with an extremely low, immovable ceiling too. Papert’s Logo derived from Lisp/Forth for a reason: that stuff scales. Today’s mainstream programming world knows and thinks in “C”, which does not.
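
(If you want that idea in miniature in today's terms: even AppleScript's humble user-defined handlers gesture at it, letting a user coin a new phrase and then write with it. The names below are mine, purely illustrative:)

    -- coin a new "word"...
    to archive from theFile into theFolder
        tell application "Finder" to move theFile to theFolder
    end archive

    -- ...then write with it like any built-in phrase
    archive from (choose file) into (choose folder)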

..

[cont…]


3. And now back to good old Sal, cos after 20 years I still absolutely adore Mac Automation—and absolutely fume at what that twit’s done to it:

“In October of 2016, he was let go from Apple after a nearly twenty-year stint at the company. No warning, no early signs. Apple just said his position didn't exist anymore. It's been thirteen years since Automator debuted on the Mac, and Soghoian's biggest champion at Apple, Steve Jobs, is gone.”

Oh Man, where to start?!

Obvious point: Having Steve Jobs one time say to you “that’s a great idea” does not him your “biggest supporter” make (as many an ex-Apple employee will attest). Yeah, Mac Automation caught Steve’s eye once or twice (as it should—there’s some incredible power in that mess), but Sal failed to parlay it into anything more.

.

“No warning, no early signs”? That’s funny. I warned Sal in 2015 that he was running Mac Automation into the ground. After years of watching his department ship badly designed, largely untested, poorly documented software, then promptly ignoring it once it was out the door, it was obvious as hell to me. The man was—and still is—a charming end-user evangelist, but he couldn’t manage a pissup in a brewery; and I’m quite sure even Apple could see this. As with any large organization, 2000s Apple had little need to prune the deadwood that inevitably amasses while business is good. But once growth slowed and stuttered, it was only a matter of time till the surplus/underperforming employees were shown the door.

Sal was hardly the only employee purged; and had he been doing his job right he’d still be there! Create successful products that win new users and build new markets? Not a hard job description to understand, but Sal’s 2005-2011 tenure was mostly punctuated by mediocre-to-poor product releases that almost no-one subsequently ever heard about or used. Ironic that someone who made his reputation as a talented evangelist would, once promoted, lose the ability to promote or support his own products, or (even better) encourage his user communities to do the promotion for him. His title may have been “product manager”, but I don’t think he was ever a strategic planner or natural delegator, and I know from firsthand experience that he doesn’t take critical feedback well. Peter Principle much?

By 2012, with Automator having failed to grow market or mindshare and the aging AppleScript market gone stagnant, I was publicly estimating the end of the AppleScript platform in another 5-10 years. When Apple presented JavaScriptCore at WWDC13 as a developer-friendly alternative to AppleScript/OSA for cross-platform embedded App scripting support, I could already hear the nails going into its coffin. Then, TOTAL SURPRISE: Sal’s team somehow managed to steal JSCore from the WebCore folks, and turned it into JavaScript for Automation (JXA), a full peer to AppleScript with just one difference: several MILLION existing JavaScript users already out there, ripe for the taking.

Just one problem: when Sal &co. unveiled JavaScript for Automation at WWDC14, it wasn’t close to finished, never mind working right. Having a decade’s experience in designing, building, using, and supporting AppleScript-quality scripting bridges out in the real world, I immediately offered Sal every assistance to get JXA ready to ship. I took six weeks out from my day job to test and critique the tar out of JXA while they were rushing to add all the missing bits, and even quickly made them a reference JavaScript OSA implementation to learn from (or steal). All I asked up-front of Sal was that he cover the cost of the developer account so I could download prerelease builds, and that he not waste my time. Knowing JXA’s success or failure would make or break Mac Automation, I was absolutely happy to give it, and he sounded absolutely delighted to accept it. What’s more, having lead-written the last edition of Apress’ “Learn AppleScript” book a few years earlier, I was also in the perfect position to get a “Learn JavaScript for Automation” out the door inside the year, and win him his market of JavaScript Automation users without him even having to lift a finger.

And Sal took this fantastic market opportunity that his employer entrusted him with, and cratered it. After the first few weeks sending him detailed descriptions of JXA’s numerous design and implementation defects, he simply stopped responding to me—presumably cos I kept telling him what he needed to hear instead of what he wanted to hear. When JXA shipped in Mac OS 10.10, it was flawed, buggy, unfinished software that simply wasn’t fit for purpose. Worse, as soon as it was out the door, Sal promptly abandoned it too. No bug or missing feature fixes, no community support; no proper user manual for existing JavaScripters to learn how to use it; and sod all marketing and evangelization, of course. Not even a public reassurance to early adopters that these issues would all be addressed for 10.11. Nothing.

One Million New Users.

That’s my blind estimate of what Sal Soghoian’s mismanagement cost Mac Automation, and thus his employer. So, no, I’m not surprised they sacked his ass. Only wish they’d done it years sooner, when AppleScript and its remarkable end-user Automation ecosystem was still healthy enough to save.

.

Ironic too, as Apple’s Siri continues to flail around looking for something useful to do (other than selling Amazon Alexa and Google Assistant instead), that Apple should all this time be sitting on the largest, most capable, and most mature user-query-powered automation infrastructure in the personal computing world, and still can’t figure out how to put 1 and 1 together.

“It used to be easy when we were 100 times better than Windows. But now that we're not, you don't know what to do.”

I miss those Jobs years too. Amazing times, many incredible lessons. Sadly, increasingly unlearnt.


DOIs, I haz a few.

In addition to creating high-end AE-driven automation systems for current and previous employers, I’m also responsible for:

- appscript.sourceforge.net (Apple event support for Python, Ruby, ObjC; plus JavaScript OSA component); Apple even considered Python and Ruby appscript for Mac OS 10.5

- bitbucket.org/hhas/swiftae (Apple event support for Swift)

- bitbucket.org/hhas/nodeautomation (Apple event support for Node JS; wrote it in a week just to prove a point)

- www.apress.com/gb/book/9781430223610 (tech editor on 2nd edition; lead author on major update for 3rd)


We live in a post-OS world. Having an 'Apple'Script for Apple-only devices isn't scalable.

A cross-platform, OS- and processor-agnostic equivalent of AppleScript should do well in today's world.


This article fawns over a guy who quite mistakenly thought his terrible automation software was better than Windows, who then went on to "invent" callback URLs.


> This article fawns over a guy who quite mistakenly thought his terrible automation software was better than Windows

The technology behind it was (and still is) amazing. I'm not aware of anything in any other major operating system that even comes close.

> who then went on to "invent" callback URLs

No, that was Greg Pierce. Soghoian was still at Apple at that point.


> I'm not aware of anything in any other major operating system that even comes close.

ARexx on AmigaOS was pretty similar and shipped first (1990).


I wouldn't consider that to be a major operating system.


The AmigaOS was far more advanced than MacOS in 1990 just as the Amiga hardware was 5 years more advanced than the Mac at launch. There were about 5 million Amigas in circulation, especially in Europe, by 1990. How many home users owned Macs at that point?

Apple users seem to overstate the importance of early pre-OSX Apple. For example, the Apple II was brutally outsold by the Commodore 64, which was far more powerful and became the most popular 8-bit home computer ever sold. (That is, the Apple II was far less important than the hagiography makes out.)


As a child I had a C64 at home, while every school I attended had computer labs stocked with Apple IIes. I was often struck by the inferiority of the Apple II to the Commodore (to say nothing of how impressive the Amiga was). It couldn't even display monochrome text normally.


I wouldn't even go to comparisons. An operating system that for roughly a decade had several million users was a "major operating system", period.

We're not talking about some obscure research system here, that only very few people used. It was mainstream.


> The Apple II was brutally outsold by the Commodore 64 which was far more powerful

There's no doubt that the Commodore 64 had better graphics and sound hardware, but the idea that it's more powerful in general seems dubious at best. By the time 1983-84 rolled around, an Apple ][ machine was available with 80-column text, 128KB of memory, and a disk that transferred data 30 times faster than the 1541. Sprites and 3-voice audio are nice and all, but that doesn't make the C64 the machine I'd have chosen (even then) for anything other than games.

The 1541 disk is notably amazing because that 'blazing' performance was the result of making the disk drive an entirely separate computer in its own right. This did at least make it possible to upload new firmware to the drive to improve the performance... but that was hardly standard. The only material advantages of the 1541 disk architecture were that it let the main machine be slightly cheaper, and that it enabled single disk drives to be multiplexed across multiple C64s.

From an educational point of view, it's also interesting to compare the difference in graphics programming models.

Drawing a line on the screen in Applesoft:

    10 HGR
    20 HPLOT 100, 120 TO 150, 200
Drawing a line on the screen in Commodore BASIC 2.0:

1. Switch the machine to graphics mode by POKEing data into graphics hardware control registers.

2. Implement something like Bresenham's algorithm to get the pixels on the screen.

One of these is something that an interested layperson (teacher) can communicate to a class of elementary school students. The other is something out of an upper division undergraduate computer science curriculum.

(Admittedly, Terrapin's wonderful C64 Logo made things a lot better for the Commodore.)

> and became the most popular 8bit home computer ever sold. (That is, the Apple II was far less important than the hagiography makes out)

It's important to consider the machines in their historical contexts. The Commodore 64 and Amiga were great machines at single points in time, but neither of them were created by an organization with the capability to effectively continue and promote their development. (ie: How much of the C64's success was due to setting its price so low that it couldn't fund ongoing development of the platform? Or due to short-term gains produced by a management strategy that alienated key allies in the long term?)

Viewed in that context today, Apple is now the most valuable company in the world and successful to the extent that I can look across the aisle of this train and see somebody watching a movie on an Apple product.

Commodore, in contrast, is happy memories, a bankrupt company that hasn't existed for decades, and a condemned Superfund environmental contamination site in a Philadelphia suburb.

While I have great personal fondness for the C64 machines, there are reasons the industry turned out the way that it did, reasons for Commodore's failure, and reasons for Apple's success... including the Apple ][.


Sorry, but no. Commodore's failure is almost entirely related to terrible business decisions, but machine-wise, the C64 was more powerful by virtue of flexibility. (The built-in BASIC is irrelevant and could easily be upgraded to SuperBasic if you wanted that.)

Ask yourself why, four decades later, there is still a huge demo scene around the C64 and Amiga, but not the Apple II.

The C64 could do things on stock hardware that would have blown your mind in the early 80s as having come from the future: look at the 11-minute mark of the Onslaught demo, https://youtu.be/FTtKHLZTbtA, and find me an Apple II demo on stock original hardware that comes even close (full-screen video and digital audio streaming from floppy at 15fps). My school had Apple IIs; they did not inspire me to learn to program. My C64 did, and I learned 6502 assembly, not BASIC, because the graphics I saw in demos and games were jaw-dropping and inspiring, unlike anything a kid saw on an Apple II.

And less than a year after the Mac 128K, Commodore launched what was the iPhone of 1985—vertically integrated custom chips and software, for half the price—that did all this: https://youtu.be/q7rKj0DU8Xs

The Mac 128K was priced way outside a home user's budget, had an almost unusably tiny amount of memory, was non-expandable, and had terrible graphics and sound. It was not a homebrew system, or for kids, or for engineers.

For those reasons Commodore computers were way more popular with hackers and with kids: excellent games, and, when you chose to explore, excellent hardware to learn to code on.

Apple’s success was mostly due to marketing, and don’t forget, Apple failed and almost went bankrupt too.

BTW, the total number of Apple IIs sold was less than six million, including 1.25 million IIgs machines, as opposed to five million Amigas and 12 million C64s. In Europe especially, the C64 dominated.


> machine wise, the C64 was more powerful by virtue of flexibility.

What do you mean? Are you referring to the demo scene being able to push the internal hardware as far as they've been able to?

> The Mac128 was priced way outside a home users budget, had an almost unusable Tiny amount of memory

Probably better to consider a comparison with the Apple model I was alluding to in my earlier post. The Apple //c was much cheaper than a Mac, had 80-column text and 128K RAM, and could run AppleWorks. It didn't do games nearly as well as a C64, but you'd have been hard-pressed to find as good a machine from Commodore for small-office tasks. (Maybe the C128... it was cheaper and had a numeric keypad, but I don't think the software was as strong there as it was for Apple.)

> Apple failed and almost went bankrupt too.

That 'almost' is rather important, don't you think?


> What do you mean? Are you referring to the demo scene being able to push the internal hardware as far as they've been able to?

Yes, the C64 VIC and SID were just flexible enough to enable a large number of graphical and sound effects they were never intended to do, like arbitrary X/Y positioning of the screen, line-crunching/stretching, and sprite-crunching/stretching.

Because the hardware was so exploitable in this way, it encouraged a never-ending competition of creativity and innovation in its use that hasn't subsided to this day. I mean, just in 2018, 44kHz digital audio was demonstrated on the C64: https://brokenbytes.blogspot.com/2018/03/a-48khz-digital-mus...

Now, search for the Apple II's demo scene, and it never materialized except on the IIgs, for obvious reasons, and even there it's paltry.

>Probably better to consider a comparison with the Apple model I was alluding to in my earlier post. The Apple //c was much cheaper than a Mac, had 80 column text, 128K RAM, and could run Appleworks. It didn't do games nearly as well as a C64, but you'd have been hard pressed to find as good a machine from Commodore for small office tasks. (Maybe the C128... it was cheaper and had a numeric keypad, but I don't think the software was as strong there as it was for Apple.)

Most offices used PCs. Home users didn't have much need for office tasks, but those that did had plenty of apps available for the C64. I used to write all of my school reports using GeoWrite on the C64, no problem. If you really needed 80 columns on the C64, there were hacks to get it, including software-only ones and plug-in cartridges, and, as you mentioned, the C128, which had 100% backwards compatibility with the C64 plus the hi-res VDC. It was notoriously hard to program for, but here's a slick demo for it: https://www.youtube.com/watch?v=R8bEgX6n-QM And here's GeoWrite 128 in hi-res 80-column mode: https://www.youtube.com/watch?v=PSNosi2C9Ic

When you say cheaper, you neglect to say HOW MUCH cheaper. The Apple IIc was $1300. The C64 was priced at $600, and the C128 was actually priced at $300. The C128 sold 5.7 million machines, more than all Apple models combined. The Amiga 1000 sold for $1300 at launch, compared to the Mac, which sold for $2500.

The Commodore computers sold for less than half the price and had much better graphical and sound capabilities (in the Amiga's case that's severely understating the difference; the Mac didn't catch up for years, and at a huge cost differential, because Commodore, like the new post-iPhone Apple, developed highly efficient, integrated chipsets).

Apple got squeezed in the home market by Commodore, Atari, and Sinclair, and squeezed in the business market by IBM; their lifeline was finding a niche in education. The hagiography of fans rewrites Apple's history, because they were first and because of the reality distortion field, but really, their computers were pretty shitty. The original Mac 128K was inferior to the Apple IIgs in many ways, and it was hellishly expensive.

Seriously, go grab a Mac 128k, and grab an Amiga 1000, and tell me which computer you're more productive on. You could use an original Amiga 1000 today and still have a pleasant experience, whereas the original Mac is agonizing to use.

On the business side, up until 1984, Commodore made more money than Apple. Apple revenues took off with the introduction of the Mac, but this follows the usual Apple script of charging a ton of money for a less capable computer. That held up until Windows 95, when the trend reversed.


(You should keep in mind that I'm writing this in a spirit of friendly disagreement/discussion more than anything else... it's fun to think about these machines from my childhood... and I had formative experiences with both Apples and Commodores. :-) )

I think at least part of our disagreement is a disagreement on personal priorities.

Just to illustrate:

> Seriously, go grab a Mac 128k, and grab an Amiga 1000, and tell me which computer you're more productive on. You could use an original Amiga 1000 today and still have a pleasant experience, whereas the original Mac is agonizing to use.

Having used both machines, there's almost no chance I'd pick the Amiga, unless I wanted to play a game. The clearer display alone is a huge advantage. (In fact, I remember regretting the negative experience of a video producing high school friend of mine who I'd convinced to buy an Amiga 500. He had a much better experience a year or two later when he switched over to a PC.)

> Because the hardware was so exploitable in this way, it encouraged a never ending competition of creativity and innovation in its use that hasn't even subsided until this day. I mean, just in 2018, 44Khz digital audio was demonstrated on the C64

Very impressive and a testament to the skill of the developers involved... but I'm not sure how useful it is to ask people to wait 35 years for their hardware's capabilities to be fully realized. Whatever my current machine will have been convinced to do in 35 years time is not likely to matter to me all that much, and certainly means nothing now.

So, while the flexibility of the hardware is nice and all, the hidden capabilities are only useful to the extent developers are forced by resource constraints to find them. The Atari 2600 had the same sort of thing... people were making it do amazing things into the 80's because they had to do so to gain access to a market, not because they found it a desirable property of the hardware to be forced into it.

(Note that recent demo coders that use these platforms as ways to challenge themselves are in a different category altogether.)

> Most offices used PCs.

Until the introduction of Lotus 1-2-3 in 1983, it was very common for companies to buy Apple ][s for the express purpose of running VisiCalc. I'm not sure Commodore ever seriously played in that space with any of their machines, including the PET.

> When you say cheaper, you neglect to say HOW MUCH cheaper. The Apple IIc was $1300.The C64 was priced at $600, and the C128 was actually priced at $300.

You need to add some things to those machines to be feature-complete with a //c... at the very least a floppy disk drive. (And then you need to accept a disk with 3-4% of the transfer rate of the Apple product.... I spent a bunch of time in my youth waiting for C64 software to load off disk.)

> more than all Apple models combined. The Amiga 1000 sold for $1300 at launch, compared to the Mac which sold for $2500.

So that's the Amiga 1000, which would need a 256K memory expansion, an external monitor, and some work to connect it all up to be comparable to the Mac. (Admittedly, the Amiga has color, but the Mac had a 384-line display without interlacing and was more portable.)

> The original Mac128K was inferior to the Apple IIgs in many ways, and it was hellishly expensive.

The IIgs was introduced almost three years after the Mac 128K... why should it be surprising that it has some better capabilities than the earlier machine? (Probably more surprising is the fact that Apple kept the IIgs CPU as low as 2.8MHz.)

> the usual Apple script of charging a ton of money for a less capable computer.

This is a weird argument to make, because it essentially implies Apple has had good enough marketing to convince people over 40 years to make non-economic choices. It seems more likely to me that Apple has, over the years, been better than most at assembling a set of capabilities that people find valuable enough to pay for.


> Clearer display

B&W 9" 512x384 vs Color Commodore 1080S 13" @ 640x256 or 640x512 interlaced and doubled as a great TV. Let's say you didn't want to use interlaced mode, the Amiga still was more productive because of its out of the box multi-screen functionality.

Flipping between screens on the Amiga was extremely fast; there was even a gesture for it (hold right-click, then click). Multitasking is extremely important for productivity.

I used to do the following simultaneously: run a terminal program in the background for downloads or research, have my editor or productivity environment open in the foreground, and have auxiliary utilities like graphics and music editors running on background screens. It was years before MacOS even added cooperative multitasking (System 7).

> but I'm not sure how useful it is to ask people to wait 35 years

Most of the innovative features did not require 35 years. Fast loaders and JiffyDOS all existed in the 80s. Games already used sprite multiplexing, border knockouts, etc. even as far back as the mid-80s. Digital audio shipped in games like Great Giana Sisters or Skate or Die. Have a look: https://www.youtube.com/watch?v=XetgpWY7WOg The C64 market, like game consoles today, was one of steady progression of capabilities. Each year, things got better on the same hardware, and so the value of the hardware was extended. That isn't so for the Apple II, whose graphical and sound capability sucked at the beginning and pretty much plateaued early.

Anyone who had bought a C64 would have felt less need to upgrade year after year, because the software library continued to push the limits.

> I'm not sure Commodore ever seriously played in that space with any of their machines, including the Pet.

The PET was a player until VisiCalc, but you're talking about a market size of a few hundred thousand machines. Most business was still being conducted on timeshare machines at this time.

https://arstechnica.com/features/2005/12/total-share/3/

>You need to add a some things to those machines to be feature complete with a //c... at the very least a floppy disk drive. (And then you need to accept a disk with 3-4% the transfer rate of the Apple product.... I spent a bunch of time in my youth waiting for C64 software to load off disk.)

If you had an Epyx Fast Load or Action Replay cartridge, loading a 200-block program (the usual top-end size) would take about 4-5 seconds. JiffyDOS could load up to 15 times faster than stock, taking that down to about 1-2 seconds. (JiffyDOS replaced the ROM in the 1541.)

> So that's the Amiga 1000, which would need a 256K memory expansion, an external monitor, and some work to connect it all up, to be comparable to the Mac. (Admittedly, the Amiga has color, but the Mac had 384 line display without interlacing and was more portable.)

The Amiga 1000 worked without RAM expansion. It started with 256K (2x the Mac 128K). If you added the A1050 256K expansion, that added $199. The 1080S monitor had stereo speakers and took both RGBI and component video, so I used mine both for my computer and to watch TV, and it was much higher quality than a regular TV. The 1084 usually sold for $300. So for $1800, you got a system with 512K RAM, color, and a stereo-speaker monitor that also functioned as a high-quality TV.

As an aside, I hooked my Amiga up to a regular composite monitor/TV before I got the official Commodore monitor.

>The IIgs was introduced almost three years after the Mac 128K... why should it be surprising that it has some better capabilities than the earlier machine? (Probably more surprising is the fact that Apple kept the IIgs CPU as low as 2.8MHz.)

Not surprising: the IIgs was Woz's baby, the Mac 128K was Jobs's baby. They deliberately hobbled the IIgs because it would have made the Mac look bad. If they'd run it at full CPU speed, it would have been faster and cheaper, in addition to looking and sounding better.

> This is a weird argument to make, because it essentially implies Apple has good enough marketing to convince people over 40 years to make non-economic choices. It seems more likely to me that Apple has, over the years, better been than most at assembling a set of capabilities that people find valuable enough to pay for.

I think it's not really disputed that Apple has a cultish following and that their marketing creates a reality distortion field. The Mac IIcx, for example, was introduced for $5000 in 1991 ($10,000 in 2018 inflation-adjusted dollars). You really think the IIcx was 3-4 times better than a PC in 1991, or an Amiga 3000/4000? For the narrow few who could afford to drop $10,000 on a IIcx and $16,000 on a LaserWriter, the costs may have been justified.

But if you were doing, say, video or sound work, an Amiga, Atari ST, or Silicon Graphics workstation was far better. Which is why the Amiga's niche was DTV while the Mac's was DTP. But an Amiga with a Video Toaster was still a cheaper setup than a Mac with a LaserWriter; think about that. You could produce a show like Babylon 5 with Amigas, but with a Mac setup costing three times as much, you could print some flyers.

I was typesetting documents with TeX in 1991, producing output far superior to what I saw coming out of professional Mac apps, and with far less intervention, for 10 times less cost.

It seems to me that the cost/benefit was not justified by Apple's markup on Macs, and you'd be better served by something 80% as good, but 1/2 the price.

Apple actually almost failed as a company three times: in 1993, 1997, and again in 2001 they had huge declines. The iPod and iPhone are what rescued the company, and in hindsight the Apple II and classic Mac were basically also-rans. They mostly failed from a market-share and software-library perspective; they managed to extract a premium in niche markets, but so did many workstation vendors.


> B&W 9" 512x384 vs Color Commodore 1080S 13" @ 640x256 or 640x512 interlaced and doubled as a great TV. Let's say you didn't want to use interlaced mode

Having used both machines, as well as an interlaced 1024x768 SVGA, I assure you, I do not want to use interlaced mode.

> It was years before MacOS even added cooperative multitasking (System7).

Historical note: multitasking was introduced by MultiFinder in System 5. The ability to load and switch between multiple apps at the same time was available with an add-on (Switcher) in early 1985. Where the Amiga was particularly strong was that it had preemptive multitasking and (IIRC) much more dynamic memory management than Apple's fixed memory blocks allocated at application start time. Even Windows didn't catch up with some of that until Windows 95 (which was somewhat hobbled by the 16-bit mutex).

> The Mac IIcx for example, was introduced in 1991 for $10,000 in 2018 inflation adjusted dollars ($5000 in 1991). You really think the IIcx was 3-4 times better than a PC in 1991 or an Amiga 3000/4000?

The IIcx was really an early 1989 machine, which made it a contemporary of machines like this $7,500 Compaq Deskpro 386/20. (IBM PS/2's were similarly priced.)

https://www.nytimes.com/1988/01/10/business/the-executive-co...

> you're talking about a market size of a few hundred thousand machines.

That's a rather influential few hundred thousand.

> Most business was stilling being conducted on timeshare machines at this time.

(A lot still is, to the extent cloud instances are time shared. :-))

> a Laserwriter

Well the Laserwriter really opened up an entirely new product category. It was expensive, but, for a time, it was the only game in town at anything close to consumer level pricing. As much as you've correctly pointed out that education was one of Apple's strong points, the LaserWriter made DTP another strength.

As you point out, this is analogous to specific hardware making the Amiga particularly strong in Video production. (The Amiga chipset itself, as well as the Video Toaster.)

> with a setup costing 3 times as much, you could print some flyers.

Flyers may sound trivial now, but back then they were a big deal and a huge source of demand, and even now the need for document preparation far outstrips the need for video production.

> I was typesetting documents with TeX in 1991 producing output far superior than what I saw coming out of professional Mac apps, and with far less intervention,

Anybody that's read the line from Knuth on the joy of going back over an almost completed document to insert the proper italic corrections, etc. should recognize that TeX is... well... not for everybody.

TeX is an amazing intellectual achievement, it produces great results (particularly when math is involved), and I personally use it to this day.... but it's vastly overkill for most documents and far beyond what the vast majority want to use or have learned to use. (A good friend of mine in college, while studying Computer Science and writing a compiler, once remarked: "I don't want to have to compile my term papers!", when I tried to advocate he use TeX. I honestly don't blame him... 'overfull hbox, badness 10000'?!?!?!)

On the other end of the spectrum, there's a lot to be said for the original Mac's ability to highlight text on the screen, select 18-point Geneva (or whatever), and then have the ImageWriter produce a low-quality but accurate rendition of what appeared on the screen. As backward as the Mac's video model might have been in contrast to the Amiga's, it offered a useful capability in that it was consistent with the model its printers used too.

> you'd be better served by something 80% as good, but 1/2 the price.

Not if you needed (or even just wanted) that last 20%.


You both seem to be enjoying the back and forth, so 'scuse me butting in :)

> IIRC) much more dynamic memory management than Apple's

The Amiga had a scatter loader that both loaded dynamically and minimised fragmentation. If you were doing it "properly" and staying in the OS, i.e. not hitting the hardware, there was one fixed address, ExecBase ($4), via which you'd find GfxBase, IntuitionBase and so on.

> Having used both machines, as well as an interlaced 1024x768 SVGA, I assure you, I do not want to use interlaced mode.

No one would. You bought either the superb 2080 paperwhite long-persistence monitor or a flicker fixer and a multisync.

So many missed opportunities by Commodore. :(


> You both seem to be enjoying the back and forth,

Speaking for myself, yes. :-)

Something I should point out here is that I grew up with reasonable access to both machines: Apple ][s and C64's. (My elementary school had a lab for each, across the hall from each other.) My exposure to the C64 was through a combination of my school's being in a Logo educational pilot program (based on Seymour Papert's work) and my mother running the C64 lab. That gave me some formal introduction to Logo in class, and then a bunch of after hours time, where I got a chance to explore further, including Commodore BASIC, setting the hardware up, etc.

This probably tends to bias my experience with the platform away from what people might consider its major strengths: the SID and the VIC II. Because they weren't my personal machines (they were at the school), I also wasn't exactly buying things like extension BASICs and fast disk loaders... so I'm biased towards feeling a bit more of the pain of the limited BASIC and slow disk interface. I still remember the disappointment of trying to get graphics on the C64 screen in BASIC, given how easy it was in C64 Logo and in the Applesoft BASIC on the Apple ][s across the hall.

I also tend, probably mostly thanks to my high school Physics teachers, to be more sympathetic towards the Apple ][ platform. While there's no doubt the C64 had better graphics and sound (at least for games), I also saw some amazing things done with Apple ][ machines into the early '90s. My middle school had one with a Corvus hard disk and a barcode scanner that ran the library. My shop teacher had one with a plotter and a CAD system. My band directors used them to organize their operations. My physics teachers had an elaborate system for custom test generation and real-time grading, in addition to basic sonar hardware (from a Polaroid camera) for supporting student experiments. So... while I'm personally very sympathetic to the Commodore 64, I do take exception to the notion that the C64 was uniformly 'far more powerful' than the Apple. (Which was why I made my initial response, in fact.)

Whatever additional power there is in the C64 can probably be explained by the fact that it's five years newer... and I still saw relatively standard Apples doing things that seem outside the capabilities of the C64.

Another way to look at this: if you'd asked me then which platform I'd have chosen, there's not much doubt I would have picked the Commodore. If you sent the current me back in time and asked me to pick, I'd probably go with the Apple. As much as I like the built-in hardware on the C64, I'd be hard-pressed to walk away from Apple features like expansion slots, the faster disk, the 80-column display, AppleWorks, the standard way to get to 128K, etc.

It's this kind of thought process that has me objecting to the thesis that Apple achieved its success solely by selling overpriced crap to customers deluded by the Reality Distortion Field. Even if it didn't spawn a demo scene, even if people aren't chasing the beam with cycle-accurate 6510 code, the unique features the Apple did have seem more useful to me than any of that.

> So many missed opportunities by Commodore.

Agreed, without a doubt. (Starting with the 1541 disk performance, the keyboard on the PET, and the environmental issues at the MOS plant in PA. It's 2018, and there are still people worried about the water quality around the plant.)


Surprisingly similar story to my own, despite being on the other side of the Atlantic with different hardware. The school I went to was also piloting IT in schools, so we had a selection of 8-bit kit and an after-hours club that was hugely popular.

My memory of the Apple II is that the expansion slots helped it stay relevant longer, and it attracted a huge range of weird and wonderful add-ons as a result. The C64 was far more price-sensitive, which showed in some of the design choices. Had it been much more expensive, it would have lost to one of the dozens of other 8-bits.

> This probably tends to bias my experience with the platform away from what people might consider its major strengths

My Amiga bias is less about the amazing hardware, since that was always going to become outdated at some point, astonishing for its day though it was. I actually feel we're all a bit poorer as a result of DOS+Windows, and we'd be in much better shape had we started with some of the Amiga and Mac OS choices and innovations accepted as the usual way of doing things.

Case in point from the OP: the Mac's AppleScript, or the earlier Amiga ARexx, versus batch files and a big fat nothing on Windows.

I spent years finding that every Windows innovation had been done far better by someone a decade earlier, usually with an order of magnitude less code. Windows seemed comically lacking and gratuitously obnoxious to write code for. So of course that was the platform that won general computing.

Hindsight is easy, of course.

> selling overpriced crap to customers deluded by the Reality Distortion Field

Macs were objectively better at so many things, but lacked colour. In truth, I always felt the reality distortion was on the Windows side. Much later, Apple became a little too focused on style, but that's another story. :)

> and the environmental issues at the MOS plant in PA

Didn't know of that. That's the third reference I've come across in about as many weeks to current environmental issues from plants closed decades ago! I just read The Radium Girls, about Undark and the workers' horrific injuries and their struggle to get compensation or even acknowledgement. The two companies denied, lied, and cheated their way through. Both sites, and even the places where factory trash and rubble from demolished buildings were sent(!), were found highly hazardous decades later and required huge cleanups at EPA expense. IIRC one cleanup is still ongoing. I wish humanity could be encouraged to be more considerate of our planet.


> I actually feel we're all a bit poorer as a result of DOS+Windows....

Agreed... I tend to think MS did a very good job incorporating a grab bag of ideas into their products, but a very poor job doing the work to really make it all fit together.

Just one recent example of many is this bit on getting a status bar display of cursor location working in Notepad:

https://blogs.msdn.microsoft.com/oldnewthing/20180521-00/?p=...

tl;dr - The edit control, around for over thirty years, has no way to notify listeners of cursor-movement events. So they (meaning Microsoft itself) had to work around it with a hack built on the accessibility APIs...
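
For the curious, the workaround is roughly this shape: register an accessibility WinEvent hook and watch for the caret's location changing. (My sketch of the approach the post describes, not Microsoft's actual code.)

    #include <windows.h>
    #include <stdio.h>

    /* The edit control never announces "the caret moved", but the
       accessibility layer does: filter location changes down to the caret. */
    static void CALLBACK OnWinEvent(HWINEVENTHOOK hook, DWORD event, HWND hwnd,
                                    LONG idObject, LONG idChild,
                                    DWORD thread, DWORD timeMs)
    {
        if (idObject == OBJID_CARET) {
            /* re-query the edit control and repaint the status bar here */
            printf("caret moved in window %p\n", (void *)hwnd);
        }
    }

    int main(void)
    {
        HWINEVENTHOOK hook = SetWinEventHook(
            EVENT_OBJECT_LOCATIONCHANGE, EVENT_OBJECT_LOCATIONCHANGE,
            NULL, OnWinEvent,
            0, 0,                /* all processes/threads; you'd scope this down */
            WINEVENT_OUTOFCONTEXT);

        MSG msg;                 /* out-of-context hooks arrive via a message loop */
        while (GetMessage(&msg, NULL, 0, 0)) {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        UnhookWinEvent(hook);
        return 0;
    }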

> Didn't know of that.

The only reason I knew of it is that years ago, after moving from Houston to Philly, one of my first consulting clients was down the street from a shuttered 'GMT Microelectronics' factory (and some of my colleagues mentioned something about a water contamination issue). I looked a bit further into it, and it turned out that 'GMT Microelectronics' was the successor to MOS Technologies; the plant had been shut down after basement chemical tanks leaked into the water table.

Ironic that I wound up down the street from the remains of the company that had made such an impact on my early life and eventual career.

To close the Apple/Commodore link, it was also engineers behind the Commodore SID who spun off another local company, Ensoniq. Apple then chose an Ensoniq sound chip for the IIgs, so there is a very direct lineage between the audio capabilities of the C64 and those of the IIgs.

(And in the PC space, Tseng Labs of ET4000 fame was also local... and there's also ENIAC, and the Eckert-Mauchly Computer Corporation lineage that wound up as Unisys. There really is a lot of local computer history around here.)


Nobody is denying that Apple made it while Commodore failed.

The question is whether AmigaOS was a "major" operating system that should be considered here. I would say that, in its time, it was:

- It had millions of users, at least some percentage of them serious ones.

- It was used heavily in the TV industry.

- For a long time it had the biggest online archive of public domain software and shareware (Aminet) of all contemporary platforms.

- Many of today's high-profile game developers started their careers on the Amiga.

and, back on-topic:

- It had something similar to AppleScript several years before the Mac did. ARexx was available from 1987 and shipped with the OS from 1990 on. Most commercial productivity software had an "ARexx port" and was scriptable in a uniform, system-wide manner.

On the other hand, systems like Smalltalk or the Lisp Machines, though way more powerful, were never really mainstream.


> Nobody is denying that Apple made it while Commodore failed

Sure... my main point is that it overstates the case a bit to say the C64 was strictly more powerful than the Apple ][ series. There were technical aspects in which the Apple ][ series was usefully more capable than the C64.


If you include the Apple II "series", then you should include the C128/C128D. That nullifies the only advantage the non-IIgs models had, which was more RAM and an 80-column mode. The C128 also, by itself, outsold all Apple II models combined.

Other than that, what were the technical aspects of the Apple II (non-GS) that we should be aware of? The IIgs was a very nice machine, but it had to contend with the Amiga, which easily outclassed it.


> then you should include the C128/C128D.

Where do the Plus/4 and C16 fit in? (Not to mention all of the 'Secret Weapons of Commodore' that they never managed to actually ship. As much as you talk about Apple hobbling the IIgs, it's not like Commodore was the only company with unfulfilled potential in its engineering.)

(I did rather like the SX-64, though.)

> Other than that, what were the technical aspects of the Apple II (non-GS) that we should be aware of?

The machines could do some rather amazing things... by the time the early '90s rolled around, my high school Physics teachers had them generating custom tests for each student, with integrated graphics for drawings, grading the tests in real time with an integrated Scantron machine, and keeping all the class records on disk.

They got a lot of mileage out of features the Apple made easier. If I remember right, they had 1MB of RAM in use, and the test scanner plugged into an expansion card. They also swapped code out to disk to generate each question and draw the graphics as they produced the individualized tests for each student. For a 20-question test and a class of 20, that meant 400 disk reads, so the faster disk I/O helped overall throughput. (They hacked a way to do partial loads of BASIC code into Applesoft by modifying the linked lists the interpreter used to maintain lines.) And I'm pretty sure they used a mouse and the (80-column) double-hires mode to draw the graphics for the tests.
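
That linked-list trick is less exotic than it sounds: Applesoft stores a program as a chain of lines, each starting with a two-byte link to the next line, then a two-byte line number, then the tokenized text ending in a zero byte; a zero link marks the end of the program. A rough model in C (my sketch of the data structure, not their code):

    #include <stdint.h>
    #include <stdio.h>

    /* Walk an Applesoft program image the way the interpreter does.
       mem[] stands in for Apple II RAM; start is TXTTAB (usually $0801). */
    static void walk_lines(const uint8_t *mem, uint16_t start)
    {
        uint16_t addr = start;
        for (;;) {
            uint16_t next = mem[addr] | (mem[addr + 1] << 8);  /* link field */
            if (next == 0)
                break;                        /* $0000 link = end of program */
            uint16_t lineno = mem[addr + 2] | (mem[addr + 3] << 8);
            printf("line %u at $%04X\n", lineno, addr);
            addr = next;    /* patching these links is what enables overlays */
        }
    }

Rewrite a link and the interpreter simply never sees the lines you've spliced out, which is how you graft freshly loaded code into a running program.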

I'm not saying a C64 couldn't have done it, but the SID and the VIC II wouldn't have helped any, and there are a number of Apple ][ capabilities that did...


It was, in the day (and market).


Perhaps it depends on how you define "major", but Smalltalk and the Lisp Machine come to mind.



