Hacker News | eschaton's comments

“I hope some false humility makes the attention being currently given to the awfulness of my beliefs go away, that kind of attention makes it much harder to spread them covertly.”

It’s not like OS-level features and improvements aren’t in every release—Apple’s been talking about them since WWDC in developer channels. Read the release notes.

It wasn’t even to push people to the military, but literally to ensure “the wrong people” had more difficulty getting a university education—Reagan’s people came up with it as a reaction to Vietnam War protests.

They were; their main point of connection was at the OS level, as Accent on PERQ begat Mach, and Andrew originally ran atop CMU’s Mach+BSD environment (MK+UX). That let it take advantage of features such as Mach IPC and dynamic loading of shared libraries and plug-in modules. Later Andrew was ported to vendor operating systems, and then to run atop X11 instead of wm.

System 6 isn’t substantially different from System 1 when it comes to type, though, unless you install the Adobe Type Manager or TrueType extensions, of course.


Time for them to release the source code then.


Releasing source code (especially under a permissive license) would be extremely hard, even more so with software that old. There could be small third-party modules buried in the code base whose original developer is impossible to find, or who passed away years ago, leaving no way to contact anyone who owns the rights, let alone get every one of them to agree on open-sourcing and under which license. There would be a fairly big chance of liability, and I couldn't blame them for not wanting to test that. Software should be open from the beginning.


They could just not release the code they don't have the rights to, and release the code they do have rights to.


What would that achieve? Why would anyone want a pile of old code that can never build or run?


Code has value beyond whether it can be run or not. It's four decades of problems and their solutions. For anyone who wants to do any work in the music notation space, it could be invaluable to go through the lessons learned and see things from another perspective, especially one that went all the way to production and through a long period of commercial viability.


That is theoretically possible, it just seems very, very unlikely. Finale is “millions” of lines of code, old legacy code that spanned dozens of OSes. Have you ever tried to read a huge legacy codebase? They are hard to read, and I’m dramatically understating it. The time it would take to read and extract lessons learned from it could exceed the time available to do any work in the music notation space. Most of the lessons there to extract are lessons that no longer apply to anything. Not being able to build & run the code would reduce the ability to understand it by another large factor. It would be far more efficient to hunt down and interview the Finale devs, or to spend time working on another product and learn from that.

All that is beside the point that the Finale devs are under zero obligation to release their code, and generally speaking they have a decent list of reasons not to, plus, I’d speculate, some specific ones of their own.


Somebody would be able to replace the unreleased portions with new code.


It’s a nice thought, just extremely unlikely, no? Unlikely that someone has the time to deal with a huge legacy system, and unlikely they’ll be able to rewrite portions and get it working. There are very good reasons why releasing proprietary code hardly ever happens, even when it’s all modern and working: the potential downsides are usually bigger than the upsides.


It has happened with OpenJDK, first downstream with the IcedTea distribution, and then gradually things were replaced upstream or opened by Oracle. I think today, only the browser plugin is missing, and nobody really wants that anymore.

It's rare that this happens in the open like this. I expect it was a factor that OpenJDK was a free software development tool, so Sun already had transferable licenses from their suppliers. For other types of software, building new software with it is not a consideration from the outset, and licensing agreements with third-party component suppliers will reflect that.


Well, he adapted Clascal & Object Pascal from Lisa & Macintosh to the DOS/Windows PC world, and added a couple features from CLOS to turn it into Delphi, and married that with Sun’s C++-syntax bytecode-compiled variant of Objective-C to produce C#.

He certainly deserves credit for what he did, but not what those whose shoulders he stood on did.


Actually, it produced J++, followed by a lawsuit, which led to Cool from MSR becoming C#, and to J# coming into existence to ease porting J++ code to C#.

Ironically 20 years later, Microsoft is again a Java vendor, and OpenJDK contributor.


A baseline Mac wasn’t any more expensive than the competition (IBM), it just wasn’t inexpensive like Commodore, Atari, and PC clones. Apple also had essentially the same price structure for decades, while PC clones raced to the bottom on both price and quality.

And the Mac II was actually priced better than most competitive systems—because those competitive systems were 16MHz 68020/68881-based workstations from Apollo, HP, Tektronix, Sun, et al. In early 1987, a name-brand 16MHz 80386 system with 80387 was comparably priced, which is why most people buying PC clones didn’t get a 386 until 1990-91 or so, around when the 80486 (and 68040) came out.


"wasn’t any more expensive than the competition (IBM)"

In other words: no more expensive than the other thing nobody ever bought? The success of the PC did not lie in IBM selling large numbers.


You used Apple Pascal on the Apple II, not the Mac, and it wasn’t actually free but a couple hundred dollars a seat. It was based on UCSD Pascal, as were many implementations at the time, but it was a commercial product and one used by many Apple II developers. Your school either licensed or pirated it for you to use.

Apple didn’t even ship self-hosted assembly tools for the Mac until the Macintosh Development System later in 1984, and when Apple did ship Macintosh Pascal it was a learning environment with a (non-UCSD) bytecode interpreter rather than a native compiler with Toolbox access. That was still something most people used a Lisa for until after both the Mac 512 and the HD20 came out.


I definitely used UCSD Pascal version II (not Apple Pascal, which came a bit later) on the Apple II. Looking at this source[1], it looks like it both existed for the Apple II and wasn't free, so I must have assumed that incorrectly.

I also realise I was talking about the Apple II, not the Mac. My assumption/point was that the market for Apple Pascal devs had been captured previously by the "power" of the UCSD p-System before the Mac came along, so Borland figured they didn't have a chance.

Personally I wasn't that much of a fan of UCSD Pascal vs Turbo Pascal, for reasons I can't remotely remember. I think with UCSD Pascal you could only do things in the "p-System" bytecode thing, which meant it had a slightly more restrictive/pure Pascal variant, whereas Turbo Pascal had some language extensions, like being able to do dynamic memory allocation so you could make trees and linked lists and stuff that IIRC you couldn't do very easily in vanilla/UCSD Pascal. It's been a while so I may be misremembering.

[1] https://archive.org/details/byte-magazine-1982-08/page/n193/... 15yo me didn't realise that using UCSD pascal made me one of the elite. My whole life may have gone differently had I known that.


Borland would definitely have had a chance with Turbo Pascal on the Apple II (if there had actually been a 6502 version available), for a couple of reasons:

- Speed

- Size

Speed: I used Apple Pascal as well as Turbo Pascal for the same purposes (steering satellite dishes, and also multi-tasking data collection) on dual-CPU Apple II clones (6502, z80). Using Turbo Pascal was a different world w.r.t. speed - way, way faster.

Size: When I developed my multi-tasking data collection system in Apple Pascal I had to use four floppy disk drives, set up for "swapping" (the UCSD/Apple Pascal system had that ability, it could segment itself) simply so that there would be a tiny bit of RAM available for the Apple Pascal editor. No such problem when using Turbo Pascal on the z80 system, with equal amounts of RAM.

That said, UCSD Pascal and Turbo Pascal weren't that dissimilar as far as Pascals were concerned - Wirth's Pascal wasn't very practical, so every useful Pascal version had its own extensions. UCSD and Turbo had some commonality there, which made it easy to port between them.


IIRC the Wizardry games were all written in UCSD Pascal. I wouldn’t be surprised if a Mac port existed, it got ported to damn near every platform out there.


The difference is that there were 128MB quad-CPU Apollo DN10000 systems purchased, delivered, and installed at customer sites.

They were used for things like VLSI design and simulation, so a US$250,000 system was actually worthwhile—especially since it and other Apollo systems could share their resources with each other on the local network.

