Apple's First Public Demonstration of the Mac, Unseen Since 1984 [video] (time.com)
173 points by technologizer on Jan 26, 2014 | 87 comments



This is a full-length, 1 hour 36 minute demo.

The Paint demo is around the 42-minute mark. At one point during the Paint demonstration I heard someone in the background say "that's incredible". That's how incredible and shocking people found it when the computer was no longer just a terminal, but had a mouse and pull-down menus.

MacWrite (around the 51-minute mark) was also very simple yet efficient. For example, once you entered the format mode, the line-spacing options were right at the top and easy to click. Then I thought, "geesh, we can't even fit everything on the screen with all the buttons in Word unless we hide them in a drop-down list." I always have trouble locating the line-spacing option in OpenOffice or Word. Having used Word 2003, I'd rather go back to 2003 and stay there.

What have we done to our software? The tools we use are becoming so complex.

History note: being the first Macintosh, this was an expensive piece of equipment. Sales didn't go very well because of the price...

Here is the teardown of the 128K Mac: http://www.ifixit.com/Teardown/Macintosh+128K+Teardown/21422


I liked how they reacted to the guy asking questions about MacWrite. It showed both what the mentality and expectations around software were at the time, and what a step forward the Macintosh was by comparison.

He first asked whether it was possible to center only part of the document instead of the whole document as demoed, and then he asked whether select-and-replace only worked for replacing text of equal length. Both questions were followed by laughter at what I assume was an on-screen demonstration that neither was a problem.


Whenever I see those classic demos, like Engelbart's in the '60s or the first Mac, I feel a bit ashamed for our profession. Computer software really doesn't seem to have developed much in the last 50 years. We have turned working academic prototypes and concepts from back then into products, and not even to the full extent they were originally conceived. Other than that, everything seems to have gone in circles. The minicomputer, the PC, the LAN, the Web, mobile devices - they all brought cool new hardware capabilities, but when it comes to software it was just reinventing the wheel again. The hardware has usually dictated the way the device is operated instead of the other way round. The Mac and the iPhone have been notable exceptions to this.


While I agree with you, we should view it this way: we started out with nothing, so any progress made back then would have seemed incredible. Now we have made so much progress that we are mostly fine-tuning things, and there aren't many surprising, revolutionary ideas out there. And because there are so many tools to turn people's ideas into reality (which increases competition), there is little room for surprise. One truly surprising technology where we are finally starting to make big progress is 3D printing. It has been around for many years, but we are only now really starting to use it...


> What have we done to our software? The tools we use are becoming so complex.

You'd like GNOME. They remove a feature every release.


At the ~3 minute mark, Jobs says, "Apple and IBM emerge as the industry's strongest competitors. [...] It appears IBM wants it all. Apple is perceived to be the only hope. Dealers initially welcoming to IBM now fear an IBM-dominated future. They're increasingly turning back to Apple as the only force that can ensure their future freedom."

It's interesting how the future played out: users have rejected the notion that freedom is valuable. Perhaps something else was meant by "freedom" in those days, but it's hard to imagine something less free than the app store.


"it's hard to imagine something less free than the app store"

I would argue that if you are not safe you are not free.

The vast majority of App Store users are non-technical and therefore open to scams, viruses and malware. The walled garden may take some freedoms away from the highly technical, but it pays back freedom in spades to 'normal' people. (Plus the highly technical can always choose to leave the walled garden if they want to, either by jailbreaking or by buying an open device.)

We geeks have only ourselves to blame — if we won't police ourselves someone else will have to police us. We gave these freedoms away by using our talents to cheat and scam our customers.

We're not the first to need it, of course; if it wasn't for Victorian scammers using suspect weights & measures and adulterating flour with dust and the like, there'd be no need for Trading Standards Officers.

Trading Standards Officers give us all the freedom to buy food & drink safely, and I don't believe anyone would truly want to return to a world without them.


> I would argue that if you are not safe you are not free. The vast majority of App Store users are non-technical and therefore open to scams, viruses and malware.

That would make great sense, if there was a way for technical users to opt out of the walled garden. But it's the very opposite -- Apple spends a lot of time and money making it harder to jailbreak the phone, in ways that have nothing to do with defending against malware. If they do their best to make it impossible to opt out, then it's not about protecting your safety. It's about protecting the business model.

There's an argument for protecting the business model, of course -- making jailbreaking more difficult also makes piracy more difficult. The App Store has funneled a ton of money to developers, which in turn makes for better apps for everyone. But the notion that Apple has gone to absurd lengths to prevent jailbreaking because ... "We [developers] gave these freedoms away by using our talents to cheat and scam our customers"? No. That's not how freedom works. This is about money, plain and simple.


To play devil's advocate: do you think that, in an alternative universe where Apple products had built-in, easy-to-use native support for jailbreaking at the push of a button, a large malware outbreak that only affected jailbroken devices wouldn't be blamed on Apple anyway?

Apple has other motivations to maintain their walled garden, but their goals of controlling the whole experience also extend to things like that. They want there to be no question that any Apple device is "safe", regardless of legitimate user configuration.

And as long as users at large aren't demanding general-purpose computers, who can really blame them?


Well, we have millions and millions of people using Windows, and in my experience when viruses and malware affect them, they usually blame themselves more than Microsoft. So, no, I don't think they'd put the blame on Apple.

And even if that had been the case, remember the Antennagate? First they told their customers to hold the phone differently, then they said it was a software problem, and finally they said the problem was the antenna, but that all smartphones had a similar problem (which was a lie). So, in that case I would have expected Steve Jobs' reality distortion field at full power, blaming everyone except Apple.


    > they usually blame themselves more than Microsoft.
Did you not see how, back with the Mac vs. PC ads, Apple very much played up the whole aspect of Windows PCs being vulnerable to viruses? https://www.youtube.com/watch?v=M3Z386vXrt4

    > remember the Antennagate
I'm not saying Apple isn't capable of PR bullshit. I'm just saying that I can see why they don't see the benefit of allowing any kind of fragmentation on their platform.


I would feel safer buying a second-hand iPhone than buying an Android. Sure, the iPhone can also be compromised, but it is less likely.

In any case, it's not Apple's responsibility to make open hardware, since open alternatives already exist. It's not economically viable for a company to support unintended uses of its products.


> Apple spends a lot of time and money making it harder to jailbreak the phone, in ways that have nothing to do with defending against malware.

I doubt the point matters much, but for the specific case of preventing iOS jailbreaking (as opposed to, say, the iPodhash crap), Apple seems to be pretty lax when malware isn't being enabled. For example, after a jailbreak is released, Apple often doesn't patch the vulnerabilities until an update is rolled out for some separate reason - with the exception of the jailbreaks that remotely exploited the device via Safari (as opposed to requiring physical access to the device), which were patched quickly.

Of course, I would prefer if it were not necessary to use a vulnerability in the first place.


>I would argue that if you are not safe you are not free.

Sounds like every politician who tries to justify the current methods of the wars against terrorism, against drugs, against pedophilia and the general move towards censorship and surveillance.

Look, safety is good, but if you believe in freedom, you should leave the choice to the user. Just like the root user on Linux and Android, I should be able to unlock superuser abilities on iOS.

The same applies in real life. There should indeed be certifications for food and drinks, but it's not a reason to remove rat poison or bleach and ammonia from supermarkets. Just put the proper notices and safeties.

Choice is key.


How about their choice to develop the software how they like, and your choice whether to buy it or not!

They are not a monopoly. Other than patents they haven't been given any special privileges to get where they are today. As long as YOU as a consumer do in fact have a choice, and you DO, you don't get to tell Apple how to build their products.

I think Apple is keenly aware of this, and they don't want to be a monopoly any more than anyone else does, because having a minority market share gives them the freedom to bundle and integrate software in ways they would otherwise be prohibited from doing.


An "App Store" is not a requirement for having a sandboxed environment.

Consider these examples:

1. Your web browser.

2. Java and .NET programs can be run in different trust modes that only allow certain functions from the standard library (and obviously disallow external libraries). Today this is mostly used on servers to isolate websites sharing the same machine, but with a better user interface it could also have been used to secure desktop programs.

3. User permission levels in operating systems (see the sketch below).
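
To make item 3 concrete, here's a rough sketch (just an illustration, nothing platform-specific): on a Unix-like system a privileged launcher can drop to an ordinary user before running untrusted code. It assumes it is started as root, and uid/gid 65534 ("nobody" on many systems) is only an example choice.

    # Rough illustration of sandboxing via OS user permission levels (item 3).
    # Assumes a Unix-like system and that this launcher runs as root;
    # uid/gid 65534 ("nobody" on many systems) is just an example choice.
    import os
    import subprocess

    UNPRIVILEGED_UID = 65534
    UNPRIVILEGED_GID = 65534

    def drop_privileges():
        os.setgid(UNPRIVILEGED_GID)  # drop the group first, then the user
        os.setuid(UNPRIVILEGED_UID)

    def run_untrusted(cmd):
        # The child process keeps only the unprivileged user's permissions.
        return subprocess.run(cmd, preexec_fn=drop_privileges)

    run_untrusted(["id"])  # prints the unprivileged uid/gid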


> They who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.

- Benjamin Franklin

With a population lacking this mindset, no wonder this country is going to the dogs.


Never mind the dubiousness of equating freedom to modify a particular gadget with fundamental civil liberties (there is certainly an analogy, but the parent referred to the former and you leapt at the latter)... the parent doesn't even live in the United States.


the parent doesn't even live in the United States

Nor did Benjamin Franklin for a large part of his life :)

I won't ask how you know…


I checked your post history after guessing based on the phrase "Trading Standards Officers". :)


  "Only a virtuous people are capable of freedom.
  As nations become corrupt and vicious,
  they have more need of masters."
– Benjamin Franklin


What a facile view of the world.

EDIT: Not Franklin's!


It's clear Steve meant "freedom" in the sense of freedom of choice. If the industry were dominated by a single vendor, you would have no freedom - you would have to buy whatever that vendor was selling.

Steve got the story right, but the characters and timeline wrong. It wasn't IBM in the 80s, but Microsoft in the late 90s that threatened to dominate personal computing.

I remember these as rather dark times. Microsoft engaged in shady practices like per-processor licensing, which required OEMs to pay Microsoft for every machine sold, even if it didn't have Windows installed. Upstarts like Be couldn't get any traction, even with their technically superior OS. It all culminated in the US vs. Microsoft antitrust trial (which ended up going nowhere).

During this period, Apple was very much on the ropes ("beleaguered", they said), but Mac OS nevertheless remained a viable option for the average consumer - the only real alternative to Windows. In this way, Apple provided users with a mote of freedom.


MS pulled some shady shit, no doubt about it, but I don't think MS has ever been the serious threat to freedom that other companies have. The PC has always been an incredibly open platform. That openness was taken advantage of by many competing operating systems, especially linux, over the years. And remember that the "PC" in this case extends to the server market as well. It's always been possible to build PCs from parts and put whatever OS you wanted on them, and MS has always been a part of that phenomenon while Apple and other companies preferred much more closed ecosystems. We should remember that, and give credit where it's due.


In the late nineties, some of the Linux distributions were also already viable. E.g. I remember introducing SUSE Linux at my high school in ~1998, and back then it could be installed by a reasonably technical person, which was not that much different from Windows 9x, given all the work that was usually required to install device drivers, etc.

We also had some desktops with KDE and most students had no trouble doing their usual work (web browsing, text processing, e-mailing, etc.).

I wonder if things have improved that much. Yes, we now have Android and iOS, which have a limited scope (portable devices). But on the desktop and laptop it's still Windows. The Mac is only an option for the affluent contingent of Western countries. Meanwhile, the Linux desktop has splintered into a market with no clear leaders anymore - Red Hat abandoned the desktop, SUSE is not really visible anymore, and Canonical is working hard to get rid of its user community. That, plus half a dozen desktops (GNOME, Unity, MATE, Cinnamon, KDE, Xfce). Frankly, it's a mess! What's perhaps more damaging, though, is that 'the hope' is gone. There was a time when many people believed the Linux desktop would eventually take over. Now nobody believes there will ever be a year of Linux on the desktop, and a large number of developers have left for OS X.

The only spark of hope is web applications, since they are platform-independent and there is usually some competition. However, you never purchase or own an actual license. So the vendor could go bankrupt, cancel development, remove crucial features, etc. You can never decide to stick with an older version, since you don't own a license. Moreover, you usually can't get a backup of your data in any easy way, let alone convert your data for a competitor's web app.

tl;dr: mobile devices are not good computers yet; on general-purpose machines one of the best competitors fragmented itself into oblivion; with platform-independent web apps you are dependent on the whims of the vendor and locked in.

To end on a positive note: at least Android is mostly open, has captured most of the smartphone market, and will likely do the same on tablets. It's really good to see the work of the CyanogenMod community, etc.

Edit: Downvoters: Is it because you believe that the Linux desktop hasn't fragmented? Do you disagree on the downsides of web applications?


"In the late nineties, some of the Linux distributions were also already viable. E.g. I remember introducing SUSE Linux at my high school in ~1998, and back then it could be installed by a reasonably technical person. Which was not that much different than Windows 9x, given all the work that was usually required to install device drivers, etc."

I strongly disagree with this.

In 1998 Linux was not easy to use. Sure, KDE etc. were nice, but getting to a minimally workable system was hard.

I know, I did that, coincidentally in 1998 as well.

Getting the video drivers to work, very difficult. Find the right driver, install, etc

Configuring Dialup, hard. Using dialup, hard as well.

After that, sure, it was nice to startup KDE and browse the web.

Linux, as a "Unix clone", was good if you had standard hardware and a permanent configuration. Then you could justify setting up wvdial the way it was back then (and probably still is today).


>Linux was not easy to use

True, it took me 5 days to download MKLinux correctly via dial-up. But it worked. I went back to OS 9, but soon I had built a phalanx of PCs as servers and for friends, and while I used a Mac as my main computer (still do to this day), I fell in love with Linux. That's a pretty powerful experience for a girl growing up on the edge of civilization. It's thanks to Linux that I consider myself a child akin to Stephenson's Diamond Age, but thanks to the Mac's friendly introduction to computing that I could get started.


It probably worked better because of the much smaller hardware variation among Macintoshes. Also, it was sponsored by Apple.

And for servers it worked great, the problem was the "user friendly" part of it.


Right - if MKLinux hadn't run, I might have discarded any interest in Linux for a few years, never learnt how to build my own PC boxes, and never stuck with Red Hat (which I think is what I used back then). After a successful initial experience I was OK dealing with bad display drivers et al. That's kind of my point, and why I'm really glad I started with a Mac: I had no one around to guide me, and powered by Mosaic (soon Netscape) and growing access to mailing lists, I was on my way. I got more than a few non-techy friends running Linux shortly thereafter, given I could build them a machine cheaper than at the store, and most of us went on to have successful technical careers in some form or another (whether in the arts, architecture or whatever). That period of '96-2000 was, for me, made possible by the convergence of a few wonderful things that for the first time in history made it possible for someone otherwise uneducated in technology to access, and help make, the world.

I had looked at taking a few CS classes at university, but the overwhelming perceived hostility towards women, or anyone not clearly 'belonging', at my science department at the time kept me away. So you can imagine just how much I love the Internet, and the Mac, and Linux. Times have changed and that's cool - I've run some coding classes since then, primarily for women, and everything they need is at their fingertips, and my university's CS and science departments are far more modern and getting a healthy balance of women and minorities. I'm very aware that my jump into CS was made possible by a very special convergence, freedom, and surge of information.


Came here to say just what you did. From 1998 to 2000+ I installed literally hundreds of different Linux OSes hoping that "this one would be the one". Sadly, I spent even more countless hours finding and debugging drivers and generally failing in my attempt to find something to replace, or even run alongside, Windows (every geek I knew dual-booted another OS).

So yes, on paper Linux had/has everything Windows does, but in practice it was another story. That arguably remains the case even today, which is shocking considering how long it's been.

Most of it comes down to drivers and hardware-industry support, which won't happen until the other dominant players (Windows, OS X) stumble.


I've had no problems installing Linux since 1993, except when the hardware itself was problematic (not supported).

In the early days, I did fine with VESA support. Then I actively chose hardware that I knew would work basically out of the box. I had minor problems, never a showstopper. It could be a bit cumbersome at times, that's true.

Nowadays it's a lot easier to install Linux on most computers than Windows. If you want to install a version of Windows that isn't specifically supported by the hardware, you're SOL.

Right now, the biggest problem is GPU support from AMD/ATI and nVidia, which happen to be the leaders when it comes to high-end graphics. You're either left with a partially working open-source solution or an often glitchy proprietary one. Intel works fine, but it's just for the lower end.


> Nowadays it's a lot easier to install Linux on most computers than Windows. If you want to install a version of Windows that isn't specifically supported by the hardware, you're SOL.

Please substantiate this. Aside from the Macintosh EFI mess, Windows will install on pretty much anything and has fallback drivers for just about everything. Its backwards support is impressive, too; my one single-boot machine is a Win7 Thinkpad A22m with a Pentium III in it.


I've had a number of laptops with buggy, unsupported drivers for XP. I don't have any of them with me right now, but this happened to me with a number of relatively common models from HP and Dell when installing XP over Vista (I can't recall if Win 7 was available at the time). The company had not moved its support package images to Vista, for compatibility and security reasons - a telecom company with funky email servers and a policy of installing software as old and boring as possible.

There simply was no support. The solution HP, Dell and MS offered was to "get with the times and migrate", which wasn't possible at the time. Alternatively, buy another computer with specific XP support. Complaints were met with an "RTF EULA", which stated that no support was guaranteed for any OS other than the preinstalled one.

The solution was to buy new laptops with WinXP support (which ended up costing more, since the company specifically needed serial ports, and not many machines offered both serial ports and WinXP support, limiting the choice).

Ubuntu installed flawlessly, with absolutely every feature the hardware supported. (This happened when we basically had to either find some new use for the machines or throw them away.) It was much quicker than XP too, and objectively simpler, as it would choose reasonable defaults instead of asking esoteric questions, making us call phone numbers, or insisting we register anything. The single thing XP has on Ubuntu or Mint regarding installation is that you often don't have to do it yourself. In some respects it's worse with newer MS systems, as their DRM has "intensified" and OEMs ship more resilient bloatware/rootkits.

This was a small division within a biggish company, so the process was followed by a few of us, including me, although I wasn't in IT.


"Ubuntu installed flawlessly, with absolutely every feature the hardware supported."

But that's not a really fair comparison. Windows Vista was GA in 2007. You wanted to install a 2001 operating system on >= 2007 hardware. If you used a 2001 Linux distribution on 2007 hardware, it wouldn't work either. Quite possibly even worse than Windows XP.

Your parent was arguing in the other direction. E.g. Windows 7 will install fine on most pre-Windows 7 hardware. In fact, on my last two non-Macs I didn't have to fetch any drivers to install Windows 7 or Windows 8. It all worked out of the box.

Of course, Linux is even better with respect to backwards compatibility. A light distribution, such as Slackware, will probably work fine on a 1998-era Pentium II. Both in that most of the hardware will still be supported and that it will run with an acceptable speed. The same won't be true for Windows 7 or 8.


At the time Windows XP was still supposedly supported.


Not sure about the downvotes, but as far as influencing who buys personal computing devices goes, almost no one cares about the above. Cloud stuff disappears? You just open an account somewhere else, and mobile devices are fine computers/computing devices. Almost no one non-technical is buying laptops. And those are the same people who don't care about 'your data', data privacy and all that. They read about it in the paper, have no clue how it affects them, and then start up Facebook and forget about it. And that's the 30+ crowd. Kids under 20 really don't care about any of that. My cousins drop their tablet while playing and just open another Gmail account because they can't be bothered to write down the password. And they do everything on those things.

Unless you're talking CAD/CAM, game dev, 3D modelling, movie/music editing, or scientific computing, people are simply not buying laptops, which suggests mobile devices are great computers for 99% of the world. Oh yeah: they don't care about the OS either; they care about what 'others' in their situation have; they buy what their friends have or what looks good when sitting at the hairdresser.


My favorite is the MS OS/2 2.0 fiasco, and keep in mind that OS/2 did not depend on DOS at all.


"Freedom" is just a marketing element for Apple when they were still an underdog in the market. Jobs was never a real proponent of Freedom in computing anyway. The Mac had even its proprietary connectors at the time, and did not use any of the standards available. You were "free" to buy your peripherals only at official Mac resellers.


The Mini-DIN (the main connector on most early Macs) wasn't proprietary. Isn't it still what PCs use to connect the mouse & keyboard?

http://en.wikipedia.org/wiki/Mini-DIN_connector

Even the 128k Mac had mostly standards-based connectors.

Mouse port; one eight-bit keyboard bus, 300 baud, RJ11 connector for the Macintosh Keyboard; two RS-232/RS-422 serial ports, 230.4K baud maximum, DB-9 connector; and a sound port for an external audio amplifier or headphones.

http://applemuseum.bott.org/sections/computers/mac.html


> Even the 128k Mac had mostly standards-based connectors...DB-9 connector...

By the way, it's a DE-9 connector (a DB-9 would be huge with lots of empty space). I know many people say "DB-9" when they mean the smaller connector, but then again many people say "narrowband" as the opposite of "broadband" when they really mean "baseband" -- just trying to make up words to sound sophisticated. Yes, even Apple marketing...


Because most people use broadband in a place where wideband should be used.

In my little social circle, any discussion that becomes overly detail-oriented (focusing on details over the core argument) has been dubbed a 'DE-9' discussion.


The early Macs used ADB (Apple Desktop Bus), which uses a 4-pin mini-DIN as its physical connector. PCs use PS/2, which uses a 6-pin mini-DIN. They're not compatible at all.


True, but DIN is about as standard as it gets. http://en.wikipedia.org/wiki/Mini-din


But DIN is just the physical connector.


So what? Being a standard connector made it easy to repair the connector. What more would you want? Implement the ADB ASIC? Why on earth would you have done that instead of licensing from Apple?


Because that's the whole point of an open protocol -- hackers can design new interfaces not imagined by the original designer.

That's why API design is so important, and so difficult. By analogy, the DIN connector is simply the calling convention and ADB the actual function calls.


I'm not sure I follow you. ADB was ahead of its time, and a precursor to USB, but nobody's claiming it was an open protocol. What was important from an electrical engineer's POV was that the connector could be replaced using off-the-shelf parts. That was because the connector was standard. If you wanted to develop for Apple, you had to own a company and strike a deal with Apple. Not so easy, but certainly possible.

Now, if you just wanted to hack peripheral hardware at the time, you would have invested in a lot cheaper (expendable) computer with a standard serial / parallel port, so I don't really understand what you are saying.


While I wouldn't suggest that the App Store is free, a lot of people take issue with it while ignoring many other similar stores.

For example, the online shop on the PS3 or any game console is a similar environment, even Steam on the PC is similar. These types of curated software services for specific devices have been around for a long time. No one seemed to take issue with them until the App Store. Why is that?


Until not too long ago, the PS3 Store was a desert. There were some small games, but no blockbuster titles. This has only changed relatively recently, probably because the App Store is so popular.

It is a huge step backward. Before, I could borrow games from friends or sell them. But I think there won't be a big outcry as long as most games are still available on disc.


On the other hand, the price paid for games has dropped massively, especially when indexed against inflation. Partly this is helped by digital distribution, but by creating a more direct relationship between the publisher and the customer, drastically reducing piracy, and eliminating reselling, the price per unit required for a game studio to make a profit becomes really small. I've paid $5 for iOS games that gave me vastly more hours of enjoyment than games I used to pay $50+ for.

For years those who supported the developers were subsidizing those who weren't. This is happening less now, and it's a more efficient market.


That's a good point. I've probably bought more games from small indie developers on iOS over the last few years than PS3 titles. Those indies wouldn't have had access to the PS3 market.


Video game consoles have been closed ecosystems for the longest time. Steam is merely one way of adding games to a PC.

For the iPhone, the App Store is the only way to add applications. No one would care if Apple allowed competing marketplaces.


Yes, but that was exactly my point. Video game consoles have been around for ages, are closed ecosystems, and no one minds. The App Store does the same thing for phones (which were, at the time, pretty closed ecosystems anyway) and everyone gets upset. What is the difference?


Windows Mobile, Palm OS, hell, even later versions of BlackBerry - none of these were closed in terms of app installation. Smartphones were not closed ecosystems; in fact, the point of them was to be open in this way.


I guess I was thinking more of non-smart mobile phones, since that was what most people used before the iPhone. But even so, it doesn't explain why other closed devices have been, and remain, closed with no complaints.

Why do we not express the same discontent over a PS3? Or any other closed environment? I don't think it's because smartphones were "open" before the iPhone.


> Why do we not express the same discontent over a PS3? Or any other closed environment? I don't think it's because smartphones were "open" before the iPhone.

Two reasons:

1. The device that plays games is a lot less important than the ubiquitous pocket computer (and for some only general purpose computer).

2. Games, more-so than other software, benefit from the walled garden / curated approach. Nintendo more or less proved this out decades ago when they entered the console market.


1. But to game developers it is their most important device. I am a developer who would love to be able to test and publish my games on those platforms — but I don't mind that they keep the platform closed, because I feel it adds value to the games currently on there. (Developers are the main group concerned with the issue of open vs. closed.)

2. I would argue that your point also applies to the best apps on the App Store: the super-polished, simple and focused experiences that people love. They are hardly complex software; they are just well thought out design-wise. By taking a curated approach you prevent the crud from building up.

I see it as similar to a fashionable clothing store — they buy all their clothes for cheap just like everyone else. The value they offer is in their curation and taste.

(In my mind Apple is not strict enough with their curation, which dilutes the value of their store. But I realise I am in the minority with this opinion.)


I think Microsoft are partially to blame for the existence of these app stores. By ignoring package management they created an environment where software was hard to manage and frustrating to install for users.

I don't think I've ever met anyone over the age of 40, from a non-technical background, that has ever checked for service updates to a piece of non-Microsoft software, downloaded it, and then installed it. To this day most people don't bother beyond Windows Updates (and only install those because Windows screams at them about it).

I've met plenty of Windows 8 users who haven't noticed 8.1 is out.


People took issue mainly because the App Store is the only way to get apps on iOS devices. And they took even more issue with the often random and vague rejections.


My example of the PS3 was the same as iOS — Sony is the only way to get games on the PS3. People don't seem to have an issue with this like they have an issue with the App Store. In fact, the same thing applies to the NES. No one seemed to complain until it was the App Store. Why is the expectation different?

I have mostly found the rejections I encountered from Apple to be good things (I've had about 8 - 10 app rejections over the years). They have always led to me improving the app for the customer (e.g., include battery warnings, make text more visible, ensure app complies with child privacy laws, don't include placeholder artwork, and so on).

The App Store's curation is one of the things that increases its value, even though developers hate it (and perhaps feel slightly disrespected by it?). I'd argue that it's been a good thing.


That's really a definition problem. For most people I know outside of IT, the App Store is the magical wonderland that frees them to do things they previously could not - or did not even know were possible.

When most people run into what I find practically unfree (the inability to access sandboxed data), they shrug it off as just one of those things or try looking for another app. By the time a critical mass of people find it ethically unfree, I suspect the App Store will simply have vanished into the commercial swamps.


> Perhaps something else was meant by "freedom" in those days

Yes, it did mean something else. This 30-year-old quote has nothing to do with App Store policies. IBM had a significant presence in every single piece of information technology in those days: mainframes, typewriters, satellite communication, telecom equipment, etc. When they failed to move quickly on microcomputers / personal computers, there was a lot of hope that at least one other major player could rise up and challenge IBM's ability to totally dominate every part of information technology. After they finally released the IBM PC, they started to attack companies like Compaq who were making IBM-compatible computers. For a period in the early '80s it looked as though IBM was going to totally dominate everything. For this quote to apply to Apple we would have to invent an alternative reality - one where about 85-90% of all information technology in use today has some direct dependency on Apple software/technologies/hardware/services. They don't even have that type of dominance in mobile today, much less the entire industry.


Pretty sharp remark, thanks for shining light on that.


Might as well work out what $2495.00 in 1984 equates to today...

I used this site:

http://www.usinflationcalculator.com

$2495 in 1984 inflates to a hard-to-swallow $5,594.11, an increase of 124.2%.

Being a limey, this is hard for me to comprehend, so I looked at this page:

http://research.stlouisfed.org/fred2/data/EXUSUK.txt

There were $1.4076 per £1 in January 1984, which roughly equates to £0.71 per $1. So if I wanted to buy a brand new Mac 30 years ago, I would have had to cough up around £3974 in today's money.

A quick browse puts iMac prices at around £1300 (~$2135; the dollar was worth more against the pound in 1984 than it is today).

My father's iMac boasts a 2 GHz processor. How many of your vital organs would you have to sell on eBay to buy 2 GHz of kick in 1984?

Here is a picture of a Cray X-MP:

http://images.tribe.net/tribe/upload/photo/8c4/74c/8c474c54-...

A Cray X-MP in 1984 would cost about $15,000,000 for 105 MHz, so I would require 19 Crays, costing a total of $285,000,000. Taking inflation into account, today that would be a bank-breaking $639,006,208.

I'm trying to remember the source, but I think the CDC 7600 packed 512 KB of memory back in 1969. The Mac had a comparable amount but cost about 1,000 times less.
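
Out of curiosity, here's the arithmetic above as a quick Python sketch; all the figures are simply the ones quoted in this comment (usinflationcalculator.com and FRED), not fresh data:

    # Quick sketch of the arithmetic above; figures are the ones quoted here.
    price_1984_usd = 2495.00
    price_today_usd = 5594.11                  # usinflationcalculator.com figure
    inflation_factor = price_today_usd / price_1984_usd   # ~2.24, i.e. +124.2%

    usd_per_gbp_1984 = 1.4076                  # January 1984 rate (FRED EXUSUK)
    price_today_gbp = price_today_usd / usd_per_gbp_1984  # ~GBP 3,974

    cray_mhz, imac_mhz = 105, 2000             # Cray X-MP vs. a ~2 GHz iMac
    crays_needed = round(imac_mhz / cray_mhz)  # ~19
    cray_cost_today = crays_needed * 15_000_000 * inflation_factor  # ~$639M

    print(f"Mac 128K today: ${price_today_usd:,.2f} (~GBP {price_today_gbp:,.0f})")
    print(f"{crays_needed} Cray X-MPs today: ${cray_cost_today:,.0f}")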


The price of such a thing made my mind boggle as well. It makes me wonder what equivalent devices exist today in the '10x what is affordable' price bracket which we will all be enjoying and taking for granted in 30 years.

Any guesses?


> Any guesses?

I'm guessing a small proportion of the Earth's population will be cyborgs, but I am a Ghost In The Shell fan.

Thought I'd work out how much mass in UK copper coins would be required to buy 19 Crays ...

Just over 72,000 tons; you'd need about 10 Class 37 freight trains to haul that.


Mac Pro, 6-Core... AU $5,299

In the box:

* Mac Pro * Power cord

Can not afford.


Steve demoing NeXTSTEP 3.0 in 1992. http://www.youtube.com/watch?v=gveTy4EmNyk

Fascinating to see the Mac heritage in NeXTSTEP carry through the acquisition and transformation into OSX.

It's hard to remember, but in 1992 Microsoft had just released Windows 3.1 - NeXT was light-years ahead of Microsoft.


The thing that amazes me about Jobs is how he could present in such a way that it seemed so off-the-cuff and at the same time so polished. Anyone who has had to do presentations like these can learn a lot from watching him at work.

In the NeXT video, for example, he's very good at tying everything together into a cohesive whole, and speaking to his audience in terms they'll understand. He uses practically no filler words ("uh", "um", "basically", etc.).

Here's another presentation he did in 1980 that shows he had these skills almost from the very beginning:

http://www.youtube.com/watch?v=0lvMgMrNDlg


I think an advantage he had was time. Bill Gates did similarly well in his presentations from that era.

Unfortunately we don't have the chance to do 35-minute presentations anymore. Now everyone has to squeeze their pitch into 30-second, information-free commercials :)

How do you get meaningful ideas across if no one has the time to listen? Only the most superficial ideas can be put across in an elevator pitch.


My favorite part is Bill Atkinson's MacPaint demo at 42 minutes. I laughed seeing him mention how everyone thinks the paint bucket tool looks like a graduation cap - I always thought the same thing using it as a kid and never saw it as a paint bucket.


My favorite part is the demo - in the stricter sense of "demo": http://en.wikipedia.org/wiki/Demo_%28computer_programming%29


Questions that the audience asked:

- Is there a version of BASIC?

- Is there a version of Smalltalk?

- After you become an expert, does the mouse become a feature or a handicap?

I really take the ability to write in almost any mainstream programming language on my Mac for granted.


> I really take the ability to write in almost any mainstream programming language on my Mac for granted.

Thank Stallman


Why? NeXTSTEP was a Unix and had multiple programming languages available to it -- in 1989, without anything FSF-related. Apple ships GCC with OS X, but it's pretty jokey to think they wouldn't just have shipped BSD's compiler instead (and now Clang, which exists in no small part because Stallman and company decided it was a great idea to make extending the compiler hard for ideological reasons).


Stop trolling. NeXTSTEP's Objective-C compiler was based on GCC.


This is very, very interesting.

The first 15 minutes or so are more or less what (most of us) already know. Skip that.

After that, they show the real stuff: the manufacture of the Macintosh, and a lot of other things.

The deeper stuff. The (now) laughable things. A successful company, yes, but still very vulnerable. A less experienced Steve Jobs.


I still remember when my dad brought home a Mac. He got rid of it and later lamented that he had. I found him a Mac Plus and gave it to him for his 64th birthday this fall. The thing still works. Great machine.


A few minutes in, you see that famous TV commercial for the Mac. What I heard was that they only had to show it once, during the ad break of a football game.


It was also broadcast in 1983, in Twin Falls, Idaho, and was broadcast a few more times after the 1984 Super Bowl (http://mentalfloss.com/article/29867/how-apples-1984-ad-was-..., http://mentalfloss.com/article/29911/true-story-apples-1984-...)


Thank you for correcting me.


Live in public, I guess - those two demonstrations were very closed and weren't publicly available, were they?


Steve Wozniak seems to have been omitted from the credits in the slideshow that starts at about 20:40, and yet he appears in it at about 29:28. I wonder why he wasn't listed.


He wasn't on the Macintosh team.


He comes on stage later (Jobs says that Woz had just flown in from Seattle) and answers an audience member's question about "the future of the Apple II" at 1:10:45.


FYI: if you don't do Flash, youtube-dl (available via Homebrew) will download this video without problems.
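
youtube-dl can also be driven from Python if you'd rather script it; a minimal sketch using its embedding API (the URL is a placeholder, not the actual video link):

    # Minimal sketch of youtube-dl's Python embedding API (pip install youtube_dl).
    # The URL below is a placeholder; substitute the actual video link.
    from youtube_dl import YoutubeDL

    options = {"outtmpl": "%(title)s.%(ext)s"}  # save as <title>.<ext>
    with YoutubeDL(options) as ydl:
        ydl.download(["https://www.youtube.com/watch?v=PLACEHOLDER"])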


Woz is on stage starting at 1:04 in the video.




