The whole interview was fascinating, but what I found most interesting was when he was talking about a "post PC world" (as in post desktop/laptop). He envisioned a world in which 95% of casual computing was done on a tablet or phone.
He said PCs would still be there for work, but most consumers won't own one, just like all vehicles used to be trucks, until people moved to cities and switched to cars, and now some people use trucks but most people use cars.
The other really interesting part was when he talked about the need for paid journalism, and how he "doesn't want to live in a society that's just bloggers". Even a decade ago he could see the danger in most people getting their news from un-curated, un-vetted content.
Two years ago I went from Macbooks to Win10 laptops with WSL for my professional needs - a setup that I now actually prefer, mainly thanks to having the same package manager as on the target server. A couple of weeks ago I added an iPad Pro with pen and Magic Keyboard, for private use and note taking / concept drawing. I gotta say, I think Jobs’ vision has finally come true with this iteration, and more so with the next version of iOS. For private use by someone without the time and/or motivation to tinker, an iPad today covers 98% of use cases, and in fact it’s an improvement over PCs and Macs for those.
Win10 and WSL is a pretty compelling setup. It's interesting that Apple said they were working with Docker to make sure MacOS on ARM is a first class host for Linux Docker containers.
Right now it looks like MS has managed to steal a march on Apple in this area, but that Apple knows this and is working on it. Interesting times.
What do you mean? Docker for Mac uses xhyve, which uses Hypervisor.framework (Apple's rough equivalent of KVM). WSL 1 had a Wine-style syscall shim, but they switched to virtualization too for WSL 2.
Right, and xhyve, Hypervisor.framework, and KVM are all frameworks for creating and running virtual machines. In contrast, Docker for Linux runs using a native container mechanism, without using a VM at all. macOS doesn't support containers, but it could.
The general case is developing software on a desktop OS, in containers that are intended to be deployed on Linux boxes. Since the whole point of containers is that the container runs against the host OS kernel directly, that is clearly not really possible on a desktop that isn't running Linux, as there isn't a Linux kernel there to run against. There are two ways past this: emulate a Linux kernel, or run a real Linux kernel in virtualisation for the container to run against.
WSL 1 chose the first approach, so the container thought it was running against a Linux kernel but it was actually a Windows kernel plus a syscall shim. That's all fine and dandy in theory, but not running against real Linux means there was always the risk of different behaviour on Windows versus on the Linux deployment host. In WSL 2 Microsoft switched to running a real Linux kernel in a hypervisor, which is basically the same approach as on the Mac.
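A quick way to see the shared-kernel point for yourself; a minimal sketch, assuming the docker CLI is installed and can pull the alpine image:

```python
# Compare the kernel the host reports with the kernel a container reports.
# On a Linux host the two match, because the container runs directly against
# the host kernel. On Docker for Mac (or WSL 2) they differ, because the
# container is really running inside a lightweight Linux VM.
import platform
import subprocess

host_kernel = platform.release()
container_kernel = subprocess.run(
    ["docker", "run", "--rm", "alpine", "uname", "-r"],
    capture_output=True, text=True, check=True,
).stdout.strip()

print("host kernel:     ", host_kernel)
print("container kernel:", container_kernel)
```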
> xhyve, Hypervisor.framework, and kvm are all frameworks for creating and running virtual machines
> In contrast, Docker for linux runs using a native container mechanism without using a VM at all
Thanks for pointing this out. As I'm slowly getting familiar with container-based workflows and underlying technology, your statement helped clear up my confusion about the essential differences between containers and VMs.
I found an informative table in Microsoft's documentation for "Containers on Windows", that compares and contrasts the two approaches:
This helped me understand how Docker for Mac using xhyve differs from Docker on Linux, where native containers are a much thinner layer sharing the host kernel.
Aside from the container/VM architecture, a performance bottleneck of Docker for Mac is (apparently) how mounted volumes work at the file system level.
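A rough way to check this for oneself, as a sketch (assuming the docker CLI, the alpine image, and write access to the current directory; the 1000-write loop is arbitrary):

```python
# Time a burst of small writes into a bind-mounted host directory from inside
# a container. On Linux the bind mount is essentially native; on Docker for
# Mac the same writes go through the osxfs / gRPC-FUSE file sharing layer,
# which is where the slowdown shows up.
import os
import subprocess
import time

cmd = [
    "docker", "run", "--rm",
    "-v", f"{os.getcwd()}:/work",  # bind-mount the current directory
    "alpine", "sh", "-c",
    "for i in $(seq 1 1000); do echo line >> /work/bench.txt; done",
]

start = time.time()
subprocess.run(cmd, check=True)
print(f"1000 small appends through the bind mount took {time.time() - start:.2f}s")
```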
---
> macOS doesn't support containers, but it could
According to a random Reddit thread (so, with a grain of salt) I found on this topic:
> Using a shared kernel container (via cgroups, for example) isn't feasible with macOS because of licensing restrictions and because the Linux syscall interface isn't compatible with macOS.
Which seems to imply that there's some fundamental reason why Docker for Mac must rely on VMs, and that macOS is not (yet?) capable of providing containers that are as native as on Linux.
They also mention how the file system on macOS has design limitations that make containers more difficult.
> The net result is that performance is probably going to be near-native for light workloads, but macOS depends on other fundamental technologies like APFS which aren't designed for datacenter type workloads.
---
EDIT: Aah, right, this is why a comment higher up mentioned:
> Apple said they were working with Docker to make sure MacOS on ARM is a first class host for Linux Docker containers.
I thought docker on Mac used a virtual machine + disk resulting in slower file io? As opposed to running containers directly against the host kernel / file system like Linux.
There's a need for it, but possibly not a market for it. I think it's the same problem he identified for TV. Everybody knows what we have is broken, but there's no go-to-market strategy, so what we have is a messy marginal market for set-top boxes, none of which can get solid traction. Similarly for high-value news publishing there's no clear go-to-market strategy for a consolidated service like that, so what we have is a messy market for news publishers.
But then maybe that's actually for the best. A single unified consolidated news publishing platform could have awful consequences for diversity of representation.
Perhaps it should be centrally funded but each individual gets to choose where to spend it. You get a voucher and it's up to you whether you spend it at Breitbart or the NY Times.
Democracy needs a free press, and not just free but one that is motivated by high-quality, fact-based news. The high-quality part is subjective, but maybe achievable via a peer review group like a Journalist Guild type awards system. Publications would be required to maintain a certain standard of conduct to access the funds awarded by the guild. Free of advertising, to avoid clickbait "news".
Perhaps such a model could co-exist with a commercial element. Access to premium services, to keep my proposal vague and aspirational.
I don't get to choose the BBC; if I give up the TV license then I can't watch ITV or Channel 4.
It's also a huge political football. It does lean slightly to the left, but it doesn't matter which political party you ask, they always complain it's biased against them. The current government has been waging war on the BBC for 10 years now. If they thought they could, they would shut it down in an instant.
What we really need is something funded centrally but that the government doesn't have the option of interfering with.
"There's a need for it, but possibly not a market for it."
Whenever someone touches on market failures, I'm reminded of how the US government saved the domestic oil producers from themselves. TL;DR: the government imposed regulation and predictability onto the wildcatters, thereby protecting investors and ensuring a stable supply for consumers.
I don't yet know what future marketplaces for journalism will look like.
I do know that markets don't just magically appear. Efficient open markets require laws and regulations, to create and then protect those markets.
I still miss the old Newsstand app [1] that was replaced with the Apple News aggregator in iOS 9. I liked having distinct apps for distinct publications grouped under the specially styled folder.
> Even a decade ago he could see the danger in most people getting their news from un-curated, un-vetted content.
I feel this is also a problem with music. For the last 10 years you've been able to just dump whatever you want on SoundCloud or do a private release on Spotify.
Now, don’t get me wrong, I do not celebrate the label system at all, and maybe that freedom of getting your music out there is worth the cultural cost we pay, but for example what East Coast Hip-Hop had with Stretch & Bobbito, that kind of curation is sorely needed.
These days for curators it’s all about hype and whatever gets you a buck, but as was said in Stretch & Bobbito’s documentary “we’d play stuff that probably only appealed to like a hundred people out there. But it was dope, and dope deserves to be heard”.
On the contrary I think the music industry is way better now. Curation is already handled for us by algorithms like Discover Weekly.
I’ve discovered far more songs and bands through Spotify than I would have in the pre-Spotify era. And many of these songs are ones with just a few thousand views, as alluded to in your comment.
The modern music industry allows for the best of both worlds.
Music curation was done by actual people before, almost every social circle had some kind of DJ or music digger providing new stuff. It's another point where technology had a negative influence on the social fabric of society.
It still is done by actual people, just not as many who do it professionally.
The vast majority of my music recommendations come from friends and acquaintances who share tracks in FB groups, chat, amateur Bluetooth speaker "DJing" at gatherings and so on.
"Hey man, you gotta hear this one" is alive and very well.
It's easy to underestimate how important scenes are in music - clubs, musicians, DJs, producers, promoters, managers, vinyl stores, even musical instrument stores - all cross-networking and building organic scenes which incubate influential styles.
Scenes exist on social media, but they're much more diffuse and for some reason are more likely to be tinged with irony and/or nostalgia. So they seem to be less effective and - ironically - less experimental and creatively diverse.
Amusing. This is still pretty common. In fact, I'd say there are more of these than before. The change you perceive is more likely to be a characteristic of you and your social circle (perhaps you are older, perhaps the friend who would have done that is now taking care of his firstborn, perhaps you've outgrown those groups).
You'll see this same thing in so many different places. People will say "It's not the same now. Back then it was like X". Most commonly, it's more like X than it is different, but the person saying it has changed.
The "problem" is the oversaturation with good enough or listenable stuff, meaning that careful curation by people doesn't have the social impact it had. Curators are middlemen and become obsolete similar to traditional media in aggregation theory.
A reaction or evolution from that is that many who were curators or DJ's get into music production, which means that they hang out in music studios of some kind, forming different social connections (e.g. with musicians). It creates a social divide between the people who just consume from algorithms and those who create.
>You'll see this same thing in so many different places.
Because it's an inherent effect of technology. Groups also form because different people bring different things to the table. One might be into music, the other one into cooking, etc.
But cooking isn't enough, it has to be keto & vegan. People move up one level which changes the composition of their social circle.
>People will say "It's not the same now. Back then it was like X". Most commonly, it's more like X than it is different, but the person saying it has changed.
I wish it were like that because I'm a tech geek myself, but thinking more deeply about the topic and the composition of groups has changed my mind. The divide created by technology is real imo because it isolates groups.
I find I have a better experience discovering music by making a playlist and then doing the playlist radio to get more music similar to it. Eventually it just plateaus and keeps giving me the same artists as the playlist, but there's a kind of sweet spot where I've found stuff I wouldn't have found otherwise.
As long as your favorite music is “overproduced electronic music” in a couple of different flavors, this is the best time to be alive. Everything else is dying.
Spotify discover is great but they haven't solved the problem of flow. Songs that don't fit in context are played next to each other. I wish we had John Peel back in its place, but then I suppose it's on me to find a good dj.
This is precisely the reason why I prefer playlists to anything Spotify recommends. Playlists are human curated. I kinda wish Spotify just bought Pandora and their music DNA; I really miss Pandora’s recommendations.
I strongly disagree on entertainment: what appeals to a specific person is a function of that person and of the skill the performer has in aiming to hit that spot.
A performer, no matter how much more talented that person is, will deliver a worse performance if they are trying to aim more broadly.
Whereas with news the issue is that news died when they optimized for ads/engagement, but that is fundamentally why Facebook, Twitter etc are all terrible. The interests of the user and the interests of the creator are mismatched.
> He said PCs would still be there for work, but most consumers won't own one, just like all vehicles used to be trucks, until people moved to cities and switched to cars, and now some people use trucks but most people use cars.
And I'm thankful every day that he was so wrong. Even though there are bad faith actors encircling the world of electronics and software attempting to turn everything into a walled garden (and succeeding in many cases), the PC lives. Hurrah.
On top of that his analogy makes no sense. By no means were all early cars trucks. Automobiles in cities didn't even really exist in any considerable number until after WWII and the suburban housing boom enabled by exclusionary FHA housing subsidies.
Before cars we all rode the streetcar and train. Open systems that allowed mobility without placing a premium on using your own two feet.
Is he wrong though? Any of us who operate websites know our traffic numbers. I'd say we have at least 80% of our traffic coming from mobile and tablet.
People are doing all kinds of things on their phones, including applying for jobs and getting government services.
The only thing I think Jobs might have been wrong on is the mix. The smartphone has become so powerful that the mix of smartphones to tablets is probably not what he predicted.
It's been interesting to me to see that despite staying at home all the time due to the coronavirus, my smartphone usage hasn't really dropped off. I would have expected to be spending more time on my laptop, but it seems like there is something compelling about the medium.
I have a massive AppleTV about 10ft away from where I’m sitting. Just a few more feet away a kindle and an iPad, and just down the stairs my laptop and PC sitting on my desk. Yet I’ve been squinting at content on my phone the past hour or so. Is it just convenience/laziness?
My usage has increased so much I have considered getting a battery case!
I have two young kids at home, so I am doing stuff all the time on my phone. I can use my iPhone to check in on work while giving them a bath or watching them play outside and can give feedback to my team. Our mobile tools have become so powerful that you can get a lot done on them. I have a 5K iMac with 72 GB of ram, but I spend most of my days on a phone or a tablet.
I remember one keynote where he was talking about slowing PC sales and a huge rise in iPad sales (tablets were relatively new back then). Yet he didn't credit the two most obvious reasons - market saturation (everyone already has a PC) and slowing upgrade cycles - today's PCs are "good enough" for many more years than they used to be, as hardware improvements have stagnated.
Years later, the same kind of slowdown happened with tablets (especially Android tablets which are basically dead, but iPad wasn't completely spared either), and will eventually happen with smartphones as well.
The "we are now in a post-PC era" hype of 2010 (after the iPad's release) has eventually led to many "tablets are dead" articles by 2016.
As a datapoint to support your argument: I bought a new PC in 2020, the one I bought prior to that was in 2013 and it still runs here at home only relegated to server duty as the HW is good enough for that but lags behind for gaming.
I don't know how long I'll be able to run that server for before needing to upgrade something, but I suspect its lifetime is until something catastrophically fails.
2013 PC?! That would be brand new for my folks back home, who are using my old gaming PC from 2007 with a Core 2 Quad and 6GB of RAM.
For Web browsing and office work it works flawlessly. One day I'll throw in a cheap SATA SSD that I'll find on sale as the OS drive and it'll be good for even more years.
Oh it was absolutely fine for everyday desktop/office type stuff, if I weren't into gaming I wouldn't have upgraded it at all. It still worked for gaming, but newer games were taking their toll.
Interesting choice of word. I would say that a smartphone is pretty nice for consuming things, but the moment I need to create anything other than a short email it is worth using a real computer.
The only exception that I can come up with right away is drawing, but even for taking pictures, my phone doesn't hold a candle to my DSLR.
Interestingly, when he was asked about walled gardens, he stated his adamant support for full HTML5 capability that would be on par with the app store, so you would always have access to anything you want.
It's funny because we all knew Flash was a scourge on the web and highly inefficient, and we swung so far around in the other direction that now Electron is a scourge on desktop and highly inefficient.
I'd say he is not completely right, yet. Many of my students type up their thesis on their phone (totally crazy to me, but I'm 'old'), and only switch to a laptop to get it into shape/format to turn it in. Give it some time and maybe a household will only have 1 laptop for all inhabitants.
Yes, the computer still has a purpose, but in my opinion the view on this is a bit skewed when you grow up in the Western world.
Just take a look at places like Africa or Cambodia: most people (if not all) own a smartphone, and there are a lot of places where you will have a hard time finding a computer.
I tried to use a tablet for work when traveling but it did not take long to spot the drawbacks and buy a real computer to do serious work.
The mobile ecosystems force us to do this because they stop us from unleashing the full potential of these tiny computers.
Imagine something like an iPad that would allow the user to just install any software that this tiny computer can run. Also you should be able to copy and access files from/to any other location on the filesystem. I could use it for most of my daily tasks at work if iOS/iPadOS would not prevent me from using the tablet in that way.
My favorite Jobs masterclass is his talk at Apple's developer conference in 1997 [1]. He had just returned to the company, and a lot of people were unhappy with him and/or Apple at the time. He decided to do an hour long off-the-cuff Q&A session with the developers. Famously, one of the questioners was harsh and insulting, and he handled it with grace.
I honestly can't think of another tech CEO that could have pulled off this performance. Maybe Bill Gates? I get the sense that the reality distortion field was very real because he was so clear and eloquent about his mission, it never needed to be scripted or rehearsed, it just came out of him at all times.
I think he makes a really salient point about focusing on problems in search of solutions rather than the reverse. So often in technology we come with solutions and look for problems to apply them to, but Jobs is right that we have to start at the pain points and work backwards from there.
Oh I remember this, I love that part so much. I often think about what he says about starting with the customer experience and working backwards to the technology, really simple but bright stuff.
The conference was not just for Apple employees but anyone developing within the Apple ecosystem. According to a Quora post [1], the guy worked for a security firm contracted by Sun. I’m not aware of any follow-ups with him though.
I have probably watched all of his talks more than a dozen times. The way he articulates his ideas is something that current Apple is missing. Tim Cook tries very hard to put those messages forward, but for some reason he is just not very good at it (to me at least). It doesn't feel sincere. You could feel Steve actually cared.
"Mac OS X has been living a secret double life for the past five years." Can't believe it has been 15 years since those words. Apple is about to make another transition, and the Mac will reach new heights, taking even more market share.
It is sad Steve never got to see this. Apple now has 1B iPhone users, 100M Mac users, 200M+ iPad users, and a fast-growing Apple Watch user base.
Another great insight from him is that companies go downhill once a sales person runs it, not a product person.
While I'm sure he thought Tim Cook was the best choice to take over, he's just not a product person, and it shows. I wouldn't say Tim Cook is sales either, just operations. He is very good at that, and he's done a decent job at leaving the product with product people.
But after 10-15 years it starts to show. Apple to me clearly isn't what is used to be. The UX issues are creeping in. The products aren't that innovative. There's no industry disruption any more.
>Another great insight from him is that companies go downhill once a sales person runs it, not a product person.
Here's the video if anyone hasn't seen it [1]; I think it rings so true of Tim Cook's Apple. The purity of the product vision just isn't there anymore, and things are actively made worse for the user just to scrape back a few measly dollars, e.g. no longer shipping a short power cable with $2000 laptops.
I think it's sad: a lot of Apple fans used to say that it doesn't matter if Windows sells more, Mac OS is the better product.
Now if you say anything negative about Tim Cook's Apple you just get told how successful it is, as if that's a metric for a good product. Just because something is successful and profitable doesn't mean it's good. By that logic there is literally no reason for the Macintosh to exist: everything before it was already successful, and the Mac itself was actually less successful for decades.
In keynotes now, I feel a little cringe every time one of the presenters uses a Jobs catch phrase. It's clear they are trying to imitate his surface mannerisms, instead of enthusiastically putting forth their own personalities and values.
His talk [0] at the Sloan School when he was young is also good.
His answer to business school students (some of whom were in the consulting business) is quite revealing of his personality. He doesn't hesitate to talk straight in a very calm way, and he is very clear in his mind about the views he expresses. Also, very eloquent in putting his views forward.
> It would have been trivial for Apple at that point to hire a team or two to shoehorn a so-so version of Flash onto their devices. (This seemed like an even bigger mistake on the iPad which had just launched). But he is clear and adamant. They are going to make calls they believe are best to shape great products.
I don't really think the Flash decision had anything to do with "better products"; it was simply about ecosystem control. Apple had always wanted to keep a very tight control of any ecosystem they inhabited: from their decision to stick with PowerPC as long as they did, to their zoo of Apple-specific dongles, to the Apple Store and beyond. Steve Jobs was a genius in understanding that an ecosystem is orders of magnitude more valuable if you, as an entity, control it. Valve is another company that did the same thing.
> It takes courage to look at a feature that lots of people “want” and make the call to exclude it.
This is a lot easier to do when you control the ecosystem. It's not like someone could migrate to using a Samsung-made iPad or something.
> In a moment of self-deprecation/levity he quips “We never saw ourselves as in a platform war with Microsoft and maybe that’s why we lost”. It’s a quick flash of him taking himself less-seriously and it’s disarming.
This is absolutely untrue, and Jobs is playing the crowd. Not only did Apple definitely see themselves in a platform war, they doubled-down on this with the iPhone. It was always about platforms. Jobs just happened to lose the first time, but he decidedly won the second.
> Jobs: We are about making better products, and what I love about the consumer market that I always hated about the enterprise market, is that: we come up with a product, we try to tell everybody about it and every person votes for themselves. They go yes or no, and if enough of them say yes, we get to come to work tomorrow. You know that’s how it works. It’s really simple.
> Whereas with the enterprise market, it's not so simple: the people that use the products don't decide for themselves, and the people that make those decisions are sometimes confused. We love just trying to make the best products in the world for people and having them tell us by how they vote with their wallets whether we're on track or not.
Not sure exactly how forthright Jobs is here, but -- honest or not -- man, I just want to say this is an incredible answer and motivates me to get off my ass and go build something :)
Internal discussions in Apple, in the hardware division where I was, were obsessed with the Flash runtime (and apps running on it) devouring battery. It's worth remembering that, at the time, the iPhone was quite new, establishing a foothold in the cellular product ecosystem (not the app ecosystem) wasn't guaranteed, and as such Apple was very self-conscious about battery life compared to things like a Treo.
I also ran flash on a HTC Wildfire around 2010. Flash sucked and drained the battery. But it was great to be able to access flash content. At that time, I frequently stumbled across content that I could not access otherwise. I was very happy that I was able to access it on my device, despite the crappy flash experience.
Take that as a data point that Flash on day 1 would have been good at least for one user. :)
Apple's decision probably still benefited the ecosystem as a whole because it drove the adoption of flash replacements.
That’s kind of my point. Everyone claims that Apple alone stopped Flash from being on the iPhone. But Adobe could barely get it running on a 2010-era Android phone that had a 1GHz processor and 1GB of RAM.
How would they get it running on a 2007-era iPhone that had a 400MHz processor and 128MB of RAM?
> I don't really think the Flash decision had anything to do with "better products"; it was simply about ecosystem control.
The original iPhone was super resource constrained (battery, ram, cpu) which required rethinking how apps were handled, including that they could be killed anytime. This was very much against conventional wisdom at the time. Apps (and websites) also needed significant redesign to fit within the touch UX paradigm.
Flash's fate was likely first and foremost a byproduct of these constraints and the changes they necessitated. While I agree control always mattered to Jobs & Apple, so too did user expectations, which at the time meant flash on web. My strong suspicion is engineering realities (resources) and user experience practicalities (need for flash to adapt to mobile UX) drove the decision, which was then easier to justify from the perspective of control.
Yes, I'm not sure it was intentional but your post could be balanced by pointing out the advantages for the consumer when there is ecosystem control and vertical integration. Apple took a calculated and different approach than other mobile entrants and it has paid off.
When you look at Android, it has similarities to the Mac vs PC dichotomy except much worse, where the OS provider doesn't care what happens, the OEM manufacturer doesn't care what happens, and a few other players don't care what happens either (ISP, open source community). Only Google and Samsung attempt at caring, and even then it is a disaster as there is no way to getHuman().
> I don't really think the Flash decision had anything to do with "better products"; it was simply about ecosystem control.
It sounds like you weren’t following tech news around that time. The decision about Flash may have had some elements about control, but it was largely about avoiding an unreliable, crashing, battery sucking and insecure piece of software.
When Steve Jobs wrote his open letter “Thoughts on Flash”, [1] there was a lot of criticism on Apple’s decision and predictions of Android wiping out Apple completely because Android still supported Flash. But people did come around to see that Apple was right about Flash and that it wasn’t something that was good enough to be included on mobile.
Eventually, Adobe was forced to stop development on it (to the extent it was even spending time on it, it still wasn't making it better on stability, security and energy consumption).
Apple never intended to allow any VM to run on the iPhone. A hypothetical Flash runtime that actually worked well on mobile would have allowed Adobe to run an application distribution platform on top of iOS and threaten Apple's 30% cut.
They've been forced to relent on that hard line a few times, but Apple still won't allow Firefox's own browser engine on iOS for this reason, and they probably never will.
The fact that flash was absolute crap helped, but didn't change anything.
> I don't really think the Flash decision had anything to do with "better products"; it was simply about ecosystem control.
This is doubtful. From what I recall of that era, Flash was a memory hog, and sometimes a CPU hog. Why is my tab/browser hanging? Flash. Also, at the time of the Flash ban, Jobs pushed HTML5, H.264, and native as alternatives (only the latter really gives control).
By projecting your perception of the present onto past events, you're distorting the historical context. Remember that the original answer to native third-party apps was "HTML5 websites are good enough."
Anyone who worked with him will tell you that all of the statements you've quoted are forthright.
>> It takes courage to look at a feature that lots of people “want” and make the call to exclude it.
> This is a lot easier to do when you control the ecosystem. It's not like someone could migrate to using a Samsung-made iPad or something.
No, but they could always not get an iPad. There's no control there, and most "walled garden" talk about preventing users from switching is BS.
With so many fundamental apps being subscriptions (Office, Creative Suite, etc.), so many others free (Facebook, Instagram, Slack, Zoom, etc.), and tons of them on the web (from Gmail to Basecamp), plus lots of FOSS for us geeks, there's little keeping someone on one platform if they'd rather be on another.
Even music and video consumption is now subscription based for the big majority, and those accounts work across the Apple, Google, and PC ecosystems.
> It was always about platforms. Jobs just happened to lose the first time, but he decidedly won the second.
A different way of phrasing it: Apple didn't realize they were in a platform war with the Mac, which is why they lost, and Jobs made sure they didn't repeat that mistake with the iPhone.
I had always thought Jobs' attitude toward Flash wasn't about control but was an actual vindictive strike against Adobe for not having his back when he needed them.
When NeXT got acquired by Apple and Steve went back to Apple, he tried to get Adobe to make Premiere for the new Mac OS. Since Apple hadn't been profitable for a while, Adobe declined, so Apple made a deal with Macromedia for what became Final Cut. It's known that he held grudges and would base deals on how able he was to make demands or phone calls, and it seems to be the case here too, even if it was the right call technologically. This is related: https://bgr.com/2016/12/12/steve-jobs-iphone-adobe-flash-tes...
Pretty refreshing to hear a company leader just say "we want to make the best product". No fluff, no ecosystem, just seemingly straightforward best products and best features. It seems almost bizarre today.
Most likely in part due to a majority of current company leaders simply being a product of the MBA degree-mill where they have no new ideas and are taught how to optimize what’s given to them.
I don't know what it is, but Kara Swisher annoys me. I feel her questions are usually of poor quality and I'm just not a fan of her interviewing style. Can someone fill me in on what has made her so successful? I think I may be missing something.
Try listening to Pivot podcast, might change your mind? I’d suggest it’s just a journalistic style of trying to ask the uncomfortable questions. Walt was often seen as maybe too deferential, particularly later on as books and other things entered the mix.
Two universally-derided Kara Swisher interviews are the joint Bill Gates/Steve Jobs panel at D5 [0] and the Elon Musk Recode interview (go to time stamp 7:00 in particular) [1].
I am not clear on what you are saying. Can you rephrase? I _am_ asking what I may be missing given so many people seem to like her. What are you saying I should have asked instead? Thanks.
That’s a common heuristic, but it’s used before making up your mind. After you’ve decided that you don’t like someone, it would be really hard to see them in a positive way, because your mind will try really hard to confirm you are right about disliking her.
To @mh_, the poster of this article on HN and the author of the thinkst blog:
Your site doesn’t have a responsive mobile layout on iOS. All I see is a weird version with large fonts that I either have to pan around to read or a “web version” that provides the desktop layout with tiny fonts. You may want to test your site with don’t blockers and ad blockers (I use both).
> It’s super telling that this 55 year old billionaire would be “working on a presentation he was giving” at 02h00 in the morning. It is probably possible to do great things without burning the midnight oil, I’ve just never seen it personally, or been able to do it that way myself.
It is probably possible to survive in good health and shape and reach your 80s while continuously burning the midnight oil, I’ve just never seen it personally.
Sure, "burning the midnight oil" is something that can (and IMHO should) be done sometimes. But is that the only (and main) key to success? I doubt it. The lesson about focus is far more valuable.
I love reading about Steve Jobs, but I get nervous when Apple's moves get put on a pedestal and blindly emulated. I think a lot of industries are tempted by the myth of Steve Jobs to go for vertical integration. Sometimes a better move is for a company to ride on the waves of innovation in layers below them in the technology stack/supply chains. If there's enough innovation below, picking horses like apple loves to do (firewire, etc.) is a bad move. Setting up your company to benefit from innovations below you is what won Microsoft the PC-wars.
> There’s been a huge push in recent times for balance, wellness and rest. People are quick to point out how burning the midnight oil translates to diminished capacity and actually increases your rate of errors, but I’ve never seen consistently great work from people unless they too, were working deep into the night on projects they believe in.
I remember this interview being particularly frustrating, with them constantly pressing him to reveal Apple's future plans despite absolutely everyone knowing he'd never actually tell you anything, so it was just wasting time.
Sorry that this is off topic, but what’s the situation with the term “masterclass” now that we’ve determined that we shouldn’t use the word “master”. Same with “masterpiece,” “master craftsman,” “headmaster,” “master Jedi,” etc.
edit: I’m genuinely curious about this, and don’t have a position on the “master” thing. Not trying to start a flame war.
We've never "determined" that. It's merely that some obsessive bossy people suggested it. And mainly people from just one country with particular issues -- whereas English is spoken in just about everywhere in the world.
Not to mention the word master has a history beyond slavery, and was used in all kinds of contexts historically with absolutely no reference to slavery.
The etymology itself has nothing to do with slave masters (any more than "owner" implies "slave owner").
master: from late Old English mægester "a man having control or authority over a place; a teacher or tutor of children," from Latin magister (n.) "chief, head, director, teacher"
I'm curious as well. I 100% support Black Lives Matter and started to rename some git repos from master to main, but I notice that the "master" used in git isn't a master/slave type of master but the master copy: the version that everything looks to as the source of truth. This is not the same as the original. Version 75 is not the original, version 1 is, but it is the master.
I don't mind renaming those to main, and in fact I started looking into automating the conversion (rough sketch at the end of this comment), but at the same time it does seem like mission creep. "main" doesn't actually have the same meaning. In pressing CDs/DVDs/Blu-rays we have "gold masters" which are used to create the molds for the presses for the discs. A master key opens all locks. Those usages all seem unrelated, and arguably so does the git usage.
Removing master/slave from tech seems 100% correct. Removing all uses of the word "master", like Darth Vader's "I was but a learner but now I am the master", seems wrong, and if we aren't going to remove all of them, then why are we removing it from git, since like those other usages it has zero relation to the problem?
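For what it's worth, here's roughly what the automation I mentioned looks like; a minimal sketch in Python, assuming a single remote called origin, and leaving out the parts that vary per host (the default-branch setting on GitHub, CI configs, open PRs):

```python
# Rename the local "master" branch to "main" and mirror the change on the
# remote, by shelling out to plain git commands.
import subprocess

def run(*args):
    print("+", " ".join(args))
    subprocess.run(args, check=True)

run("git", "branch", "-m", "master", "main")        # rename the local branch
run("git", "push", "-u", "origin", "main")          # publish "main" and set it as upstream
run("git", "push", "origin", "--delete", "master")  # remove the old branch from the remote
```

The last push will be rejected until the hosting side's default branch has been switched over to main (GitHub, for example, refuses to delete whatever the current default branch is), which is part of the manual follow-up this kind of conversion still needs.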
You can think of it more as a return to normalcy, "master" in VCS is relatively unique to git.
p4/piper is MAIN
SVN is 'trunk'
mercurial is 'default'
--
I'd be upset if we were changing commonly held terminology or if it had more meaning.. but it's completely arbitrary and all it costs is the minor reeducation of all github/git users and the invalidation of all prior documentation.
The piety of those who wish everyone to be re-educated and the corpus of documentation to be rendered invalid would be sickening. At least in this instance it brings concepts more in line.
We need less confusing and diverse verbiage in the industry generally speaking, when we refer to the same concepts (server/instance/machine/node being examples), so I think the change is welcome on those merits alone, and if it happens to make people feel better, though there's no evidence of that, then great.
However it's important to understand that if the motivation is purely political, then this can be classified as political correctness, which has proven time and time again to be ineffective and actually /harmful/ to the left, as the right are clever enough to use it as a recruiting tool -- often openly pissing off the left and making them hypersensitive to anyone not toeing the line. This was true in the early 90's and it's true again today.
To quote Stephen Fry on political correctness: "if someone wants to shout faggot at me, I don't care, as a gay man. I know I'm supposed to, but I'm supposed to care on behalf of people who are, supposedly weaker than me. and I think it's the most patronising thing in the world. It's exactly the same political correctness that I grew up with which was then, the kind of religious political correctness; which is people complaining about television programmes, about swearing and nudity and violence: 'I am not shocked myself, it's just the vulnerable young minds, you see!'; well, fuck that, that's just not good enough. It really isn't. and that's my objection. it's.. denouncing from the pulpit.. I mean, Russia has political correctness, but in Russia the political correctness is that you can't say Tchaikovsky was gay."[0]
These comments were given before a debate (which unfortunately was derailed frequently) on the efficacy of political correctness; and he made detailed points about words being immediately co-opted to mean hateful things, if there is hateful intent, and largely intent is the most important factor to control for, certainly not language.
I will link to the full debate below;[1]
So, typically I'm against these kinds of measures, and that seems to be the foundation of what this change was about.
If a change can stand on other merits, then sure. But to assume that this will help even a little, with no evidence provided, and to attribute such little weight to the human time spent reconfiguring and re-educating, is not just patronising in and of itself; it's a little dehumanising to those it supposedly supports, and it forces re-education and labour on the entire development world.
I believe that to be immoral; unless, of course, the change can stand on other merits.
Master as a word doesn't even have to do anything with slave master, anymore so than "owner" has to do with slave owner.
Slave master is just a subcategory of uses of the word master. Here's the etymology, as per the dictionary:
master: from late Old English mægester "a man having control or authority over a place; a teacher or tutor of children," from Latin magister (n.) "chief, head, director, teacher"
mægester comes from the Latin magister, which in fact did include aspects of slavery, in the sense that a magister is in control of X, whether it be a domain of a science, a particular skillset, or people.
The corresponding word for slaves was in turn explicit about "someone who can't refuse order".
> Removing master/slave from tech seems 100% correct. Removing all uses of the word "master", like Darth Vader's "I was but a learner but now I am the master", seems wrong, and if we aren't going to remove all of them, then why are we removing it from git, since like those other usages it has zero relation to the problem?
Given that GitHub and git-scm approved of removing this somewhat "problematic" terminology despite the original context being unrelated to master/slave, where would you draw the line on replacing these terms?
Will there be some nuance in changing said problematic terms, or will this wave of corrections swallow everything in its way? IMO there are some obvious problems with 'whitelist-blacklist', but 'master' tends to be on the harmless side of the spectrum of problematic terms. I'm open to changing my views, tbh.
No, _we_ didn't decide anything regarding the perfectly acceptable use of the word "master". The enraged mob has demanded it -- don't reward mobs.
To use the word to make light of the history of slavery is vulgar and socially unacceptable. The myriad other perfectly innocent uses of it are just fine. Personally, I have no intention of obeying any of the new rules of the new Cultural Revolution.
I don't ever view the "master" in GitHub as something that relates to slavery. It is like a master key, where the secondary is the change key. But it turns out I can't even use the term "master key" any more.
Well that depends on 'how far do they want to take this' i.e. where do they draw the line on what should be replaced or not.
Does Mastercard apply to this change? Perhaps being a blacksmith is 'offensive' as well. Maybe having a 'Masters' degree from Yale is just as bad as saying Headmaster or 'Dark mode', and the list goes on. As soon as you remove the context and start racializing everything, it doesn't make any sense to immediately label it all as 'racist'.
It will get to the point where someone will eventually reject such changes and will risk being called a racist for not supporting it.
I recently had the following exchange on the subject:
[reply to me, after I suggested the word "master" is not a bad one but the act is]
>> I agree that the word isn't necessarily bad itself, but the issue is that humans don't operate on dictionary definitions; we operate on connotations. Master/slave terminology is kind of vague alongside it referencing/being a product of immoral history, so why not make our terminology more accurate and stop normalizing the old terminology?
>> If you were a person whose ancestors were systematically enslaved getting started in tech, would you say "oh those aren't bad words, they're just words" or would you be a little off put that the field glorifies historically racist terminology by using it everywhere that it's vaguely related?
[my reply]
> If you were a person whose ancestors were systematically enslaved
I am such a person.
> would you be a little off put that the field glorifies historically racist terminology
I think this is the distinction, there's no glorification being given, the words we tend to use in computer science and operating system design are relatively clinical.
There are many points here to argue; you could say that people without parents may be put off by 'orphan', or people who have had siblings or loved ones killed as children could be put off by the terms "kill" and "child", especially as they're often used in conjunction.
In fact that last one also applies to me, as one of my childhood friends was murdered.
There's other, potentially less harmful terms, like "class-based" programming, if you know anything about Britain and especially the British Raj you know that this is a very touchy subject. In fact it has far reaching implications in India today.
The application of a word should not be marred by the history of a word, especially words that near-universally apply.
> so why not make our terminology more accurate and stop normalizing the old terminology?
I can think of three arguments against, and they are weakly held so don't think I'm being absolutist or combative or that I can't see reason:
1) There exists a concept called the euphemism treadmill: simply stated, it says that the words we typically use to refer to "bad things" are co-opted and abused until they become bad words themselves. If we cannot refer to slavery, then we will likely use other words which are loosely related, and then those will become bad words also. It is better to simply state that we're not referring to /human/ slavery, merely the relationship of components.
2) There is a large corpus of work and workers in the world; it might be meaningless to change a few words universally, but you're implicitly asking that potentially hundreds of thousands of people use new language (where English may not be their primary language, so they attach to keywords), AND potentially making breaking changes between software revisions, AND invalidating any existing documentation/blogspam that refers to these words.
3) There's no actual end to these kinds of changes.
As soon as you draw a line you exclude people. My examples above about orphans and people who have been affected by child murder may seem absurd, but why are they less important? Twitter already announced that they would avoid using 'dummy value' because dummy can affect people.
Reason 3 is not an application of the 'slippery slope fallacy'; it's the fact that you _must_ exclude some subset of people if you chase this way of thinking. There must be a line drawn at some point, as it's fractal, and the human brain is exceptional at finding reasons to be upset.
It is infinite.
Think, even momentarily how a woman must feel upon persistently hearing the word ABORT.
And, I'm mostly for changing these terms in new software; but changing existing software in this manner makes it political and not technical; the truth will always be that there was no racist intent when people chose master/slave to refer to the relationship between worker objects and controller objects- it was the clearest term to denote the relationship (and, continues to be if you didn't already know this relationship, try it each way with a few nontechnical friends).
You may disagree but I think this conversation with Stephen Fry is telling (directly in relation to this topic, and not generally): https://youtu.be/vsR6LP7Scg0?t=420
Very relevant and important issue. Like the other commenter in this thread we have also already started the process of renaming all our “master” branches to “main” for this reason.
It has to be taken with context, partially. “Craftsman” is gendered and therefore its usage is sometimes problematic. “Headmaster” also. The word “mastery” is not at fault, but “master” to denote male gender or indeed relationships, that’s a problem. In this case though, it was a default GitHub had established or continued... so they’ve tweaked the language to make people feel a bit more comfortable. Change is annoying but in this case, I see it as inconsequential and unrelated to words like “masterclass” or “masterpiece”... for now, at least. Word usage and meaning can change over time...
"master" isn't gendered when used as a verb or adverb, and I would think masterclass refers to those usages.
For example, it makes sense to say "she mastered her fears" but not "she mistressed her fears". Similarly, "she mastered her field" works but "she mistressed her field" does not.
Really, it's just the usage as a noun that's problematic from a logical standpoint, as I see it. If the usage relates to "control" as master denotes as a noun, then it might be problematic (even if the historical use of controlling slaves is not the only use). If it refers to proficiency, or overcoming a hurdle, then it's not really related to the control aspect of the word, and the problematic history that comes along with it.
Now, whether society, or parts of it, are able to distinguish those usages from each other or whether that causes contention that is better avoided, I don't know.
Thanks for the response. I’m also curious how other languages are contending with the backlash against gendered words. Everything in French, Spanish, Italian, etc. is gendered, right?
My understanding is that outside of names or ways of describing yourself, others and actions, in other languages you’d tend to disassociate the gender of a word from the word’s usage. As in, a word is not problematic because it has a masculine gender, but rather in its usage and meaning if it excludes women/minorities/doesn’t use inclusive language. Perhaps a similar word in English would be “perfume” which is perhaps similar to a feminine-gendered word. The male version would be “cologne”. Neither word is especially problematic — the concept, scent products, doesn’t particularly exclude either gender, though related social uses and norms might.
That said, language definitely changes viewpoints. Things don’t always map 1-to-1 between languages, and a lot of cultures have different ideas about when and how to make these kinds of corrections. Sometimes language is set centrally, sometimes slang and memes propagate new terminology or reference current events. Words change the way culture changes, or so it’s been my experience...