
It's hard to overstate how bloated Apple was internally.

Marketing had products for every conceivable niche. Engineering was all posturing: great hand-waving plans papered over nasty middle-management infighting. Hundreds of engineers in new glass-walled offices, producing plans. I wrote test code against a component that had been delivered months earlier only to find that it was just a stub. Even the debugger had bugs. Everyone knew it was a mess, but went along with it, fatalistically thinking that any OS-level project would be that messy.

I went across the street to JavaSoft: small teams cranking out code that would last forever. Swing was built in a year. Signs on the offices not to disturb the programmers. ~1,000 classes in 1.1, ~10K in 1.2 the next year. One main engineering manager hired a bunch of kids out of college. The JDK tech lead, Mark Reinhold, is still at the helm today.

Night and day; it was like going from the Soviet Union to the U.S.

When considering a new job, I almost don't care about technology. Engineering culture makes all the difference.



> Marketing had products for every conceivable niche. Engineering was all posturing: great hand-waving plans papered over nasty middle-management infighting. Hundreds of engineers in new glass-walled offices, producing plans. I wrote test code against a component that had been delivered months earlier only to find that it was just a stub. Even the debugger had bugs. Everyone knew it was a mess, but went along with it, fatalistically thinking that any OS-level project would be that messy.

This sounds like the company I work for presently


What do you look at in engineering culture?


I would assume there is no one thing to put a finger on. But there are some signs one can take note of. Generally speaking, to name a few:

- The ability and willingness of the management to invest time and resources into development of proper work processes and infrastructure (imagine having to code without version control in a team)

- Understanding that things aren't always done the moment they appear to be done

- Realization that many things are done solely to lower cognitive load (why spend 20 hours refactoring the thing if it already works?)

- Understanding why it is important to lower the cognitive load

- And dozens more

These are merely signs, though. One can put 'do not disturb' signs on doors, but what difference does that make if the same people who put up the signs still disturb the people behind the doors whenever they feel like it?

It comes down to understanding the nuances, and mutual respect, I think?


> It comes down to understanding the nuances, and mutual respect, I think?

I think that's a great summary.

Look out for companies that try to manage what they don't understand. It can lead to the 'car mechanic phenomenon'. I call it that because it's like the feeling some get when they get the bill from the car mechanic. They can't fix it themselves, so how do they know if someone's pulling one over on them?


100%. The companies and programs I've been a part of that failed, this is why. I've been in meetings and stated what to most engineers would be a truism, only to be told by leaders without even a basic understanding of what we were trying to make, "you don't know that, that's not true". Similar situations under successful management led to "ok, I trust you", and then success. If an organization spends too much of its valuable resources on internal mistrust and on convincing people who don't know, it's doomed.


If people have to enter time spent into any system, then that's a no from me. Which means I cannot work for consulting companies :-).


I had a friend interview at "Big Nameless Corp" a few years ago. He said when he saw that the engineers interviewing him all had Windows laptops, he knew immediately it wasn't an engineer-friendly culture. I always thought that was funny.


I try to stay pretty open, but I work in an org with a lot of Windows developers. It's kind of interesting how it's a totally separate world, and the Windows devs have their own equivalents to dotfiles and such.

The problem was when they insisted that everyone else must use Windows, and Mac and Linux machines weren't allowed.


>"He said when he saw that the engineers interviewing him all had Windows laptops, he knew immediately it wasn't an engineer-friendly culture."

"Friend" should have his head examined. This kind of snobbish attitude does no one any good.


It is not snobbery and you are taking it the wrong way.

It is more about having plenty of mature developer tools at your fingertips to get the job done with lots of automation. Also, having the source of everything lets you check how something is architected and designed, which improves your own code too.

It is very painful to work on Windows laptops when you don't have access to proper dev tools, which is the case in most corporate setups, where they want uniform images to make administration easy.


I work at a BigCorp. A sister team develops client software for Windows. Can you guess what kind of laptops the engineers have?


I don’t really see how that’s relevant given that WSL exists.


WSL in Windows 10 is almost useless. Also for embedded and people working on kernels etc it is really painful.


WSL is a business checkmark used to keep engineers quiet. It's not a pleasant surrounding ecosystem, and the implementation isn't as high-fidelity as the real thing.


Exactly. It is painful in Windows 10. Nothing beats running natively.


And how do you even "look" at it from the outside? Ask interviewers and they will always tell you that they have great culture, teamwork, etc


Easy,

Ask where the documentation is.

Ask how much input engineers have on what they are working on.

Ask how many meeting hours they average per day.


>Ask how much input engineers have on what they are working on.

This is the way. Ask this question of people working at a legacy company and they won't be able to meet your eyes.


> Ask where the documentation is.

I've had a reasonably long career and worked at a number of companies at this point, and I've never seen this 'documentation' that engineers are supposedly producing.


I think existence of documentation is a poor metric, personally.


I think a better metric is to ask why the documentation isn't better. Nobody is ever happy with their documentation, whether they have too little or too much. But if the engineers think it's because they're being pushed too hard and there's no slack time available to write docs? Red flag. They have explicitly considered the question and realized that the effort to write the docs, compounded by the need to keep them up to date and/or the cost of allowing them to be out of date, is not worth it in their particular situation? Good sign. They feel kind of bad and guilty about the state of the docs? Too normal to be much of a sign, but still positive.


These questions are ultimately just asking what the work culture is like, they don't really have anything to do with documentation. And in that sense they're good questions to ask during an interview. As you allude to though, I don't think you're going to get good information if you just ask surface level questions about docs.


The meat of the question is, "where do you go when you don't know how to do a thing?".

The answer could potentially be "the docs", but that's not necessarily sufficient like you point out. However, you could ask follow-ups, e.g., whether the docs have multiple facets like user guides in addition to low-level API guides.

OTOH, where you go when you don't know how to do something doesn't necessarily have to be the docs, and the info here can still be valuable to understand. E.g., are questions typically discussed in public forums, or in private DMs? The former is a better signal in my experience.


Compare the quality of the documentation from the lawyers and the engineers in the company. True experts tend to be excellent communicators. SJ adequately described the shortcomings of Flash, which the phoneys pushed back against without merit.


We really have the opposite problem: there is lots of documentation, but it's not exactly organized or all up to date. Much easier to fix than having none at all, I would guess.


Welcome to health care. We produce a ton of documents that then disappear into a terrible document management system to never be found again.


I feel like having too much doc is not really a problem :)


Ask: if I find a bug in the code, what's the process for getting it fixed? And about how long does that take?


Thank you. These are valuable pointers.


Respect for the craft and those that practice it.


> When considering a new job, I almost don't care about technology. Engineering culture makes all the difference.

This resonated with me. Whether you are recruiting or looking, culture is incredibly powerful.


This is hard though because culture often isn't very legible.

I'd easily trade pay (within reason) and any tech stack for a truly great culture where I felt driven and engaged and was doing meaningful work, but the latter is really hard to understand ahead of time.


It is. However, in earlier phases people can recruit from their networks.

They will only be able to sell their best contacts with trusted tales of good things.


Right before that era I worked on Mosaic for Windows, and the Mac developers sat across the stairwell from us. The Mac guys had a very love-hate relationship with Macs.

On the one hand the lead was infinitely proud of being able to have 2 (or was it three?) monitors on his desk, big CRTs on a desk designed during the Cold War, by a designer who had nightmares about nuclear blasts and wanted someplace safe to duck and cover. You could do that with Macs but not quite yet with Windows. If memory serves, Linux got that ability before Windows did but don't quote me on that.

On the other hand, they were writing what is ostensibly a concurrent application on an operating system with no protected memory and no pre-emptive multitasking, so the whole thing used hand-rolled cooperative multitasking via C longjmps. It's no wonder the Windows team had an easier time keeping up with Netscape for that golden year. They were, from what I understand, cross-compiling, and the Windows team could just do Windowsy things with a fifth of the staff.

So my experience of this era, through that lens, through learning to hate Macs at the hands of Mathematica, and also through rumor mills, was that a lot of Apple's OS people ended up going to Palm, where they made pretty much the same set of tradeoffs. It was very weird watching subsequent Palm models start to bump up against the same ceiling that NextStep was helping Apple route around.

I told my Mac-loving friends to talk to me when Apple had a modern OS. So when the NextStep merger happened, my ears pricked up. My first Mac ended up being an anomaly: Apple briefly produced a 13" Mac with a discrete video card, which they haven't since. I had been struggling to get Linux drivers for a Fujitsu LifeBook, which was ridiculously small but practically obsolete by the time I got everything working. That device is the sole time I've contributed to Linux, which was cool but exhausting. So I ran with open arms to a pretty UI that works out of the box but ships with /bin/bash, and never looked back.

You could still see the bones of NextStep in OS X for some time.

The university ended up with upward of 60 NeXT machines (which I later learned is a lot for one college), in 2 labs when I started, eventually in 3 that I knew of, one of which I ended up with after hours access to. On many occasions they were the only open machines. I had helped too many people who lost their papers to faulty disk drives and learned that the best way to write a paper was to keep my Unix account empty and mail myself copies, so it hardly mattered that I didn't have a floppy for the NeXTs. It didn't hurt that they never figured out how to meter the NeXT laser printer, so while it wasn't the best or fastest printer on campus, it was the only free one. "Your printah is out of paypah."


> So my experience of this era, through that lens, through learning to hate Macs at the hands of Mathematica, and also through rumor mills, was that a lot of Apple's OS people ended up going to Palm, where they made pretty much the same set of tradeoffs.

Not just making the same tradeoffs, but making a lot of the same software -- early PalmOS was effectively a handheld remake of Mac OS. Same CPU architecture, very similar OS design. Some of the A-traps even had similar or identical names.


>On the other hand they were writing what is ostensibly a concurrent application on an operating system with no protected memory and no pre-emptive multitasking, so the whole thing was using hand-rolled cooperative multitasking via C longjumps. It's no wonder the Windows team had an easier time keeping up with Netscape for that golden year.

As one among your customer base for both versions, indeed. People who never used pre-Unix MacOS have no idea how unreliable it was. Windows 95 and 98 weren't great, but there was at least some hope of killing an errant application and continuing on. System 7? No hope whatsoever. It didn't help that Mosaic (and Netscape) wasn't very reliable regardless of platform, but the OS's own failings made things that much worse.

>a lot of Apple's OS people ended up going to Palm, where they made pretty much the same set of tradeoffs.

That makes sense, both from an attractive-new-startup view (I'm sure many within Apple c. 1997 were pushing for a small, inexpensive Apple PDA to respond to Palm), and from a familiar-feeling-OS view.

>So I ran to a pretty UI, works out of the box, but ships with /bin/bash with open arms and never looked back.

I figured this out on the day in 2003 when I first tried out OS X. I'd been using Linux since 1995 and had tried every available desktop: CDE, KDE, Gnome, Enlightenment (The horror .. the horror ...), Window Maker/AfterStep, fvwm, and even older ones like Motif and twm. I'd used Mac OS 7 and 8 in college and hated it (as mentioned above), but OS X was a revelation.

I still use Linux as a server, but for a Unixlike desktop that actually works and runs a lot of applications, OS X is it. Period.

(I wrote the above on Slashdot ten years ago <https://slashdot.org/comments.pl?sid=2940345&cid=40457103>. I see no need for changes.)

>The university ended up with upward of 60 NeXT machines (which I later learned is a lot for one college)

I don't think my college ever deployed NeXTs in public student labs the way it did deploy HP workstations <https://np.reddit.com/r/VintageApple/comments/ludshu/macinto...>, but I did use them in college as well. I still think NeXTStep did UI better than MacOS pre- or post-Unix.


System 7 came out in 1991 while Windows 3.1 wasn't even released until 1992. Apple had a huge lead but they squandered it on failed rewrites.


> A lot of Apple's OS people ended up going to Palm, where they made pretty much the same set of tradeoffs. It was very weird watching subsequent Palm models start to bump up against the same ceiling.

Didn't know about this. This is before ‘webOS’, right? What kind of OS problems are we talking here?


A long time ago now, but I vaguely recall that the original PalmPilot OS chose to replicate the mistakes of the original Mac. Data structures exposed all over the place, apps had to implement the basics of event handling rather than have most of it handled for them.


I'm not sure what services PalmOS provided to apps, but to my limited knowledge the apps themselves were non-multitasking, and had a standardized database per app instead of a filesystem (i.e., something like a proto-IndexedDB, but more oriented to documents or blobs). So it's kind of difficult to imagine them having the problems of the quasi-multitasking, more complex MacOS. But perhaps things had become more involved by PalmOS 5.

Though the event handling part sounds like DOS' raw approach, versus Win95's abstraction—and I guess could plague any kind of a system.


Palm OS was designed for the hardware of the time. It was very simple but ran well. Compared to Microsoft's approach of building a stripped down clone of Windows NT that barely ran on the hardware of the era. But in the long term Microsoft's approach was more successful because as hardware improved they had an OS ready to take advantage of it.

Of course, what Microsoft wasn't ready for was the hardware getting so good that Apple could just port their existing desktop OS and software over.


> What Microsoft wasn't ready for was the hardware getting so good that Apple could just port their existing desktop OS and software over.

Eh, I mean, MS attempted that more that once themselves: though WinCE wasn't Win9x inside, it sure looked like one, and purportedly even had swap.


For example, the original 1984 Mac hardcoded the screen resolution to 512x342 black & white, so they had to rewrite QuickDraw just three years later to add color, different resolutions, and multiple monitors. Classic Mac OS was easy to crash because it had no memory protection.

In 1996 Palm OS had hardcoded screen resolution with no color support. They had no memory protection. Palm also used already-obsolete DragonBall/68K processors when faster ARMs were already available.


I think he talks about a very different era


> "Your printah is out of paypah."

Oh man that brings back memories!


Part of that may also have been the greenfield development nature of JavaSoft; it's often easier to make rapid progress if you don't have to be backward compatible with preexisting APIs etc.

And from what I heard, Sun as a whole was a highly dysfunctional company as well, even during the time in question. The dot com boom just papered over a lot of the dysfunction for a few years.


> Marketing had products for every conceivable niche.

Ah, the Performa era.

https://en.wikipedia.org/wiki/Macintosh_Performa

Mind you, my first Mac was a Performa, as we sure didn’t know better.


Obligatory story about how someone tailgated into Apple's office long after being fired, to finish his project, around '94 I think: https://youtu.be/Dl643JFJWig


I wrote a 1.2 Swing desktop app with a JNI DLL to call Win32 to create a desktop shortcut. The hilarious part is it violated the EULA because a JRE was shipped and ran from a CD. It took about 30 seconds to load, so it seemed very important. :) It was themed with a JPG in the background to paint the window object's canvas.


I agree, and not almost: I care 100% about engineering culture when considering a new job. Poor tech design can be fixed by a healthy engineering culture; an unhealthy engineering culture can't really be "fixed" (only eliminated and rebuilt well).


Well, it's much easier to write a brand new language and class libraries. Everything is self contained, there are no dumb users involved, no undocumented external functions, most of the library functions can be ported from existing languages etc.


> Swing was built in a year

Checks out. I'm still bitter towards Swing though. But impressive what they could put out in a year.

That said, I'm absolutely NOT a fan of the JWZ sleeping-bag-under-your-desk, get-it-done-whatever-the-cost mindset that was everywhere in the '90s.


FYI: JWZ has long-since disavowed this attitude, and much else.


Bitter towards Swing?


Yes. Swing was not good, we had better idioms before and after.


Any stories you'd be willing to share from your time at JavaSoft?


I wish companies would be run in a way that removes insane leveling completely.

In my current company there are 7 or 8 levels (depending how you look at it) for ICs. Why?

IME it has led to so much “talk” and bloat, and useless meetings led by people trying to prove something who go on and on about things that make no sense. No execution, only flattery and BS.

IMO, the only levels you need should be Software Engineer, Senior Software Engineer, and Software Architect. The architects should be rare, and there should be a healthy mix of seniors and juniors focused on building and supporting products internal and customer facing.

Why do we have several layers of managers? Out of a dozen managers only two have been good. They’re also the first ones fired. It just seems stupid, even if it’s purposefully done that way (that’s even dumber to consider). Have lean teams with a lead, and a manager who manages several teams, and a director for each product offering who reports to a CTO or something. VP, SVP, EVP. Why?

I’m on a team where people waste so much time and yet I see those same people get promoted, while I’m told I’m disengaged because I don’t turn on my camera in bullshit meetings (to plan future meetings or ramble). On the other hand, my team consistently delivers on time or sooner, while those teams take forever.

It’s sad, really.


> In my current company there are 7 or 8 levels (depending how you look at it) for ICs. Why?

Promotions tend to represent some combination of four things to most companies:

• more expected impact and workload,

• more status,

• more money, and

• more expected industry experience and seniority

The three-level system you propose might work if the company using it is in fact actually flatter in its internal hierarchy on those traits. But if it's not, then from the workers' perspective, all you've done is intentionally obscure the mechanics of the actual hierarchy you're using. That further obscures pay disparities, denies workers who are motivated by externally visible status a route to progression, equates high-performing "just below architect" and "barely above software engineer" workloads in a way likely to incentivize many seniors to coast, and surrenders an easy tool for gauging performance: how successfully someone is progressing at the company and in the field, based on their level vs. years of experience.


> The architects should be rare

This reminded me of my time at iHeartMedia. The company had an entire department of non-coding architects. They produced so many Visio diagrams that the company had to purchase a product that indexed these documents so that they could be searched.

The amount of busywork that was produced still takes my breath away.


The IC leveling at Apple is similar to what you have described. ICT 3/4/5 roughly maps to what you have. The problem with this is that while it's relatively easy to go from a 3 to a 4 after a few years, it's a huge jump from 4 to 5, and then after that you've basically peaked for life. If you just care about building great things, it can work, but for many people, feeling like there's growth and not stagnation is a problem. It also creates a problem for managers having to temper those expectations.

Limiting IC layers doesn't remove all the other politics.


Not really true anymore unfortunately, and the system is converging with other large organizations.

For example, to keep compensation in line with market and reduce attrition, there were definitely more people who got promoted to ICT5 last year and the designation has thus been diluted.

Furthermore, candidates coming in from other large companies expect the title/leveling prestige in many cases, and ICT5 is a tough sell while trying to hire a Google L7/8. So Apple does have a fair number of ICT6 from that.

I also think limiting IC layers and keeping the above politics minimal can work - Netflix did that for a long time and was very successful as a company with that approach. Most ICs at Netflix were simply titled "senior software engineer", with pay being a wide band dependent on market value. They no longer do that, for whatever reason, and have adopted the standard large-company level hierarchy.


Fair number of ICT6… would you say more than 250 out of what, 25K engineers?


At Palantir, there are no levels at all, everyone is just a simple software engineer. It's their relevant experience that gives an engineer leverage on a project.


I worked at a place like that before. A lot of people were grumpy that they were not progressing. Some of them were probably right; some were probably wrong. But it was clear to me that having a concrete thing to aim for (a new title) mattered. A lot.

I take it that wasn't the case at Palantir. Since I'm convinced that all humans are status-seeking, I'm curious how Palantir satisfied that urge, if not w/ title differentiation.


Is it bad to discourage status-seeking behavior, such that the people who are just that way... look elsewhere?


I don't believe in good or bad wrt stuff like this. You're welcome to try to discourage it, but success would depend crucially on how many people really are not "that way." I don't think I'd take that bet, personally, but it's possible I've lived in a strange sub-world and have met unusual people.


Can you, as a business, afford it?


Probably? Is it really the ladder climbers or the everyday folks doing the grunt work?


> In my current company there are 7 or 8 levels (depending how you look at it) for ICs. Why?

Probably same reason hn has points count beyond 500


I agree with you that promo culture is pretty much the root of modern big company rot, but under your system, how do you propose an industrious, ambitious IC move up, whatever that means? Even if you think the status is fake and dumb (and I would agree), how do you make more money as an IC?


Exactly - what does it mean?

You work on stimulating things and get paid what you accept.

You’ll have a yearly review where you have your say, and pay can be increased if the employer thinks it’s worth it.

Titles are problematic in teams working to produce good stuff!

We should compare ourselves to athletes more than the MBA powerpoint version of a company.

I hear Tom Brady is a legend - is he an S10 quarterback? A senior quarterback? Or just a quarterback with a high salary based on performance?

I believe levels are instruments for managers not employees.


I agree with you that a level system creates bad incentives for everyone. But what they do provide is a rubric for the ambitious 23 year old who wants to climb to the top of whatever group he’s in. If you get rid of levels, what’s the new hierarchy for him to climb? If you don’t specify a hierarchy, one is going to form anyway, and then it’s even more out of your control.

If you’re going to say the hierarchy should be based on people doing real work, well I agree, but that is pretty much impossible to measure in groups beyond a certain size.

This groups-beyond-a-certain-size problem is the root of the problem, not levels, which are just a stopgap measure to slow the deterioration and degeneracy that you find in any gigantic group of people.


Everyone who cares about Brady is watching him and knows how he is performing. This is not true of engineers.


This is why I’m stating it’s a management tool.

Poor management requires boxes.

Better management actually cares, and helps employees perform and improve.

In the end it’s about culture.


Impact? Someone could be an amazing mentor and mediocre at actual output, but their impact could keep a bunch of people leveled up and thinking more efficiently, letting the team ship more with fewer bugs. Maybe there’s a senior engineer who can consistently crank out high-quality code and identify optimizations that save money.

Some sort of holistic 360 reviews by peers where others can vouch for you could work. What’s the big deal with having one senior paid 180k and another 350k if their impacts are considerably different? I have personally dealt with BS like “we can’t promote you rn but also can’t give you a raise because you’re at the upper limit of your salary band”. That’s not fair, imo.


How do you stop the measurement of impact from degenerating into exactly the same situation we have now? In fact, I’m pretty sure at many bigcos “impact” is already the official metric for promo.

The real thing you’re getting at is that people should only be paid and promoted for real work, and not for bullshit work, but that’s basically impossible to measure once organizations grow beyond a certain size.

Also, in orgs beyond a certain size, “holistic” peer review based evals make bullshit promo culture worse, not better. It basically ends up meaning how much people like you, and accelerates the shift toward a culture where people optimize for that instead of doing real work. And then you’re back at square one.


I remember finding out some companies had SO many VPs. How can you have that many VPs? Isn't it like making everyone a senior engineer?


In investment banking, that's just a mid-level title. The usual progression is Analyst, Associate, VP, Director, Managing Director.

Not everyone becomes MD, but you definitely become VP by sticking around for a few years.


At my current job between the CEO and me the reporting chain had one SVP and three VPs.


Vice President, right? Weird naming, considering it makes it sound like there should be one, and that they're second in command of all people.


Your description of 1997 Apple sounds uncomfortably close to modern Google. I miss working for startups.


That was also echoing through my brain while reading through the comment (worked there awhile ago)

It would be crazy if someone just shook up the whole company top to bottom, but Google is still making money and Apple wasn’t.

So I think it will never happen


Good point, it's not a good comparison.

> it will never happen

I can see it happening. They have just one cash cow in advertising, and it's not terribly secure. TikTok is already a serious threat to YouTube. If this became a crisis, it wouldn't sink the company, but it'd certainly shake it up.


The "TikTok threat" is so overblown. It's not actively competing with any existing social or video platform. It's competing for time, but that's about it. No one is using TikTok to stay in touch with their grandmother or watching 2-hour podcasts on it. They invented a new type of video entertainment which has a social element, that's it - it's neither a YouTube nor a social network competitor.

I'd argue Snapchat is probably their most direct competitor, but Snapchat's DAU is still growing fine, because even in Snapchat's case I think you can argue they're used differently. Snapchat seems to be used primarily to stay in touch with friends, whereas TikTok seems to be used to find/share short-form video content with strangers.


It directly competes for some use cases, for time, and for ads. One person's total time spent watching videos doesn't grow just because there's a new platform on the block. Kids even use TikTok as an alternative to Google searches for knowledge. I'm not saying TikTok alone can wreck Google, but YouTube at least felt threatened enough to make something similar called Shorts.

IDK about the social network side. Google doesn't really have one, so it's whatever.


> Kids even use TikTok as an alternative to Google searches for knowledge

Is that why Google search heavily forces tons of Youtube results to the top now even when it is barely relevant?


Maybe not because of TikTok directly, but because subtitled videos in general are an easier format on a phone than scrolling around search results and dealing with 2-5 popups (including the cookie agreement) on every site. Web browsing is much easier on a PC, but that's the minority use case.


> compared to scrolling around search results and dealing with 2-5 popups

Cue the four-minute YouTube ad.


Yeah there's that too. I think TikTok simply has fewer ads.

One of my friends described tech platforms as following a cycle where they're popular while they're loss-leading, but eventually they try to reel in profits that are actually proportional to their valuation, and everyone switches to the next thing. This implies that later investors are duped into holding the bag for something that's not worth the investment. Idk if he's right, but given the tendency for tech to ride bubbles, he could be.


Nothing is forever.


It's very much apples to goggles - Google is a gigantic, mature company with some problems, one that makes occasional missteps. It still makes many things people like, including truckfuls of money. 1997 Apple had a very enthusiastic userbase, a small and dwindling market share, mostly terrible products, and was teetering on the edge of extinction.


It's more how I feel about the SWE culture, not the company's fate. You're right that they're very different scenarios, including the fact that it's not 1997 anymore.


It's just a fundamentally different thing to be working for a company that is not a burning shipwreck. It influences culture and everything else much more strongly than some of the organizational parallels (although I think these are pretty weak as well). Imagine what working at Google would be like if Google was a losing search engine company.


Depends how it's losing. Could be like Yahoo! who just accepted its fate and sank as gracefully as possible, or could be like Apple who fought back.


Maybe even modern Apple, as if Apple just sort of reverted after Steve Jobs passed away.


Hmm, doesn't feel good having such a negative comment rise up this much.


> Engineering was all posturing: great hand-waving plans papered over nasty middle-management infighting. Hundreds of engineers in new glass-walled offices, producing plans.

Sounds a lot like a lot of FAANG companies today.


Sadly, it just sounds like ... how human organizations often get.


The iron law of bureaucracy:

Pournelle's Iron Law of Bureaucracy states that in any bureaucratic organization there will be two kinds of people:

First, there will be those who are devoted to the goals of the organization. Examples are dedicated classroom teachers in an educational bureaucracy, many of the engineers and launch technicians and scientists at NASA, even some agricultural scientists and advisors in the former Soviet Union collective farming administration.

Secondly, there will be those dedicated to the organization itself. Examples are many of the administrators in the education system, many professors of education, many teachers union officials, much of the NASA headquarters staff, etc.

The Iron Law states that in every case the second group will gain and keep control of the organization. It will write the rules, and control promotions within the organization.

https://www.jerrypournelle.com/reports/jerryp/iron.html


I've always wondered what the Iron Solution to Bureaucracy looks like.

Usually it's smaller companies eating the big companies' lunch via competition. But that usually takes a very long time (or the competition is crushed by 'good intentions' aka gov policy). I know 'intrapreneurship' and spawning isolated startups internally, where teams are walled off from the middle managers of the larger company, was pushed by Clayton M. Christensen.

But the two areas that seem to be in a death grip with this problem are: modern western governments and monopolies (usually with market position enforced by said governments).

Whenever people push for reducing bureaucracy in gov, they get accused of only wanting to help the rich, or get crushed by the beneficiaries (lobbyists/NIMBYs/'local jobs' protection rackets/etc). And monopolies survive even when they are an organizational disaster internally, because where else can customers go?


Some degree of bureaucracy is necessary and inevitable in any organization, as it's basically the glue that makes a group of people an organization. Imagine how difficult it'd be to collaborate on a project with even just one other person if you didn't meet / pass notes / share info on what each of you was doing on the project, at least once. That process of passing info is what constitutes bureaucracy, and bureaucrats act as the gatekeepers of such processes.

So unfortunately, there isn't really an "Iron" solution to bureaucracy (depending on what you mean by "Iron"). The best you can do is work to minimize the amount of it (akin to optimizing an algorithm so that it's more efficient), but some level or amount will always remain, and that amount is largely related to how large the organization is.


Alas, like so many things in life we can't "solve" it, we can only minimize the negative, better ourselves, and learn to cope with it.


>I've always wondered what the Iron Solution to Bureaucracy looks like.

Open Source?

Specifically, removing the power to coerce.


Few if any people in an organization will admit that they are not devoted to the goals of the organization, and I think few senior people actually believe they are not devoted to those goals.

Further, it is difficult for many people in an organization to understand the contributions of people working in areas in which they are not expert, and thus it is easy to misunderstand whether they are devoted to the goals of the organization.

So I’m not sure this “Iron Law” can predict anything or help understand anything in a real-world situation. It seems to boil down to (roughly) “the bastards always win.”


Thank you for posting this. I hadn't heard of it, but it jibes with my experience in very large companies.


Probably like every large organization.


Can confirm.


[]


can you elaborate?


abxytg made a list of their experiences at Apple, but that list is empty.


You guys were creating Java?

> ~1,000 classes in 1.1, ~10K in 1.2 the next year.

9K classes in a year sounds crazy as hell


A typical well-designed class will usually be a few hundred lines of code, tops. Often less than 100 lines.

9k classes * 100-300 == 900k-2.7m LOC

Not TOO crazy for the most complete standard library ever developed, for a language meant from the beginning to take over enterprise business computing.

How many LOC are in whatever meme language is hype this year? Hell, I feel like I see nearly that many lines of console output, when I run NPM to pull in dependencies, lol.


I think much of the maligning Java gets is either due to people looking at badly written Java (2k-LOC classes, factories of factories of factories, 15-arg methods, import * everywhere, etc.), or due to claims made in error because they are not based on modern Java (claiming bad parallelism a la Brinch Hansen, citing a lack of modules, noting no functional support).

I think Java is a good language if your problem space is ok with the GC and some other quirks.

The problem space I've been in has issues with GC, so C++ or Rust are the main choices there (Go too, but not my preference).


I read comments like this all the time on HN and Reddit. But it's hard for me to separate real professional experience at day jobs, from hobbyist wishful thinking in personal side projects.

All I know is that I attend a lot of Java conferences and meetups, and I'm STILL waiting to encounter a flesh-and-blood human in real life whose employer really wavers between Java and Rust. That's like choosing between a Humvee and a submarine, their optimal problem spaces barely overlap.


>That's like choosing between a Humvee and a submarine, their optimal problem spaces barely overlap.

You'd be surprised. For a lot of the stuff Rust is a good fit for, aside from drivers and OSes, Java is too.

There are successful full-text search engines written in Java (Lucene, and Elasticsearch on top of it), there are successful event streaming platforms written in Java (Kafka), there are successful distributed data processing tools written in Java (Hadoop, Spark), databases (Cassandra), and very very high performance trading engines (LMAX), very high performing servers (rapidoid, smart-servlet), and other such things besides, all categories of which have been quite big domains for C/C++ (and thus something like Rust).


Where I work some teams are using Java, and some are just starting to use Rust. Different problem spaces entirely though, as you said. I've seen Java used here for ETL work, Modeling & Sim work, web apps, and Android. I've seen Rust on two projects, both were real-time or near-real-time efforts.

The competition I am starting to see is between C++ and Rust, with the few teams looking at Rust loving it. I recently helped write an RFP response that noted the software would be written in C++ and Rust, and the RFP was awarded to us.

I am in a different world though, as I am a contractor for the DOD. It's more like a lot of small to medium sized companies than one big company, which is why some groups can use Java and others C++ and others Rust.

I have the same problem of differentiating between hobbyists and professional experience, both on HN and when interviewing. I had someone tell me they knew Java, then found out they had done maybe 5 months of Java work and didn't know some of the most basic things.

As for myself, in terms of writing professional software for clients, I've done the following, in order of amounts worked: Java, C++, Python, Javascript, Erlang, R, Ruby, Clojure, C, Scala, and C#. I've done Rust and Elixir in personal projects, but not yet for clients. I plan to transition to a full-time Rust project in the coming years.


Seeing a company (as opposed to a person) choose between Java and Rust would paint a very bleak picture for their future.

To me it's an indication that programmers are getting to run wild past the technical domain; after all, trying to hire for a Java codebase vs a Rust codebase is enough to kill a company, depending on the space, due to salary expectations and availability alone.


You can make a very good case for any organization to replace

Java => Go

C++ => Rust

The database and systems software created in the 2000s was very much based on Java. But modern, performant database and systems software is written in Go/C++.

https://landscape.cncf.io/


I'd definitely take Kotlin over Go. That you can reuse the Java codebase is a huge win, but even apart from that the language is a breath of fresh air.


Java has long been crippled beyond the regular GC limitations by lack of value types, but that’s finally, finally going away.


Explains why there are no production Java applications...


FTFY...

Explains why there are no high performance production Java applications...


Yeah, other than Kafka, Spark, Cassandra, Flink, Hadoop, Elasticsearch what has Java ever done for us!


Given there are hft firms built on Java…


That's 36 classes per *day* (9,000 classes / 250 workdays)

I don't know how many people were working on that

but when developing a standard library you do not really want to rush designs. You want to be very, very careful about it.

Here's a book about it from one of the most experienced people in API/Standard lib design on the planet

https://www.amazon.com/Framework-Design-Guidelines-Conventio...


I can't speak for grandparent commenter. However, I can point out that the standard library for Java 17 has a TOTAL of 4,388 classes. TODAY. A quarter-century after the timeframe that they're talking about:

https://docs.oracle.com/en/java/javase/17/docs/api/allclasse...

So if ~10k classes were really written in a year, then I can only assume that most of that was the compiler or virtual machine runtime (i.e. C++ code) rather than the standard library Java code.

Regardless of how anyone feels about "verbosity" or "design patterns" in Java application code, the underlying JDK is pretty inarguably one of the most solid and impressive pieces of tech ever developed. I'm not throwing any rocks at that codebase.


>the underlying JDK is pretty inarguably one of the most solid and impressive pieces of tech ever developed.

I agree, JVM is impressive.


I think the first part of a standard lib is pretty straightforward to design. String operations, math, standard protocols, and I/O are pretty well understood. It gets trickier once you get to higher levels of abstraction like UI.


NIO and the streams API show that even those were not evident from the beginning of Java.


> ... pretty straightforward to design. String operations ...

Tell that to Haskell.

Also, I've seen some pretty gnarly use of String in Java.


This is the kind of comment I come to HN for. That's a powerful insight on work estimation.


> A typical well-designed class will usually be a few hundred lines of code, tops. Often less than 100 lines.

It’s not the lines that take time; it’s getting the code to the “well-designed” state, doubly so (if not more) for an API.


Complete maybe, but it was not fun learning or using Swing in college.


Creating new classes is easy. Keeping the number down while still restricting them to a well-defined responsibility each is actually hard.


Sounds about right for Java..


How many classes do you think it should have, ballpark, given the same feature set? GUI toolkits alone can produce thousands of classes because there are thousands of concepts that make sense to model; then add on security, collections, many utilities, XML handling, multiple RPC systems, and so on. It's hard to overstate how large the Java standard library is.

And it's not like other ecosystems do better. The standard complaint about JS is that you try to do something basic in that ecosystem and discover you have 10,000 modules in node_modules. Not so different, really, except that the Java stuff comes from a single team with a relatively coherent design philosophy.


If it is necessary to have that many classes, then there should be that many classes.

But in Java everything is a class. The main function? In a class. A library of static functions? In a class.
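A minimal sketch of the point (class and method names here are made up): the idiomatic escape hatch for a "library of static functions" is a final class with a private constructor, the same namespace pattern java.lang.Math uses, and even the program entry point must be a static method on a class.

```java
// In Java, even free-standing utility functions must live in a class.
// Conventional workaround: a non-instantiable "namespace" class, the
// same pattern java.lang.Math uses. MathUtils/clamp are made-up names.
public final class MathUtils {
    private MathUtils() {}  // prevent instantiation

    public static int clamp(int v, int lo, int hi) {
        return Math.max(lo, Math.min(v, hi));
    }

    // Even the entry point is a static method on a class.
    public static void main(String[] args) {
        System.out.println(clamp(42, 0, 10));  // prints 10
    }
}
```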


With an alternative design you'd need some other code-container concept, but it'd be almost exactly the same as a class file. If the JVM also had a separate notion of a non-class file, just to hold static methods, it'd boil down to duplicating the format. Most languages make the source file the first-class container concept, or in native languages the DLL/.so, but this isn't obviously better and comes with its own downsides. Orienting the VM around classes has a bunch of upsides that contributed to Java being successful.

The real complaint here is more about the syntax boilerplate involved in writing top level functions in Java that map to classes with static methods. Kotlin shows there's no deep reason why the Java language has to be written this way, it's just notation. They picked it to avoid complicating the language with special cases, but I can easily agree that smarter syntax is more important than the Java guys have historically believed. Kotlin is proving this at the moment.


Sure, classes act as namespaces in that case. What pain point does it cause?


Does the AbstractFactoryFactoryInterface and AbstractFactoryFactoryImpl count as two different classes...?

Kidding aside, I always found Java as a core language (and the basic standard library for it) to be mostly fine, but the weird architecture astronaut cult that hovered around it and infected many of the commonly used libraries to be a real problem.

Maybe things are different now for the language, I haven't touched it in many years at this point.


Yeah it was never the language or the core libs which were the problem, it was mostly some of the frameworks which had the FactoryFactory problem


I disagree with what I call Java's OOP obsession, but I have to admire the drive behind that ecosystem.


I'm disgusted by it, but admire the system's ability to run in spite of the mess the patterns generate.


At least with lambdas and `var`, they yielded a bit on the old ways.
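For what it's worth, a quick sketch of that loosening (example values are mine): the same comparator written as a pre-Java-8 anonymous class versus a Java 8+ lambda, with Java 10+ `var` inferring the local type.

```java
import java.util.Comparator;
import java.util.List;

public class OldVsNew {
    public static void main(String[] args) {
        var names = List.of("Carol", "Al", "Bob");  // 'var' infers List<String>

        // Old way: an anonymous class just to pass one method's worth of behavior.
        Comparator<String> byLengthOld = new Comparator<String>() {
            @Override public int compare(String a, String b) {
                return Integer.compare(a.length(), b.length());
            }
        };

        // Java 8+: the same comparator as a lambda.
        Comparator<String> byLength = (a, b) -> Integer.compare(a.length(), b.length());

        // Both sort shortest-first.
        System.out.println(names.stream().sorted(byLength).toList());     // [Al, Bob, Carol]
        System.out.println(names.stream().sorted(byLengthOld).toList());  // same result
    }
}
```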


Sounds like a typical enterprise Java hello world type application.


The 90’s were a weird time.



