Is having 1-3 Mac guys and 1-3 Windows guys (which is realistically all the vast majority of apps would need) really that much more expensive? To me it seems the real attraction is being able to hire from a giant pool of cheap, generic, interchangeable, fully replaceable developers.
> from a giant pool of cheap, generic, interchangeable, fully replaceable developers
A giant pool? Sorry, not really. JS skills are highly sought after, and finding good, experienced front end developers is very hard. Inexperienced developers are easy to find in any language, including your favorite one, whatever that is.
The "cheap, generic, interchangeable, fully replaceable" part just sounds bitter and vindictive, no need to respond to that.
When compared to the hiring pools for iOS and Android devs, the size of the web front end pool is certainly larger, because that’s what’s currently hot and has the lowest barrier to entry. You’re right that quality is just as hard to find, but given the performance issues that continually plague most modern web apps (with apps like VS Code serving as evidence that it need not be that way), I’m not convinced that quality is the highest priority with these hires.
I will concede that I am a bit bitter. It’s frustrating to watch the industry push ever harder to render specialized developers unnecessary.
If you compare the demand for websites vs mobile apps, there are about 5 million downloadable apps and about 644 million active websites according to Netcraft, i.e. roughly 129 websites for every downloadable app, which can loosely be read as that many web devs for every app dev. The pool for web devs should therefore be much larger.
As for quality, correctly assessing a dev's level of competence can be very tricky, regardless of the required skill set. With web devs it's even more tricky, because an "over-qualified" desktop/app dev can do just as much damage in terms of technical debt as a poor-quality web dev. A web dev is also a specialized developer, but due to the low entry barrier, many don't realize this.
You can reasonably easily share the bulk of your macOS/iOS/tvOS/watchOS codebases, and Apple is actively working to push that percentage higher. So really, you're not looking at 1-3 "Mac people", but rather 1-3 "Apple people".
Android dev is messy business, no question there, especially if you care enough about experience quality to test across a broad range of phones, OS versions, launchers, etc. It’s no wonder why the average Android dev team is 2x the size of the average iOS dev team. Not much you can do there but hope Google makes an effort to clean up (and it seems they are, albeit slowly).
Even with all these platform specialists accounted for, companies like Slack and Spotify aren't hiring any more engineers than they would otherwise. The only difference is that instead of hiring specialists, they're hiring hordes of web front end developers.
When there were way fewer personal computers and software was much harder to develop and distribute, companies would write software for Mac, PC, Amiga, Atari ST, and even Apple IIgs.
I think this is a fair comment but Chrome has been doing this a lot lately. They'll make a change and developers must scramble to fix their websites.
It's the same with autocomplete earlier this year. One day Google decides to ignore autocomplete="off" and all hell breaks loose.
Interestingly, they have since reverted this change. Google now respects autocomplete="off" in some scenarios (i.e. when autofill is not triggered via the name attribute).
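For concreteness, here's a rough TypeScript sketch of the two cases as I understand them. The field names are made up for illustration, and this reflects the reported behavior described above, not documented Chrome behavior:

    // Sketch only: illustrating when autocomplete="off" is reportedly respected vs ignored.
    const form = document.createElement("form");

    // Case 1: a generic name. Chrome's name-based autofill heuristics shouldn't kick in,
    // so autocomplete="off" is reportedly respected here.
    const query = document.createElement("input");
    query.type = "text";
    query.name = "search-query";               // hypothetical field name
    query.setAttribute("autocomplete", "off");
    form.appendChild(query);

    // Case 2: a name that looks like an address field. Autofill may still be triggered by
    // the name itself, in which case autocomplete="off" could be ignored.
    const street = document.createElement("input");
    street.type = "text";
    street.name = "street-address";            // hypothetical field name
    street.setAttribute("autocomplete", "off");
    form.appendChild(street);

    document.body.appendChild(form);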
I don't know if it's the same thing, but I'll bite: it's absolutely maddening to edit or select part of a URL.
Click in the address bar and the entire address is selected. Then you click on some part of it, either to select a portion or to place your cursor so you can add to it. At that point Chrome appears to first shift the entire URL to the right in order to show the protocol, and only then places the cursor within the shifted URL under your pointer. This causes (attempted) selections to start from some other place in the URL than the user intended.
This is about Google having a fundamental weakness in product management and UX; giving me the equivalent of a Windows Registry setting to change doesn't help, practical as it may be.
It's weird too, because Chrome invented the UX where, when you're closing lots of tabs, the browser doesn't reflow the UI but keeps placing each tab's close X under the mouse until you move away. So it's like they understood this once and have forgotten.
Same thing happened to me at my old software development job. The company went through a merger, but I left before it completed. I stayed for nine months, working from home, but with no work to do.
I did freelancing but eventually got very depressed. Quit my job, left my apartment and moved back with my parents.
Someone has to use the beta tools for real work, otherwise most edge case bugs wouldn't get caught. Yes, there's sometimes pain, and yes sometimes it isn't worth it. Usually it's fine.
I am perfectly capable of making a website work in Safari, but I am not able to test it because I don't want to spend $xxxx buying a MacBook for the purpose of refreshing a web page. Firefox and Chrome run on my dev machine, and Microsoft gives out a free VM image with Edge.
Potentially. Although in my experience Safari works 99% the same as Chrome, so I'd bet a bunch of devs just assume that if it works in Chrome it will work in Safari.