Sure, you may well end up writing one, but it's likely to be a fraction of the size. My current entire app runs at half the size of jQuery, which gets me awesomely quick page loads even on poor mobile connections. The dev time was longer, but not crazily longer, and the surface area for testing is somewhat smaller.
There are cases where a company has the business justification to write its own framework, as with Facebook. They can afford to dedicate human resources to really focus on latency at the cost of other metrics.
Other companies juggle their priorities differently, and would probably not want to focus their efforts on replacing React, which has a small API, constantly updated documentation, a team that responds to issues, and widespread attention and thus lots of discourse. I also think the React layer abstracts away complexity, trading total lines of code for cognitive ease. Some teams have the business justification for doing things their way -- but I think most don't.
Unfortunately I have to disagree. The JavaScript engine still has to parse and interpret the code, and that can take a non-trivial amount of time, especially on mobile devices.
Parse time shouldn't be very long at all, especially compared to network latency. How long does it take for a large bundle (~100-150kb minified) to be parsed and interpreted? I'd be surprised if it's more than 100ms.
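If you want to put a rough number on that for your own bundle, here's a minimal sketch (assuming a browser context; BUNDLE_URL is a placeholder, and since engines like V8 parse most function bodies lazily, this measures closer to a lower bound than a true full-parse cost):

    // Rough benchmark of parse/compile cost for a large script (TypeScript).
    // BUNDLE_URL is a placeholder -- point it at any ~100-150kb minified file.
    const BUNDLE_URL = "/static/app.min.js";

    async function measureParseTime(url: string): Promise<number> {
      const src = await (await fetch(url)).text();
      const start = performance.now();
      // new Function forces a parse/compile of the source without executing
      // it at top level; engines defer inner function bodies, so treat the
      // result as a floor, not a ceiling.
      new Function(src);
      return performance.now() - start;
    }

    measureParseTime(BUNDLE_URL).then((ms) =>
      console.log(`parse/compile took ~${ms.toFixed(1)}ms`),
    );

Run it a few times and take the median; the first cold run is usually the interesting one.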
That's barely above the threshold for detectable live-feedback latency from a musical instrument or sight/sound correlation. For a user viewing a page with plenty of other latency, it's inconsequential, particularly since you can execute your JS while image assets are still incoming.
Maybe, but in terms of understanding the code and having total control over it, it means a lot.
(And I don't buy that large teams can't adopt home-grown code in the first place. If they're any good, they can. If they're not, then even their React and Angular skills will be bad.)
Besides, you know all your business logic code? That will be custom too anyway, and they'll have to adopt it...
It's not so much the time over the wire to download; it's the time to parse and run all that JS. If you open up your web dev tools, I think you'll see the impact of a 50k gzipped JS file is probably greater than that of a 500k image. For one thing, the site can probably function without the image loaded, but your page will not be responsive to the user until that JS has finished executing.
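If you'd rather see those main-thread stalls directly than eyeball the dev tools timeline, the standard Long Tasks API reports them as they happen (a sketch, assuming a Chromium-based browser, where "longtask" entries are currently supported):

    // Log main-thread stalls (script parse/execute included) as they occur.
    const observer = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        // Any task over 50ms blocks input handling for its entire duration.
        console.log(
          `long task: ${entry.duration.toFixed(0)}ms at ${entry.startTime.toFixed(0)}ms`,
        );
      }
    });
    observer.observe({ entryTypes: ["longtask"] });

A heavy bundle typically shows up as one big long task right after its script tag is evaluated.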
My total image payload is 30k. Obviously getting there is more challenging for other sites, but at the end of the day, if your page is >1MB I'm probably not going to wait for it to load.
What's more relevant at Facebook's scale is dollars saved per kilobyte shaved, because the majority of its users will tolerate almost anything at this point, including apps that crash randomly to measure loyalty.