
Cautionary tale for the ‘rewrite it in Rust’ camp



I think it's mainly a cautionary tale for the "let's write a browser engine" camp.


If anything, it’s an encouraging tale for the “let’s invent a new language for our rewrite of a complex application” camp.

The rewrite failed but the language lives on because the problem was general enough.


> The rewrite failed

Servo code, written in Rust to enable the use of multiple CPU cores to speed up rendering, was merged into Firefox quite a few years ago.

https://hacks.mozilla.org/2017/08/inside-a-super-fast-css-en...

Also, shout out to Lin Clark, whose blog posts for Mozilla back in the day set a high bar.


Fair enough. My impression was that Servo didn’t meet its original goals. Whether that counts as failure is a mindset question, I suppose.


> The rewrite failed

What rewrite are you talking about?

Servo was never intended to replace Gecko, so it can't be that.

Servo was always in Rust, so you're not talking about a rewrite there.

Servo delivered on its promise to be a sandbox for experiments that might end up in Firefox, so surely you're not talking about that either.


Some of the best things in Servo were taken over by Firefox, weren't they?


Yes.


Most programming languages were born out of such a need, not just the successful ones.


Rust is doing just fine. And your original comment is specious at best.


I was about to write that it's a cautionary tale for the "let's procrastinate on writing x by going meta and bikeshedding on tools and languages" camp, but pavlov beat me to it, interpreting what he wrote as sarcasm.


Depends. It made Firefox lose market share, but it also gave us Rust.


> Cautionary tale for the ‘rewrite it in Rust’ camp

Mozilla famously made multiple attempts to update Firefox's rendering engine to take advantage of multiple CPU cores, all of which had to be abandoned, before they switched to Rust and started to see some success.

> Parallelism is a known hard problem, and the CSS engine is very complex. It’s also sitting between the two other most complex parts of the rendering engine — the DOM and layout. So it would be easy to introduce a bug, and parallelism can result in bugs that are very hard to track down, called data races.

https://hacks.mozilla.org/2017/08/inside-a-super-fast-css-en...
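
To spell out the "data races" part of that quote: the failure mode is two threads mutating shared state without synchronization. Here is a minimal sketch of how Rust closes off that class of bug (illustrative only, not Stylo's actual code; the `styles` name is made up): shared mutable data has to live behind something like Arc<Mutex<...>>, and a version that hands a plain mutable reference to multiple threads won't compile at all.

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // Shared, mutable state. Capturing a plain `&mut Vec<i32>` in two
        // spawned threads is rejected at compile time, which is exactly the
        // kind of data race the quote describes. With Arc<Mutex<..>> the
        // compiler forces every mutation to go through the lock.
        let styles = Arc::new(Mutex::new(Vec::new()));

        let handles: Vec<_> = (0..4)
            .map(|id| {
                let styles = Arc::clone(&styles);
                thread::spawn(move || {
                    // Each worker must take the lock before mutating.
                    styles.lock().unwrap().push(id);
                })
            })
            .collect();

        for h in handles {
            h.join().unwrap();
        }

        println!("{:?}", styles.lock().unwrap());
    }

Getting a compile error instead of a crash or a heisenbug at runtime is the argument being made for Rust here.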


Rust may have helped, but I doubt it was the deciding factor. This needs more context.


There is already a link to an interview with Josh Matthews, who led Servo development, where he makes the case that moving to Rust from C is the factor that finally allowed the effort to succeed after three previous failed attempts.

https://news.ycombinator.com/item?id=36093636


Reminds me of the famous “you have to be this tall to write multithreaded code” poster :)


Not really. The Servo project delivered some amazing results which made Firefox's rendering much faster.


Firefox has been consistently slower than most other browsers for the last fifteen years.


That's just not true. Over 10 years ago I switched to Chrome because it was much faster, but it's been nearly 5 years since I switched back because Firefox was blowing it away.


Nope, that has always been highly workload dependent.



