Hi. I'm the V8 engineer who implemented the fix and wrote the blog post.
The reason the initial bug report [0] was marked as WAI but the Medium blog post got a lot more traction is very simple and much more human.
When the initial issue was filed to the chromium bug tracker, it was routed to Chrome's security team. From a security perspective, Math.random() provides no guarantees about cryptographic safety, so naturally it was marked as WAI – nobody from the V8 team actually saw this issue. One of the suggestions "3. Make crypto.random(size)" was acted upon though, and so crypto.getRandomValues() was introduced in Chrome 11, about 10 months after the issue report. In terms of specifying and introducing new Web APIs, this was incredibly fast.
After the Medium blog post was published, someone filed an issue directly to the V8 project. It did not question the spec compliance, but pointed out that the PRNG quality could be better. That nerd-sniped me into researching this topic and after reading a few papers, I implemented the fix with xorshift128+. I'm thankful that my team lead allowed me to set aside some time to work on this even though it was not on our project roadmap.
Avoiding performance regressions on our benchmarks was of course part of the consideration, but there were several ways to achieve that. In V8, crossing the boundary between machine code compiled from JS and C++ runtime builtins is fairly expensive. I could either have ported xorshift128+ to assembly so that this boundary crossing was not necessary, or amortized the boundary-crossing cost by producing multiple random values per crossing and buffering them. I chose the latter because porting to assembly is error-prone and would have had to be done separately for each platform. There are better options in V8 nowadays, by expressing the algorithm in an intermediate representation, but back then that was not available.
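For readers curious about the two ideas above, here is a minimal Python sketch, not V8's actual C++ code: a xorshift128+ generator (Vigna's variant, with 64-bit arithmetic emulated by masking) plus a small buffering wrapper that illustrates amortizing an expensive call by refilling a batch of values at once. Class names and the batch size are my own illustrative choices.

```python
MASK64 = (1 << 64) - 1  # emulate uint64_t wraparound

class XorShift128Plus:
    """Sketch of the xorshift128+ PRNG. State must not be all zero."""

    def __init__(self, seed0: int, seed1: int):
        assert (seed0 | seed1) != 0, "state must be nonzero"
        self.s0 = seed0 & MASK64
        self.s1 = seed1 & MASK64

    def next_uint64(self) -> int:
        x, y = self.s0, self.s1
        self.s0 = y
        x = (x ^ (x << 23)) & MASK64            # scramble with shift-xor
        self.s1 = x ^ y ^ (x >> 17) ^ (y >> 26)
        return (self.s1 + y) & MASK64           # output: sum of state words

class BufferedRandom:
    """Amortize an expensive per-call cost (standing in for the JS-to-C++
    boundary crossing) by generating many values per refill."""

    def __init__(self, rng: XorShift128Plus, batch: int = 64):
        self.rng = rng
        self.batch = batch
        self.buf: list[int] = []

    def next(self) -> int:
        if not self.buf:
            # One "expensive" refill produces a whole batch of values.
            self.buf = [self.rng.next_uint64() for _ in range(self.batch)]
        return self.buf.pop()
```

Values drawn through `BufferedRandom` come out in a different order than direct calls (it pops from the end of the batch), which is fine for a PRNG; the point is only that the expensive crossing happens once per batch instead of once per value.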
I don’t understand. The page can’t access these things, only dev tools, so any action to expose it would still have to be mediated by user action; and even then, what’s so bad about exposing this? Everything in it is scoped to the document, and if it can expose things you don’t want exposed, then so can getEventListeners(), right? Yet getEventListeners returns an actual value. What’s the actual security problem of being able to list all objects on the JS heap?
I manage the team at Google that currently owns the Puppeteer project.
The previous team that developed Puppeteer indeed moved to Microsoft and have since started Playwright.
While it is true that staffing is tight (isn't it always), the number of open issues does not tell the full story. The team has been busy addressing technical debt that we inherited (testing, architecture, migrating to TypeScript, etc.) as well as investing in a standardized foundation to allow Puppeteer to work cross-browser in the future. This differs from the Playwright team's approach of shipping patched browser binaries.
> The team has been busy with addressing technical debt that we inherited [...] migrating to Typescript
Wow, not writing stuff in TypeScript is now considered technical debt? I knew people were already rushing to rewrite everything in TypeScript if they could, but I didn't know we'd come this far along the hype cycle already.
Yes, definitely. I've worked at two companies in the past three years, together spanning 250,000 employees, and both consider writing plain JavaScript deprecated in favor of TypeScript.
I used Puppeteer on a project recently to generate some really big and complex PDFs that would have been a massive pain to do any other way, so thanks for your work, and I'm very happy to hear that the project isn't dead.
Glad to hear that. Puppeteer still has a number of compelling things over Playwright (like not shipping patched binaries) so I hope competition in this space can continue to happen :)
"Each version of Playwright needs specific versions of browser binaries to operate." [0]
They patch and compile browser binaries so they have the functionality Playwright needs.
Their build of Chromium is one release ahead of what's out but it looks like one could maintain a library of older Playwright browser binaries to test with. They probably have an older Firefox 91 binary that's feature-equivalent to the current Firefox ESR. Their WebKit builds won't ever be exactly the same as Apple Safari.
For sure the blog is great! But I’m thinking something more along the lines of “here’s how you’d build something like this” or “here’s the stuff to read to get started on a project like this”.
While true, I was under the impression that there wasn't a cross-domain cache that wasn't opt-in. Again, though, maybe this is per-domain so it's moot.