
Very few apps that use React require sustained, fast DOM changes for a long period. Most web apps sit idle 99% of the time, and then need to manipulate a handful of DOM elements based on a user action - the user clicks a button or types in a box and a few things around the page update. That means the framework needs to make 4 or 5 DOM changes in one frame (about 10ms including the browser's overhead to render things.) The fact that this is slow in some web apps is beyond disappointing. They're doing practically nothing and they still feel terrible.

This is where I think React is getting things right. The work that's been done on React 18 attempts to work out what events are most important, and schedules the DOM changes from those interactions ahead of changes from other events. It batches the changes over a number of frames if there's a lot of them. This means that a UI made with React will probably be slower than other frameworks, but it'll feel faster. The changes that result from your interaction happen first. That's what users want.
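The prioritization idea can be sketched outside React. This is a toy model, not React's actual scheduler (which exposes this behaviour through APIs like startTransition): updates triggered by direct user input flush first, and lower-priority work is batched under a per-frame budget.

```javascript
// Toy interaction-first scheduler: two queues, urgent work drains first.
const urgent = [];
const deferred = [];

function schedule(update, { isUserInput = false } = {}) {
  (isUserInput ? urgent : deferred).push(update);
}

function flush(budget = 5) {
  const done = [];
  // Drain all urgent work first, then as much deferred work as the
  // per-frame budget allows; the rest waits for the next frame.
  while (urgent.length) done.push(urgent.shift()());
  while (deferred.length && done.length < budget) done.push(deferred.shift()());
  return done;
}

schedule(() => 'analytics');
schedule(() => 'button-click', { isUserInput: true });
console.log(flush()); // user-input work runs first: ['button-click', 'analytics']
```

The point isn't the queue mechanics; it's that the ordering policy, not raw diffing speed, is what makes the interaction feel fast.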

Ultimately every framework has an upper bound for performance, and if you're not hitting it then the framework speed doesn't really matter. If you are hitting it though, then React's approach is better because it optimizes for the bit the user cares about. The fact that Solid, Svelte, etc are technically better, and therefore faster, means there's lots of additional headroom for using that speed, but once you actually cross the threshold of what they can do things will start to feel slow very quickly.

And this is where that matters - many web developers just aren't great at writing code, so if the framework can scale a 'fix' for what they build that will result in a better user experience than simply giving the developer more speed. A faster VDOM is a good thing, and no VDOM at all is an even better thing, but ultimately you could make the fastest framework ever and some developers will still write things in ways that feel slow.

The right approach for fixing UI on the web is to make a framework that focuses on doing the important DOM changes first, even if it does them a bit slower than the other approaches.




I work on a gigantic React app... it would take me days to explain everything this app does... and we've not once had performance concerns with React... so I'm really wondering what the hell kind of apps everyone else works on that they have so many issues with it.


I have not explored why, but facebook.com seems incredibly slow.


Been a while since I worked on web apps. I agree with your sentiment that a lot happens up front and then small changes happen over time. "VDOM" and "just refresh the whole page" seem like consequences of how we express our DOM construction.

I'm curious if anyone is working on, or has had success with, using differentiable (in the math sense) expressions of DOM construction from state/events in order to allow the runtime to easily calculate diffs given state changes/events.


Slicing big updates and prioritizing some of them could also be achieved by tweaking the reactive core of Solid or Svelte.


Of course, but until those libraries actually do it I don't see the value in saying so. Any library could do it.


My point is that this feature is unrelated to vDOM, it happens to be implemented by a major vDOM lib.

But... I actually read your comment too quickly, you were not making that point :-)


Why do you think slow web apps are slow?


Poor/no architecture. On the client side: requiring multiple API calls that take tens of ms to respond, doing them serially, and trying to mask this with loading transitions that take up to a second and are hit multiple times during common use.
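When the calls are independent, issuing them concurrently cuts the wait from the sum of the round trips to the slowest single one. A sketch, using hypothetical fetchUser/fetchOrders helpers:

```javascript
// Serial: total latency is the SUM of both round trips.
async function loadSerial(fetchUser, fetchOrders) {
  const user = await fetchUser();
  const orders = await fetchOrders(); // only starts after the first finishes
  return { user, orders };
}

// Parallel: total latency is the SLOWER of the two round trips.
async function loadParallel(fetchUser, fetchOrders) {
  const [user, orders] = await Promise.all([fetchUser(), fetchOrders()]);
  return { user, orders };
}
```

With two 100ms endpoints the serial version takes ~200ms and the parallel one ~100ms, before any framework code runs at all.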

On the server side, a pile of microservices that are designed around team responsibility, with requests that require requests that require requests to respond to common API calls. The desire to write the backend in JS causes a very low perf ceiling on a single instance which means a medium sized web app needs a dns lookup, load balancer hit plus reverse proxy in place for (m)any of the API calls, even if they're internal/trusted.

These apps are tested and benchmarked on the highest performance professional computing devices on local networks with gigabit connections, and then deployed to 5-10 year old computers running on 10Mb connections shared between 4 people. The servers are deployed to a large number of low cost cloud instances running on a virtualization layer inside a virtualization layer on "enterprise grade" (read: slow) hardware with real world disk and network speeds that are orders of magnitude slower than what is used for testing.

Pick any/all of the above!


It's comforting reading this. Too often web apps are developed thinking about ease of development/developer convenience alone, on fast hardware, with a fast and reliable network.

If those same developers used a 10-year-old PC and a semi-crappy 4G network, I bet the overall quality of the products that come out would be ten times better.


I don't think that extreme is true either. I work in games and we have access to workstation hardware even though most of our players will inevitably end up playing on substantially lower end hardware. We set performance targets for subsystems, have thorough profiling available, and regularly _test_ on consumer hardware to gain the above metrics and work with them. That can be (and often is) done.


I think game devs are on the opposite side of the spectrum from web devs when it comes to performance testing, though.

PC gaming has always been performance-driven, and performance variability/customization has been part of the end-user experience since the first 3dfx cards and graphics settings screen (and before that too).

Web... until recently it wasn't really a consideration for users or devs, because everything was just basic HTML and CSS and it was big images or videos that were the problem.

Then, within like a decade, suddenly all these JS-heavy frameworks took over and everyone jumped on board, and connectivity has struggled to keep pace. Web developers (the humans) struggled to keep up too, with everyone having to relearn the framework du jour every year or two, and all the optimization techniques of the previous generations thrown out or made irrelevant by new frameworks & browser optimizations.

I've never met a web dev who seriously even considered performance beyond some superficial metrics -- I've never seen anyone use the profiler in Chrome or their IDE at all -- much less knew what to do about it even if they did. It's just not really a thing, at least in the small to med business space. Maybe if you're working in big tech or framework development that's different, but otherwise, performance is near the bottom of considerations for web dev.

Which is why we have articles like this every once in a while... it's actually newsworthy when people go "hey, JS is slow again, here's technique #33 million to speed it up", to which most of us will go "oh, that's nice, but I can't replace my whole stack just for one speedup, and besides, this new thing is going to be obsolete by September anyway."


You are saying that web apps are slow because of the server, when we all have seen web apps that are laggy even when showing animations when you hover over elements. Users with slow CPUs exist!


No, I'm saying that web apps are slow because there is no thought put into the architecture of the application, and the structure of the app likely matches the structure of the team, not the design of someone who considered holistically how it would work.


> laggy even when showing animations

Yes, often because those web apps use a JavaScript animation library like popmotion instead of just plain CSS transitions.

Or some dev writes an onScroll or onHover function with zero debouncing.
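A minimal debounce wrapper is all it takes to fix that class of handler. This is a generic sketch (the `updateHeader` handler in the usage comment is hypothetical):

```javascript
// Minimal debounce: the wrapped handler only fires after `wait` ms of
// silence, so a scroll that emits dozens of events per second results
// in one piece of work instead of dozens.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);                              // cancel the pending call
    timer = setTimeout(() => fn.apply(this, args), wait); // reschedule it
  };
}

// Usage (hypothetical handler):
// window.addEventListener('scroll', debounce(updateHeader, 100));
```

For continuous feedback (e.g. a sticky header that tracks scroll position) a throttle, which fires at most once per interval, often feels better than a debounce, which fires only after the events stop.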


Mostly due to developers writing code that synchronously waits for network requests to complete before updating the DOM. The user clicks, then nothing happens while the app sends a request, and the app only updates the DOM after the request completes. It's that 'nothing happens' step that makes things feel slow. They do other things too, like rerendering part of the app every time any part of the state changes rather than limiting the updates to just the bit that matters to that component, so the app is doing a ton of unnecessary work (React helps here; it ignores state updates that don't change anything).
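The blocking-vs-optimistic difference can be sketched in a few lines. The `saveToServer` and `render` helpers here are hypothetical stand-ins for a network call and a DOM update:

```javascript
// Blocking: nothing changes on screen until the round trip completes.
async function likeBlocking(saveToServer, render) {
  await saveToServer();   // user stares at an unresponsive button
  render('liked');
}

// Optimistic: reflect the action immediately, reconcile on failure.
async function likeOptimistic(saveToServer, render) {
  render('liked');        // instant feedback, before the request settles
  try {
    await saveToServer();
  } catch {
    render('unliked');    // roll back if the request fails
  }
}
```

The optimistic version feels instant regardless of network latency, at the cost of having to handle the rollback path.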

There are a lot of problems in web apps that make them feel slow, but they mostly distill down to developers following some bad practices that are easily avoidable. Web apps that are slow because they're maxing out what the framework is capable of are very, very rare. Making a faster framework won't fix the ones that are just coded badly. Making a more intelligent framework might.


> The user clicks, then nothing happens while the app sends a request, and app only updates the DOM after the request completes

I've not seen this too often frankly - that tends to result in a "hung" state for a UI where the app appears to not respond. The most frequent issue I've seen is long transition animations (500+ms) to mask network calls, even when they're not necessary.

> Making a faster framework won't fix the ones that are just coded badly. Making a more intelligent framework might.

This x1000


React does everything wrong: vDOM, components rendering multiple times instead of rendering once and then reacting to changes, manual memoization techniques. Hence why Svelte and Solid not only have a better dev experience, they're also much, much, much faster and will eventually take over from React as it goes the way of jQuery, Knockout, Backbone, etc.


Svelte is already 6 years old and nowhere near competitive with React in terms of popularity


Anything that has me write templates in HTML like it's PHP is a no-go for me. JSX reigns supreme for a reason.


I've built apps in both react and svelte, and I personally don't see a significant difference between them HTML-wise. .map vs each isn't a big deal, especially when you're usually constructing whatever you're iterating over anyway, etc.


Is JSX the prime driver of React's success? Even ignoring the big corporate backing, I'm not sure this is the case.


So… JSX isn’t another HTML templating mechanism?


JSX is pure JS, unlike other templating languages.


What does this mean? It's certainly not vanilla JS/ECMAScript. It's closer to XHTML than JS.


JSX is just syntactic sugar for React.createElement [0]. This means that I can write any valid JS I want in there. For example, if I use a functional programming library with a match statement, I can have that in my JSX, I don't need to adhere to what the templating authors came up with.

[0] https://fettblog.eu/jsx-syntactic-sugar/
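A toy `createElement` makes the desugaring concrete (React's real implementation returns richer element objects, but the shape of the transform is the same):

```javascript
// Toy stand-in for React.createElement, just to show what JSX compiles to.
function createElement(type, props, ...children) {
  return { type, props: props || {}, children };
}

// <li className="item">hello</li>  desugars to roughly:
const el = createElement('li', { className: 'item' }, 'hello');

// Because the children are plain JS expressions, no special template
// syntax is needed for iteration -- ordinary .map works:
const list = createElement('ul', null,
  ...['a', 'b'].map(text => createElement('li', null, text)));
```

That's the commenter's point: anything that's a valid JS expression can appear where a child or prop goes, rather than only what a templating language chose to support.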


...but what does React.createElement compile down to? HTML and JS. Whether you do it with a clientside runtime or a serverside render, it still gets compiled or transpiled down to DOM objects in the end.

I think of "pure JS" as something more like a standalone node function that takes an input and gives you some abstract data output, vs templating code whose main purpose is to define elements in the DOM.

That you can intersperse JS with DOM-like props in a JSX component (styles, states, handlers, whatever) doesn't mean that JSX isn't a templating language. It's just one that also accepts inline JS. Hell, you can do that with PHP and heredocs/template literals.

Aren't all templating languages "syntactic sugar"? Isn't that their point?


Yeah, it’s (language) integrated with a convenient HTML syntax… just like all the HTML templating languages.

(language) could be JavaScript as far back as 2000 as far as I know (ASP) — probably earlier and I just hadn’t heard of it.


SEO


Your choice of framework should make very little difference to SEO. SSR, SSG, ISR, etc will have far more impact.

Don't forget that Google's crawler is literally Chromium these days. https://searchengineland.com/google-will-ensure-googlebot-ru...


Page speed is key. You can see in Search Console how changes to page speed affect how often a page is re-indexed and how many pages are indexed. For some sites that does matter. It basically comes down to how much money Google will spend on your domain.


If you need your site to be indexed by Google it should be very close to plain HTML and CSS with pretty much no JS that changes the DOM at all. The Venn diagram of 'pages that need fast DOM updates' and 'pages that need to be indexed by Google' should really be two separate circles... That's why I suggested SSR or SSG.


We use SSG for most new projects. There are many sites out there though that are just React + backend API. They may benefit from a faster vdom.


There are very few React+backend websites that need their content to be indexed by Google. That part is important. The overwhelming majority of React code sits behind a login page that Google can't get past.

Anywhere an app is serving public content using React it should be using some sort of server side generation with hydration and progressive enhancement, which entirely negates the need for a fast VDOM for SEO reasons.



