I have a feeling desktop development is going to slowly revert to classic OOP paradigms, just like the web world detoured from MPA -> SPA -> SPA+SSR -> back to something that looks a lot like MPA again.
Classic toolkits like GTK, Qt, UIKit/AppKit, Swing and JavaFX use OOP to solve some of the problems this article talks about.
However, this OOP model seems to be somewhat incompatible with Rust's strict memory safety rules. At least I think it is, since I haven't seen any big effort toward a Rust OOP toolkit like this. Most of the existing Rust toolkits are immediate mode, ECS-based, or something "reactive" that looks like React.
While I understand the idea of moving away from "unsafe" languages like C/C++ to something like Rust, I wonder whether Rust is the right language for building UI applications. Maybe a better architecture is to use Rust for the core logic, as a library consumed by a higher-level language that meshes well with these OOP widget-tree and event-handling paradigms.
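To make that split concrete, here is a minimal sketch of the higher-level side only, assuming a hypothetical Rust crate built as a cdylib called "appcore"; the function name and JNI wiring are made up for illustration, and the Rust side would have to export the matching Java_AppCore_totalPrice symbol (e.g. with the jni crate):

    // Kotlin/JVM shell of a hypothetical Rust-core split: the widget tree
    // and event handling stay in the high-level language, and only pure
    // business logic crosses the FFI boundary. All names are made up.
    object AppCore {
        init {
            // Loads libappcore.so / appcore.dll built from the Rust cdylib.
            System.loadLibrary("appcore")
        }

        // The Rust side would export the matching JNI symbol
        // Java_AppCore_totalPrice (e.g. via the `jni` crate).
        external fun totalPrice(count: Int, unitPrice: Double): Double
    }

    fun main() {
        // Imagine this being called from a button handler in an OOP toolkit.
        println("Total: " + AppCore.totalPrice(3, 9.99))
    }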
I think we need other UI models instead of everything mature being object-oriented. Doing UI work in FP (or FRP) is great in many respects until you need to integrate with OO models like the DOM et al. Direct integration or a first-class VDOM-like model would be a nice step. There's a tangential issue in that all of the popular game engines are built around objects, too.
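As a toy illustration of what "a first-class VDOM-like model" could mean (all names hypothetical, diffing elided): the FP side produces an immutable value tree, and a separate reconciliation step mutates the OO widget tree to match.

    // Hypothetical VDOM-like layer: pure functions build immutable trees,
    // an imperative bridge patches the real OO widgets afterwards.
    data class VNode(
        val tag: String,
        val props: Map<String, Any?> = emptyMap(),
        val children: List<VNode> = emptyList(),
    )

    // Pure function of state -> tree; no widget references, no mutation.
    fun view(count: Int): VNode =
        VNode("column", children = listOf(
            VNode("label", props = mapOf("text" to "Count: $count")),
            VNode("button", props = mapOf("text" to "Increment")),
        ))

    // The imperative bridge: walk old and new trees and mutate the real
    // widgets where they differ (real diffing/keying elided in this sketch).
    fun reconcile(old: VNode?, new: VNode) {
        if (old == new) return
        println("patch <${new.tag}> with ${new.props}")
        new.children.forEachIndexed { i, child ->
            reconcile(old?.children?.getOrNull(i), child)
        }
    }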
Less so, I would say, because using OOP for everything is straightforward, just obtuse, verbose, and possibly slow.
You don't have to solve paper-worthy academic puzzles to use OOP everywhere.
Using OOP everywhere has been done and probably continues to be done in some shops. There was a time in the 1990s when nobody got fired for using OOP everywhere.
In Common Lisp, absolutely every value you can create has a class: the class-of function is meaningful for any value. You can replace all the defun forms in a Common Lisp program with defgeneric and defmethod.
What you're asking for is what Compose (the new Android toolkit) does. Being pure FP does have issues, though. There's no way to get a reference to anything on the screen, so some tasks that are basic and obvious in an OOP toolkit turn into a strange hack in Compose. For instance, to focus a text edit you have to create and memoize a "focus requester", pass the requester as a parameter to the TextField function, "launch an effect", and finally use the requester to request the focus.
Compare that to an OOP toolkit: call edit.focus() when the UI is first displayed and you're done. The reason you need a requester in Compose is that everything is FP and lacks identity, even though there actually is object identity buried deep inside; it's just hidden from you.
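For reference, this is roughly what that dance looks like with Jetpack Compose's FocusRequester API (a minimal sketch; in current Compose the requester is attached through a Modifier, which is itself a parameter to TextField):

    import androidx.compose.material3.TextField
    import androidx.compose.runtime.*
    import androidx.compose.ui.Modifier
    import androidx.compose.ui.focus.FocusRequester
    import androidx.compose.ui.focus.focusRequester

    // Compose: focus a text field when it first appears.
    @Composable
    fun NameField() {
        // 1. Create and memoize the requester across recompositions.
        val focusRequester = remember { FocusRequester() }
        var name by remember { mutableStateOf("") }

        // 2. Attach it to the field via a Modifier passed to TextField.
        TextField(
            value = name,
            onValueChange = { name = it },
            modifier = Modifier.focusRequester(focusRequester),
        )

        // 3. Launch an effect and finally request the focus.
        LaunchedEffect(Unit) {
            focusRequester.requestFocus()
        }
    }

    // Classic OOP toolkit equivalent, more or less:
    //   editField.requestFocus()   // e.g. when the window is first shown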
The FP model does have some benefits, but I think OP is right and we'll end up with a less radical mixed approach.
If something doesn't exist, maybe the reason is "it's a poor fit".
If you're going to assert that FP is a better fit for GUIs, you need some demonstration of that, especially since it's been decades since this meme started.