I've always felt like there are two types of frameworks. Both make the first x% of the work relatively easy, but then they diverge based on how easy it is to drop down and go custom.
A framework with strong primitives makes it easy to create custom functionality that is still consistent with the framework's conventions. But a framework that relies too much on magic makes this much harder. It's like all of a sudden you're surrounded by old Mainers saying "you can't get there from here" while you desperately try to cobble together some kind of solution using ancient lore documented once, long ago, in a forgotten scroll.
The best frameworks are the ones that cater to users working at a range of abstraction levels. Akka is a good example of a project that aspires to this and largely succeeds at it. If you are working at a particular abstraction level and need to drop down, it's a consistently polished experience. I have never had a shocking moment where I opened a door that users weren't supposed to open and saw something that users weren't supposed to see. Everything is created to be seen, understood, and used. And even in the rare cases where it's not supposed to be used, it's still designed to be seen and understood!
It's funny to hear someone say so many nice things about Akka.
I work with Akka Streams, Actors, and HTTP every day, and while I agree that Akka is really well documented, incredibly powerful, and, as you say, polished, it is also the bane of my existence. It has made what could have been a simple application a rather painful experience for newcomers and even very senior engineers. The complexity of a framework like Akka is a huge barrier to entry for engineers and hinders development more often than it helps.
I wonder if this is the type of complexity that comes by default with state machines? I have the same feelings about complex React component structures as well. There is something about state machines where the command flow (that updates the states) feels "hidden", and I haven't yet found a way to design or document this kind of programming cleanly. Instead I have copious documentation explaining what each combination of states means and how it can be triggered.
They get big, but it is not right to confuse size with complexity. A huge state machine (size measured by the number of states) can be quite simple, both as a whole and in its parts: each state transition can be isolated and considered independently.
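To make that concrete, here's a minimal sketch of the idea using Akka Typed behaviors (the Door protocol is hypothetical): each state is its own behavior, so adding more states grows the machine's size without entangling its transitions.

```scala
import akka.actor.typed.Behavior
import akka.actor.typed.scaladsl.Behaviors

object Door {
  sealed trait Command
  case object Open  extends Command
  case object Close extends Command

  // Each state is a self-contained behavior; each transition is a
  // local decision that can be read and tested in isolation.
  def closed: Behavior[Command] = Behaviors.receiveMessage {
    case Open  => open
    case Close => Behaviors.same
  }

  def open: Behavior[Command] = Behaviors.receiveMessage {
    case Close => closed
    case Open  => Behaviors.same
  }
}
```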
I'm also very positive on Akka, having been deep into Cluster and Streams and Actors over the last few months. The barrier to entry is high because the problems Akka is appropriate for are complicated. Every time I want to complain, I imagine managing state in a performant way using "simple" thread locks or Futures, and then magnify that over a distributed application.
It's not going to be more performant than the constructs it's built on for a Hello World application. But unless you are a deep expert in concurrency and memory management (and have a team who can keep up), if you have a complex problem of concurrency and scale, then an application built in Akka has high odds of reaching production readiness and staying maintainable, while remaining performant and avoiding memory issues and shared-state performance problems.
That difference in the odds of success increases by an order of magnitude again once you start talking about distributed data.
The most obvious cases for actors are usually anything where the time to process something isn't predictable and you need to manage the state of that process. The common example is chatroom session management, where some sessions might last 5 seconds and others might last for days. Likewise, game companies have built multiplayer gaming servers using Akka. Telcos and others are common users.
For me personally, I'm using Akka to manage the processing and reprocessing of a lot of entities through some algorithms where the processing time can't be known upfront and the data is unbounded.
If you want something to tinker with, building a chat application backend is probably the easiest way to see the complexity of managing all that (there are also some demos in the docs).
The other decent alternative for some of these problems is often some flavour of message queue plus event sourcing. E.g. for the chat system you could probably shoot messages through a broker and then send a "chat finished" message. The disadvantage is that if your broker is under heavy load, your chat app degrades linearly as load increases, whereas actors holding state let a lot more happen concurrently without waiting for events to work through a queue; hence the latency advantages. And if your servers are ephemeral and changing constantly, keeping session state alive is yet another level of complexity to roll by hand, while it comes out of the box at production standard with Akka Persistence and Sharding.
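For a feel of what the actor side looks like, here's a minimal sketch of a per-session actor in Akka Typed (the protocol is made up for illustration): one actor per session holds the session's state in memory for however long it lives, with no locks and no shared state.

```scala
import akka.actor.typed.Behavior
import akka.actor.typed.scaladsl.Behaviors

object ChatSession {
  sealed trait Command
  final case class Post(user: String, text: String) extends Command
  case object End extends Command

  // One actor per session, whether it lasts five seconds or several
  // days. Messages are processed one at a time, so the history needs
  // no locking.
  def apply(history: List[String] = Nil): Behavior[Command] =
    Behaviors.receiveMessage {
      case Post(user, text) => apply(s"$user: $text" :: history)
      case End              => Behaviors.stopped
    }
}
```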
A chat system is one of the first network programming exercises. You must be trying to satisfy requirements that go beyond the basics. Assuming persistent chat rooms are required, do you really need anything beyond some tables in a relational database that model chat rooms and chats?
This isn't about a hello-world-level application; this is about an application managing complex state at scale. Maybe some bytes representing audio, sent between points at the same time across several different sessions. I assure you the problem set of Slack, or of people working on Fortnite servers, is not the same as a network programming exercise, even if it starts from the same point.
If you're talking about Akka Cluster Sharding, basically any situation where you have distributed objects that have their own state and behavior. I've had two projects where objects needed to stay in memory so they could receive messages, update their state, and then behave by sending messages to other objects. And it was too many objects to live in one server. The naive solution of data repositories and caching isn't really sufficient for this, and a horizontally scaled Akka Cluster Sharding setup with Persistence was the right solution given the requirements.
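As a rough sketch of what that setup looks like with the typed API (the Counter entity and its protocol are hypothetical, and Persistence is omitted for brevity): each entity id maps to exactly one live actor somewhere in the cluster, and Akka routes messages to it and rebalances entities across nodes.

```scala
import akka.actor.typed.{ActorSystem, Behavior}
import akka.actor.typed.scaladsl.Behaviors
import akka.cluster.sharding.typed.scaladsl.{ClusterSharding, Entity, EntityTypeKey}

object Entities {
  sealed trait Command
  final case class Update(delta: Int) extends Command

  // A stateful entity that must stay in memory to receive messages.
  def counter(value: Int = 0): Behavior[Command] =
    Behaviors.receiveMessage { case Update(delta) => counter(value + delta) }

  val TypeKey: EntityTypeKey[Command] = EntityTypeKey[Command]("Counter")

  def init(system: ActorSystem[_]): Unit = {
    val sharding = ClusterSharding(system)
    // Register the entity type; the cluster decides which node hosts
    // which entity and rebalances them as nodes come and go.
    sharding.init(Entity(TypeKey)(_ => counter()))
    // Messages are addressed by entity id, not by node.
    sharding.entityRefFor(TypeKey, "counter-42") ! Update(1)
  }
}
```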
I guess when I've encountered complexity in Akka, I've felt that it was in proportion to the difficulty of the problem I'm solving. In the future, if/when I get to work with frameworks that solve the same problems while exposing me to less complexity, then Akka will be obsolete in my mind.
Unlike, to use an example that is close to my heart, Atlassian's @atlaskit, which, while open source, was clearly not meant to be used outside an Atlassian app.
> A framework with strong primitives makes it easy to create custom
> functionality that is still consistent with the framework's conventions.
> But a framework that relies too much on magic makes this much harder.
This is what Laravel got right. It uses "Facades" (which differ from the Facade pattern) for the magic, but you don't have to use them. So socpers (Stack Overflow Copy-Pasters) can copy and paste their Facades without understanding them, but when I need to fix some edge case or write some functionality the framework designers did not anticipate, I can do it by reaching for the lower-level objects.
I've got tons of criticism of Laravel, but this is one thing that was done really really well.
I personally categorize frameworks into heavy and light frameworks.
I don't have a very good explanation of what I mean by heavy/light frameworks; this is the best I can do at the moment:
Heavy frameworks in my understanding are the ones that force constraints on your development process, application design, choice of other technologies and frameworks, on the types of products you can make with it, etc.
On the other hand light frameworks are like extensions of your fingers. They don't pretend to know better than you what you are trying to achieve. They just make a particular task easier for you.
The issue here is that heavy frameworks do not compose together very well.
You can safely have one heavy framework in your application.
When you start putting in multiple heavy frameworks, they start colliding with each other, and you spend more and more time resolving problems that are not inherent to the domain problem you are trying to solve but are purely accidental, due to the technology choices you have made.
I don't agree that it's useful (not sure if you were being ironic). All of the "frameworks" I've used were determined to make my code a cog in their machine. I haven't come across a framework that was content to be a cog in mine.
Not being sarcastic. Let me rephrase, because I think we are saying the same thing. No framework was happy to be a cog. The ones that were, I like to call libraries instead.
I say this point of view is useful because it seems like an elegant/simple way to distinguish them.
Composing light frameworks can make for a frustrating developer experience. Each piece is individually lighter, but combined, the end result may be more "uneven" than the cohesion of a single heavy framework. VS Code is one example of the composition-of-lights approach.
The Rails community sees this a lot. Rails is fairly heavy, and there are some great lightweight solutions like Sinatra.
But what I’ve seen many times is that by the time you finish adding all the missing bits you end up with something as heavy as Rails, but non-standard. So now maintaining it is a huge burden.
Another perspective is whether the framework is declarative or imperative.
If your code describes what is to be done without making the motions itself, you're using a very heavy framework or, honestly, a DSL. These frameworks have lots of magic built into them and favor convention over configuration.
If your code performs actions imperatively, and moreover if it's explicit rather than implicit, then it can be easy to wrap or replace the framework. It's like a library.
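To illustrate the contrast, here's a sketch using Akka HTTP, which happens to offer both styles (the /users route is made up). The declarative route describes what should happen and the framework's runtime decides when your code runs; the imperative handler makes the motions itself, so it's easy to wrap, test, or replace.

```scala
import akka.http.scaladsl.model._
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route

object TwoStyles {
  // Declarative: a description of the handling, interpreted by the
  // framework's runtime. Convention-heavy, DSL-like.
  val declarative: Route =
    path("users" / Segment) { id =>
      get {
        complete(s"user $id")
      }
    }

  // Imperative and explicit: your code performs the actions itself,
  // so it can be wrapped or swapped out like any library function.
  def imperative(request: HttpRequest): HttpResponse =
    request match {
      case HttpRequest(HttpMethods.GET, uri, _, _, _)
          if uri.path.toString.startsWith("/users/") =>
        HttpResponse(entity = s"user ${uri.path.toString.stripPrefix("/users/")}")
      case _ =>
        HttpResponse(StatusCodes.NotFound)
    }
}
```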
But some frameworks allow you to have both convention and configuration. In ASP.NET you start with convention, which is good for development speed. If you need to replace functionality provided by the framework or do something another way, you can do that with ease.
I've always liked Django's balance of declarative and imperative. The former gives you a clean and terse approach for the common use cases but there's always an escape hatch into the latter.
I think you want a framework that is heavy but swappable for stuff that is critical and easy to get wrong (e.g. authentication, HTTP header parsing, input validation, routing, preventing SQL injection, etc.). Most of these are "finalizing" concerns anyway, so you don't really need them to compose.
And light for everything else (e.g. authorization, operations, templates, etc).
Being able to swap things out is one way to make a framework lighter.
Light ones: if I have a problem with the framework, I am guaranteed to have a local solution without getting rid of the entire framework. For example, if Spring Framework does not let me create an object in a particular way, I can write my own function to create the object. If the way configuration is located does not suit me, I can replace the mechanism without getting rid of Spring Framework or even of configuration, just by implementing a couple of interfaces.
Heavy ones: if I want to do something that the framework's creators have not planned for, I can run into problems that I have no way out of without getting rid of it all.
An example would be systemd: it is built in an unnecessarily heavy way by insisting that you use all of it. For example, if you don't like journald or can't use it for some reason, tough luck. The only way to get rid of journald is to get rid of systemd entirely.
Maybe this was so in 2013-2015, but React today does a lot of things (module loading, component rendering, basic state management, scheduling, memoisation, caching), and will soon do even more (server components in a custom JSON format, a rendering server). Gone are the days when React was a simple "view library".
React doesn't load modules; you need a bundler for that. It just supports a component type that blocks rendering until a promise containing a component resolves.
I agree that it is arguable, but I would say that React is a framework for this reason: usually you do not decide when to call React code, instead React code decides when to call your code.
> React does one thing. It’s arguably not a framework at all.
But that's React's failure, too. The lack of officially supported libraries for common things such as routing leads to the horrible situation of multiple competing routing implementations, each with its own inconsistencies, and moving from one React project to another transfers very few skills compared to other libraries such as Vue.
As someone currently using Rails, I would say React is nothing like Rails. Rails is just magic everywhere, whereas React feels very explicit about what it does.
What parts of Rails are magical to you? This is an exaggerated stereotype imo. Spring framework felt way more "magical" than whatever Rails is doing to me.
You can look at the Rails docs and it's all explained pretty clearly to me...
I went through a cycle of: study Ruby -> start using Rails -> seems magical -> realize you don't know Ruby yet -> study more -> Rails no longer feels magical. I suspect this is typical, given that there is a lot more to Ruby than appears on the surface, and Rails heavily leverages its metaprogramming capabilities.
It's more analogous to an alternative history where the Ruby framework most people used was Rack [directly], not anything built on top of it (i.e. Rails).
The beloved and overhyped Next.js which I dislike with a passion is of the second type. It goes as deep as the simple examples from its documentation, and no deeper than that.
To be fair to Next.js, this is largely because its primary value is a Webpack config. Once you need to customize the build, you’re not customizing Next.js anymore, you’re working directly with Webpack.
From my experience, the libraries and frameworks that have well designed primitives (and ideally primitive interop within the community) are the ones that generally “win.”
Also in my experience, very few libraries and frameworks are built on reusable internal primitives. I just bide my time until those frameworks die so I don't have to learn them.
Seems determined by how good the underlying primitives are.
If the framework needed a completely different abstraction to make it easy to work with, dropping down to a lower level will be incompatible, difficult and confusing.
Isn't that deliberate? Isn't it even in the name? "on rails" - you are supposed to be stuck on the track laid out for you, but the benefit is that you can move more goods with less effort as long as you do it on the given paths.
Yes, calling Rails magical is a bit lazy imo. There is no magic, and everything is documented, to the point that I would say it's actually quite a beginner-friendly framework. You can say you disagree with some of Rails's APIs/conventions, but that's very different from "magic". I think a lot of the complainers are people who just don't want to bother with Rails; they are not making the minimal effort of reading most of the docs. Maybe there was one legacy Rails project at their job that was a mess and interfered with their efforts to learn Node/React/cool new stack. If that's your mindset, then yes, Rails is very "magical".