There is definitely a significant lack of allergy to complexity among today’s software developers. Complexity is very bad, and it is usually not necessary. Lots of things that people write today are FAR more complex than necessary for no reason other than “best practice”.
Need to instantiate an object? Better have an AbstractObjectFactory, an IObjectFactory, an ObjectFactoryImpl, an ObjectFactory, and an ObjectBean or you will not make it through code review!
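A tongue-in-cheek sketch of the layering being mocked, using the (made-up) names from the joke — every type here exists only so the parody compiles:

```java
// Satire: the "best practice" layering required to instantiate one object.
interface IObjectFactory { Object create(); }

abstract class AbstractObjectFactory implements IObjectFactory {
    public abstract Object create();
}

class ObjectBean { }

class ObjectFactoryImpl extends AbstractObjectFactory {
    @Override public Object create() { return new ObjectBean(); }
}

class ObjectFactory {
    // A facade over the impl, naturally.
    private final IObjectFactory delegate = new ObjectFactoryImpl();
    public Object newObject() { return delegate.create(); }
}

public class Demo {
    public static void main(String[] args) {
        // Five types later, we finally have... a new object.
        Object o = new ObjectFactory().newObject();
        System.out.println(o.getClass().getSimpleName()); // prints "ObjectBean"
    }
}
```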
There is no room for simplicity in today’s software development world, no matter what language you use, unless you are ready to spend lots of time every single day explaining yourself. It’s sad.
Best practices exist for a reason. If you know for a fact that small, lightweight software is not what you are building, and your codebase will probably run to bazillions of lines, you can often guess that you probably *are* gonna need it.
If you want to make Chrome or LibreOffice, you're probably going to need some abstractions.
I've very rarely seen any true IObjectFactoryInstanceMethodAdderMixinAbstractInterfaceBean-level stuff; most of the complexity people complain about serves a purpose, at least as far as I can see. It facilitates building complex software with a large amount of reuse, without way more dev effort than one wants to spend.
Simplicity is usually harder than complexity, and involves a lot more design decisions to get right. Complexity is often just following a template. Need to encode something? What's the most popular JSON encoder in your language? Need web stuff? Pick the best-practice one; you know it will be around a while, and your team already knows it.
You can go overboard, and simplicity has virtues, but it's pretty cool how well everything can just come together without much friction if you really, really aggressively avoid doing anything interesting and "Just do what the industry standard is".
The difference I've noticed is that the complex stuff might bolt together quickly, but it's all broken 6 months later as the shifting sands that underpin it are blown by the prevailing winds of fashion. This sort of software requires constant maintenance, often by large teams, or it immediately begins to bitrot.
Well-thought-out simple stuff, on the other hand, can last for decades and often is quickly adapted to novel devices and environments.
The converse can be just as true: instead of using a feature-packed, stable dependency, some indie dev will reinvent the wheel because they think a subset of its features should be enough. Now that code can only handle 80% of the cases, because it turns out the dependency's complexity wasn't meaningless, and it is buggy and much less maintainable.
> you know for a fact that small lightweight software is not what you are building
Maybe this is the point to stop and ask yourself whether you're doing it right. There is no industry standard for complexity.
It's the "dd" vs "balena etcher" argument. There is very little benefit to the average end user from a literally 2000x increase in size/complexity, and for an experienced user the fancier tool is actually worse (you can't do "ssh foo cat blah | gunzip | dd").
dd vs Etcher is actually my canonical example of why I'm totally fine with complex software.
dd doesn't verify after flashing. It doesn't alert you when you're finished. It doesn't predict the time remaining.
On Reddit, I see about one "I wiped my disk with dd" post every month or two. (Although I suppose using by-path might have prevented some of them; it's slightly more obvious.)
Even more relevant is the raspberry pi imaging utility that also lets you set some configuration options when flashing.
It doesn't do anything you couldn't do with a shell script, and Etcher certainly doesn't actually need to be that big... but it already exists, and using Etcher instead of a custom script is one less custom thing to maintain.
I can tell anyone on any platform "Oh yeah just use Etcher" and have a near guarantee they will know exactly what to do and not break anything.
40-80MB seems like a lot. I'm pretty sure you could build a clone in PyQt in a weekend. Etcher is also missing some features (I'd really expect it to be a full studio with plugins to configure the images, make backups, and test in QEMU, considering how big it is).
But Etcher exists already. Using it is one less thing to worry about, one less custom script to possibly maintain, and one less possible mistake that could wipe a disk.
Outside of some one-to-two-decade-old frameworks, can you show me a single example of any of your ObjectFactories? Because I really think the field as a whole has moved away from such deep object hierarchies, and even OOP circles now try to use as few levels as possible.
Also, don’t forget that complexity can be divided into two categories: essential and accidental complexity. The former can’t be decreased at all. We should strive to minimize the latter of course, but we often forget about the first.
Just to be a bit more relevant: system boot is a very complex task, so the first type of complexity will be plentiful. It is basically a service/hardware dependency graph. Sure, you can hard-code individual cases (previous init systems basically did that), but hardware and services are sufficiently varied that this isn't really feasible, and every single hard-coded version can be buggy.
A declarative approach is much saner and it is no accident that basically every OS in wide use switched to that.
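The declarative idea can be sketched in a few lines: each unit lists what it depends on, and the start order falls out of a topological sort rather than hand-coded sequencing. This is a minimal illustration with made-up unit names, not any real init system's implementation, and it assumes the graph is acyclic (no cycle detection):

```java
import java.util.*;

public class BootOrder {
    // Declarative dependency graph: each unit lists what must start before it.
    // Unit names are hypothetical, for illustration only.
    static final Map<String, List<String>> AFTER = Map.of(
        "udev", List.of(),
        "network.target", List.of("udev"),
        "sshd", List.of("network.target")
    );

    // Depth-first topological sort: dependencies are emitted before dependents.
    static List<String> startOrder() {
        List<String> order = new ArrayList<>();
        Set<String> visited = new HashSet<>();
        for (String unit : AFTER.keySet()) visit(unit, visited, order);
        return order;
    }

    static void visit(String unit, Set<String> visited, List<String> order) {
        if (!visited.add(unit)) return;  // already handled
        for (String dep : AFTER.getOrDefault(unit, List.of()))
            visit(dep, visited, order);
        order.add(unit);  // all dependencies are in the list by now
    }

    public static void main(String[] args) {
        // Whatever the map's iteration order, udev always precedes
        // network.target, which always precedes sshd.
        System.out.println(BootOrder.startOrder());
    }
}
```

The point is that adding a unit means declaring its edges, not rewriting the sequencing logic.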
I'm not familiar with Java or their concept of trust managers, but I don't really see a major problem with it. It doesn't seem like there is anything confusing. I might not know why they did that, but I don't think I'd have trouble understanding code that uses it.
I'm guessing someone probably thought that a lot of the common use cases would be best done with a factory pattern, and they didn't want you to have to implement it all yourself.
It seems like a very strong form of One and Only One Obvious Way to Do It, extending so far as to have an opinion about what pattern to use.
Or, maybe there's just some other API that needs a factory for something, and so they have to directly expose it, and they don't want there to be more than one way to get one.
I would have to actually be familiar with Java to have a real opinion on it, but on the surface it seems kinda nice. It feels like the kind of thing that might help prevent hacky or quick and dirty code from being written in the first place, and possibly save time refactoring.
You don't think it's weird that there is a static factory method that you have to call in order to get a factory that you have to call in order to get the real thing? IMHO, that's exactly the kind of thing that Java gets rightly ridiculed for.
They could have had just a factory method on the TrustManager class itself.
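For readers who haven't seen it, the two-step dance being described looks like this in the real `javax.net.ssl` API (the printed class names depend on the JDK implementation):

```java
import java.security.KeyStore;
import javax.net.ssl.TrustManager;
import javax.net.ssl.TrustManagerFactory;

public class TrustDemo {
    public static void main(String[] args) throws Exception {
        // Step 1: call a static factory method... to get a factory.
        TrustManagerFactory tmf = TrustManagerFactory.getInstance(
                TrustManagerFactory.getDefaultAlgorithm());

        // Step 2: initialize the factory (null KeyStore = platform default trust store).
        tmf.init((KeyStore) null);

        // Step 3: finally ask the factory for the actual TrustManagers.
        for (TrustManager tm : tmf.getTrustManagers()) {
            System.out.println(tm.getClass().getName());
        }
    }
}
```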
Also, it is an old, deprecated part of the Java standard library, which was designed under different conventions. Since the public API can't really change without breaking backwards compatibility, some form of abstraction is required so that the private API can still be modified later.