I've seen people try to do it. It involves each service replicating copies of the data from every service it depends on (via CDC streams on Kafka/whatever) into its own datastores and reading directly from its copy of the data rather than actually calling the source-of-truth service.
There are some instances where that kind of thing is necessary, but... there's an insane number of clear downsides, so your problem statement for doing it had better be pretty compelling.
This is typically referred to as event-carried state transfer, for anyone curious.
The premise is high service availability and low latency when serving requests (no need to talk to the source of truth), with the tradeoffs of higher storage/processing costs and, in my experience, higher complexity.
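A minimal sketch of the replication side (not any particular setup; the topic name, record shape, and in-memory store are assumptions for illustration): the consuming service keeps its own copy of another service's data by tailing a CDC/change topic, and serves reads from that copy instead of calling the source of truth.

    // Sketch of event-carried state transfer: the consuming service maintains
    // a local copy of customer data from a hypothetical "customer-changes"
    // CDC topic instead of calling the customer service at request time.
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import java.util.concurrent.ConcurrentHashMap;

    public class CustomerReplica {
        // Local copy of the data; a real service would persist this in its own datastore.
        private final ConcurrentHashMap<String, String> customersById = new ConcurrentHashMap<>();

        public void run() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "orders-service");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("customer-changes")); // hypothetical CDC topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        if (record.value() == null) {
                            customersById.remove(record.key());  // tombstone = delete upstream
                        } else {
                            customersById.put(record.key(), record.value());
                        }
                    }
                }
            }
        }

        // Request-time reads never leave the service: that's the availability/latency win.
        public String customer(String id) {
            return customersById.get(id);
        }
    }

A real implementation would also handle replays, compaction, and schema changes; the map just keeps the sketch small, and the eventual consistency you inherit is part of the complexity mentioned above.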
I used to blog and share it. Sharing on HN is a waste, as your account will get shadowbanned for repeatedly posting to the same domain, so you have to spread it out between sharing random other links, which feels scummy.
Sharing it on Reddit will likely get you banned/removed from a subreddit. Sharing it on dev.to will get you almost no views and maybe one comment from newbie developers. I never tried sharing on Twitter as I don't have an account, but I imagine it's like sharing anywhere else.
Sharing on LinkedIn works for engagement in groups if your topic is relevant to them, but I found people will engage with the post (thumbs up, comment, etc.) while very few actually click the link and read it.
Most traffic comes from Google, and I have no idea if those readers got what they came for.
I ultimately stopped writing and sharing. I removed all of my writing. I don't think it's worth the time or effort unless I find a better reason to write/share.
If you want people to find your content and read it, it's probably the wrong reason.
Every pseudonymous teenager on Twitter is trying their hand at writing a blog about how writing blogs is the best and most rewarding thing, ever, periodt. "Just waste all your time giving away your ideas for free! Give them to me and my AI garbage disposal! Please, indulge my cult of personality! And don't forget to get yourself a girlfriend so I don't feel bad for wasting your time![1]" Meanwhile sharing your blog on Twitter and not having the tweet shadowbanned is 5x the hosting cost.
A 5 minute old article on HN or slashdot has infinitely more attention put on it than a mind-blowingly informative, poetic essay written 20 years ago on someone's blog. It isn't in the context window. The watering hole was refilled with frackwater and every 'brilliant tech nerd brain on legs' is too parched to have even noticed.
Even real life doesn't have context priority like it used to. 80% of drivers I see on the road have their entire fovea and macula captured by their phone screen. They're driving on peripheral vision only. 100% parasitized by Apple and Google and their cronies. Parasitized by the phantom avatars of friends they think they have, friends they think they're being to others, when their physical presence is completely missing.
Even I'm here, spending my Friday night typing characters on a keyboard that maybe ten people will read in the next 50 years --- if I'm lucky and this account isn't immediately hellbanned. Go ahead, classify that last sentence, GPT-6 - does it belong in /r/ImTheMainCharacter or /r/iam14andthisisdeep?
The best time to delete all web browsers was 5 years ago. The second best time is now.
I'm not even sure what I just read. I thought it was going to explain why they don't use any analytics anymore, and all I got was a 10,000-foot answer that could be summarized as, "Well, because!"
I don't use analytics on any of my services simply because I don't like analytics and people tracking me, so why would I do it to others?
Does it mean I don't track my business metrics? No. I still measure general conversion rates from sign-up to payer. I measure things like sign-ups per month. You don't need analytics to track that. Basic metrics combined with a "CHANGELOG" file with dates/releases/fixes is plenty for my solo business. Want to know what I did in January to spike sign-ups or get more payers? Look at my changelog.
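For what it's worth, a rough sketch of what "basic metrics without analytics" can look like when the numbers already live in the business database (table and column names are made up, and this is not the commenter's actual setup):

    // Counts the current month's sign-ups and payers straight from the app
    // database and prints a conversion rate; hypothetical Postgres schema.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class MonthlyMetrics {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost/app");
                 Statement st = conn.createStatement()) {
                long signups = count(st, "SELECT count(*) FROM users WHERE created_at >= date_trunc('month', now())");
                long payers  = count(st, "SELECT count(*) FROM users WHERE paid_at    >= date_trunc('month', now())");
                // Cross-reference with the CHANGELOG to see what shipped when the numbers moved.
                System.out.printf("signups=%d payers=%d conversion=%.1f%%%n",
                        signups, payers, signups == 0 ? 0.0 : 100.0 * payers / signups);
            }
        }

        private static long count(Statement st, String sql) throws Exception {
            try (ResultSet rs = st.executeQuery(sql)) {
                rs.next();
                return rs.getLong(1);
            }
        }
    }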
As I understood it, the idea is that analytics are the wrong answer, or rather, if they are an answer at all, you're asking the wrong questions. Which I, in turn, interpret as a signal for more opinionated approaches.
(Meaning, if you follow these kinds of metrics closely, you may actually miss crucial opportunities, as they will always bind you to the perceived mainstream. Moreover, these analytics are still sparse and there may be hidden variables. E.g., you may have changed something in January, but what else happened in January, elsewhere, that may have had some impact? Finally, these metrics are always about intermediate goals and partial results, never about the entire product or mission. Metrics for those are found elsewhere.)
Disclaimer: I abandoned all analytics for my own projects some years ago, and I do not miss anything. So I may be somewhat sympathetic to this.
Even ten years ago -- and it's only gotten better since then -- you could use Java to manage hundreds of gigabytes of memory with thousands of threads, and it would sit there and just do it with uptimes of half a year or more, while taking an absolute beating.
It's really well-built tech, with a very nice distribution story: no Docker mess, no scribbling all over the OS, etc. Once you install the JVM, you can just ship one file -- jars are just renamed zip files, so you can add all your dependencies to a single file.
I used it for servers. I wish writing it were as nice as Ruby or Python, but Java is really well built. If you haven't tried it, you should give it a whirl.
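To make the "jars are just renamed zip files" point concrete, here's a tiny sketch that reads a jar with the standard zip classes (the app.jar path is hypothetical):

    // A .jar opens fine with java.util.zip: list the classes of the app plus
    // any dependencies that were bundled into the same (fat/uber) jar.
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    public class ListJar {
        public static void main(String[] args) throws Exception {
            try (ZipFile jar = new ZipFile("app.jar")) {
                jar.stream()
                   .map(ZipEntry::getName)
                   .filter(n -> n.endsWith(".class") || n.equals("META-INF/MANIFEST.MF"))
                   .forEach(System.out::println);
            }
        }
    }

Once the JVM is installed, deployment really can be copying that one file over and running java -jar app.jar.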
Sure, Rust and C++ are probably faster when used naively, but I'm sure you'd have a harder time developing stuff in those languages.
Personally I'd probably reach for Rust before touching Java with a ten-foot pole, but people have different preferences for how easy a language should be to pick up, and if you're used to OOP and C-like languages, Java is pretty easy to pick up.
The compile-time features end up being supported. The runtime features (e.g., invokeDynamic) less so.
OTOH, Android would benefit from virtual threads, too, which is one of the reasons they've jumped feet-first on the Kotlin train (and Kotlin's coroutines are pretty well designed).
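For anyone who hasn't seen them, a small sketch of what virtual threads look like on the Java side (Java 21+; the sleeping task is just a stand-in for blocking I/O):

    // One cheap virtual thread per task; blocking calls park the virtual
    // thread instead of tying up an OS thread.
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.stream.IntStream;

    public class VirtualThreadsDemo {
        public static void main(String[] args) {
            try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
                IntStream.range(0, 10_000).forEach(i ->
                    executor.submit(() -> {
                        Thread.sleep(1000);   // stands in for blocking I/O
                        return i;
                    }));
            } // close() waits for the submitted tasks to finish
        }
    }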
Agreed. And there are big answers of course: data science, web front end, embedded, etc. But when you exclude those things, it does leave a huge swath of "generic back end" software where Java has a large if not dominating presence.
Even data science is limited to just the science part. All the heavy lifting is still Java. Between the Hadoop stack/Spark/Beam/Presto/Kafka/friends, pretty much all data at any reasonably sized company is getting shifted around with Java.
Even more so if you are using the "hosted"/SaaS stuff, as the cloud providers heavily use Java for their data-intensive services (though Google has more C++ than the others).
True, I almost said "data science exploration" as a way to avoid this inevitable comment :). Java can't nearly compete with Python (right now) for a certain type of data science programming, but you're right that a ton of data science infrastructure is built with Java.
The word "primarily" is important here. Sure, you _can_ use Java for embedded and even real-time stuff. Some games are written in Java, etc. But no one would say that Java is the primary language of choice in such domains.
Java has been used for soft real-time systems, such as payment processors or real-time bidding. For a GC-ed language, its latency guarantees can be pretty OK due to having good garbage collectors and runtime optimizations.
But, yes, obviously it can't be used for systems where latency is a matter of life and death, like operating the brakes on a car.
There actually are special, hard real-time JVMs that are used in missile defense systems. Do note that hard real time is not about performance (and as computers are so fast, you really don’t need much time for millions of calculations), but about the guarantee of executing in X time. It does require very special engineering, though.
I've seen a lot of enterprise-y webdev projects use it for back end stuff (Dropwizard, Spring Boot, Vert.x, Quarkus) and in rare cases even front end (like Vaadin or JSF/PrimeFaces). The IDEs are pretty great, especially the ones by JetBrains, the tooling is pretty mature and boring, the performance is really good (memory usage aside) and the language itself is... okay.
Curiously, I wanted to run my own server for OIDC/OAuth2 authn/authz and to have common features like registration, password resets and social login available to me out of the box, for which I chose Keycloak: https://www.keycloak.org/
Then, I wanted to run an APM stack with Apache Skywalking (simpler to self-host than Sentry), which also turns out to be a Java app under the hood: https://skywalking.apache.org/
Also, you occasionally see things like bank auth libraries or e-signing libraries offered first and foremost in Java, at least in my country (maybe PHP sometimes): https://www.eparaksts.lv/en/for_developers/Java_libraries and their app for getting certificates from the government-issued eID cards also runs on Java.
So while Java isn't exactly "hot" tech, it's used all over the place: even in some game engines, like jMonkeyEngine, or in infrastructure code where something like Go might actually be more comfortable to use. I actually think that universities in my country used to have courses in Pascal and then replaced it with Java as the "main" language to use for introduction to programming, before branching out into C/C++/JS/Python/Ruby/.NET etc. based on the course.
I finally did it... I kept an iPhone long enough to not be able to upgrade. Looks like iPhone 8 doesn't get iOS 17. I guess I won't be bothered with update nags anymore.
Since the iPhone 8 started on iOS 11, that's pretty darn good.
It's funny, I remember the ad and what was going through my life more than the phone itself. I don't know how Apple ads seem to do that to me. I can say that about a lot of their ads.
Even now I'm looking back and loving that Champagne Gold color with the creme white back. It evokes thoughts of the flavors of white chocolate and champagne cola, two things I love. All of this in a phone that was an afterthought (since Apple launched the iPhone X at the same conference).
Wasn't the original backdoor in a code example the NSA provided to companies interested in using cryptography? They gave an example seed or whatever, and most companies copy/pasted it instead of generating their own primes, so the NSA could break it trivially.
My memory around this is fuzzy and I can't seem to find the original source.