This is actually pretty hard to do. We've all heard of internet rot / the dead internet theory.
One shout might be to link to the Internet Archive instead of the resource directly, though we can't be sure the Internet Archive will keep its current system working as is (e.g. search params may change).
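For what it's worth, the Wayback Machine currently exposes both a predictable URL scheme (https://web.archive.org/web/&lt;timestamp&gt;/&lt;url&gt;) and an availability API. A rough Python sketch of using the latter, with the endpoint and response shape as they exist today, which is exactly the part that might not stay stable:

```python
# Toy sketch: resolve a link through the Wayback Machine's availability API.
# The endpoint and its params are as documented today; no guarantee they stay stable.
import json
import urllib.parse
import urllib.request

def wayback_link(url, timestamp="20200101"):
    """Return the closest archived snapshot URL for `url`, or None."""
    query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    with urllib.request.urlopen(f"https://archive.org/wayback/available?{query}") as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

print(wayback_link("example.com"))
# e.g. http://web.archive.org/web/20200101000000/http://example.com/
```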
The only solid solution I have is to set up a foundation and pour money into it so that it's responsible for upkeep. But that would be a hassle to execute.
100 years is a very long time in internet terms; the internet as we know it now hasn't been around for 100 years.
I wonder if there aren't services that specialize in this.
Have thought about this a bit. I think there's a "business model" where a non-profit foundation charges a very high price (say $50 or $100/gig) and the interest on that pays for the hosting and admin. One issue is startup risk: if you don't get enough people wanting to store data "forever", it won't be sustainable.
The foundation has a remit to also do some related "good works". The idea is that the pot of money (and the interest it throws off) acts as an incentive to keep the foundation going. Eventually the cost of hosting "legacy" data should drop close to zero. You could run it as an overlay on two clouds initially to avoid capital outlay.
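Back-of-the-envelope, the endowment math seems to work out. All numbers below are assumptions I picked, not quotes from any provider: a 3% real return and roughly cold-archive-tier storage pricing.

```python
# Rough sanity check on the "interest pays for hosting" model.
# All inputs are assumptions for illustration.
upfront_per_gb = 100.0             # one-time fee charged per GB ($)
real_return = 0.03                 # assumed real annual return on the endowment
storage_cost_per_gb_year = 0.012   # roughly cold-archive pricing today ($/GB/year)
overhead_multiplier = 5.0          # admin, replication, egress, "good works"

annual_income = upfront_per_gb * real_return                   # $3.00 per GB/year
annual_cost = storage_cost_per_gb_year * overhead_multiplier   # $0.06 per GB/year

print(f"income per GB/year: ${annual_income:.2f}")
print(f"cost per GB/year:   ${annual_cost:.2f}")
print(f"headroom: {annual_income / annual_cost:.0f}x")          # ~50x
```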
I think you would want librarians / archivists on the board. It wouldn't require much in the way of software; making something that could last in the long term is more of a governance problem than a technical one.
GraphQL is good at complex specificity: it leaves it up to the caller how it wants the returned data structured.
The struggle with it is more on the provider side, as you'd have horrible performance if you just rawdogged an ORM onto it. The complexity of combining data sources is where you'll struggle.
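To make that provider-side pain concrete: the classic failure mode is the N+1 query problem, where a naive per-field resolver turns one GraphQL request into one database query per object, and the usual fix is DataLoader-style batching. A rough Python sketch, where the db_fetch_* helpers and the post/author model are made up for illustration:

```python
# Sketch of the N+1 problem behind a naive GraphQL resolver.
# `db_fetch_author` and `db_fetch_authors` are hypothetical DB helpers.

# Naive: the resolver runs once per post, so 100 posts = 100 author queries
# on top of the 1 query that fetched the posts.
def resolve_author_naive(post, info):
    return db_fetch_author(post.author_id)         # SELECT ... WHERE id = ?       (x N)

# Batched (DataLoader style): collect the ids requested during this resolve
# pass, load them in one round trip, then hand each post its author row.
def resolve_authors_batched(posts, info):
    ids = {p.author_id for p in posts}
    rows = db_fetch_authors(ids)                    # SELECT ... WHERE id IN (...)  (x 1)
    by_id = {r.id: r for r in rows}
    return [by_id[p.author_id] for p in posts]
```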
The last time I played with GraphQL was in PHP. There was exactly one package that did GraphQL at the time, and it required defining the models in both YAML and PHP.
Though this could probably be solved by building the YAML dynamically.
Also, last I checked, the testing utils weren't there yet.
Though it is really cool to be able to do supergraphs if you decide to do microservices.
Alternatives are JSON and Protobuf. GraphQL kind of lands between big JSON blobs and Protobuf's binary format in size, due to its potential exactness, but that is wholly dependent on the implementation.
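As a toy illustration of the size point, here's the same record as a "send everything" JSON blob versus a GraphQL-shaped response containing only the requested fields. The data is made up; Protobuf would be smaller still since it drops the field names entirely.

```python
import json

# Full "REST-style" blob: every field the server knows about.
full = {
    "id": 42, "name": "Ada", "email": "ada@example.com",
    "created_at": "2019-01-01T00:00:00Z", "bio": "x" * 200,
    "followers": 1200, "avatar_url": "https://example.com/a.png",
}

# GraphQL-shaped response to a query asking only for id and name.
selected = {"data": {"user": {"id": 42, "name": "Ada"}}}

print(len(json.dumps(full)), "bytes vs", len(json.dumps(selected)), "bytes")
```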
I have a 12-thread, 64 GB RAM VPS for 30 euros a month.
Honestly, SSH for running commands plus mounting it locally (SSHFS) works great for me, but I was never able to install FUSE correctly on Windows, and it's a bit annoying to install on macOS (third-party kernel module).
I mainly run some code on it and the occasional game server, so I have no need for a GUI, though I have been entertaining the idea of using Rancher to maintain Docker/k8s stuff.
Technically my laptop is higher performance than the VPS, but running my Docker environment alone takes about half the memory I have (16 GB).
Those companies got value via selling products with a margin to their customers.
So, technically their customers lost money for those companies to gain money.
Not that there is anything wrong with that. Just pointing out how I think your logic is flawed.
I feel you are going back to the early days of index websites, before they realized the internet is too large to curate by hand and either died or became search engines.
I wish you luck though.
The one part where I feel YouTube is currently failing is content suggestions. It was crap for years until it wasn't, but it feels like it's going back to a hodgepodge of weird and random suggestions. A year ago I felt like I wanted to watch 25% of the suggested content; now it's either 0 or 2%.
How would this even work? Normally for such signatures there is either a hierarchy of trust (SSL) or you know the person yourself (PGP). Even if there were a working model, if Twitter itself got hacked, an account could just claim they lost their key and made a new one.
For most people that's true, but extremely high-profile figures like Barack Obama, Jeff Bezos, and Bill Gates will have other channels to confirm their identity (and their current public key).
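As an illustration of the mechanics (not of key distribution, which is the actual hard part being debated here), verifying a post against a key published through one of those other channels could look like this Ed25519 sketch using the Python cryptography package, with a made-up message:

```python
# Minimal sketch: sign a message with a private key and verify it against the
# public key the person published out-of-band (website, press release, etc.).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The account holder does this once and publishes the public key elsewhere.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"I am giving away 0 bitcoin. Ignore anything that says otherwise."
signature = private_key.sign(message)

# Anyone can check a post against the published key.
try:
    public_key.verify(signature, message)
    print("signature valid")
except InvalidSignature:
    print("signature invalid or message altered")
```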
If you want to stretch to the logical extreme, a full-dystopian deepfake future where any message could be fake, then the only real solution is the free press: a distributed network of groups and individuals that each independently validate the things people do and say, then make those validations available for the general public to review.
Why is Linux better? For general software development I'd say the toolchain; I haven't found a tool I needed that lacked a Linux version or alternative in a very long time.
Though if you release your games mainly on PC, I'd say Windows is better, since it's the platform most of your users will be on.
While major companies are pushing for Linux, it will take decades from now for it to actually become the main platform, due to a lot of reasons I won't go into, but as a programmer you can generally deal with them.