
Imagine this but instead of needing any CLI skills, you just install a "blog" app on the phone, do a quick OAuth2 flow with your domain provider[0] to connect a domain through a tunnel, and you're ready to start posting. This is essentially my vision of the future of selfhosting. We have a lot of work to do, but the technology is all there.
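
The "quick OAuth2 flow" bit is nothing exotic, just a standard authorization-code exchange against whatever API the domain provider exposes. A rough sketch in Python (every endpoint, scope, and client ID here is a made-up placeholder, not TakingNames.io's actual API):

    import json
    import secrets
    import urllib.parse
    import urllib.request

    # Hypothetical provider endpoints; substitute whatever the real API documents.
    AUTHORIZE_URL = "https://domains.example/oauth2/authorize"
    TOKEN_URL = "https://domains.example/oauth2/token"
    CLIENT_ID = "blog-app"
    REDIRECT_URI = "http://localhost:8000/callback"

    # Step 1: send the user to the provider's consent page.
    state = secrets.token_urlsafe(16)
    query = urllib.parse.urlencode({
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "dns",  # e.g. permission to point a (sub)domain at the tunnel
        "state": state,
    })
    print("Open in a browser:", f"{AUTHORIZE_URL}?{query}")

    # Step 2: the provider redirects back to REDIRECT_URI with ?code=...
    code = input("Paste the 'code' query parameter here: ")

    # Step 3: exchange the code for a token the app can use to manage DNS.
    body = urllib.parse.urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
    }).encode()
    with urllib.request.urlopen(TOKEN_URL, data=body) as resp:
        access_token = json.load(resp)["access_token"]

The idea being that, with a token like that in hand, the app could point a (sub)domain at the tunnel endpoint without the user ever touching a DNS record.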

Also, if you're looking for similar tools to pinggy, I maintain a list here[1].

[0]: https://takingnames.io/blog/introducing-takingnames-io

[1]: https://github.com/anderspitman/awesome-tunneling/




How can a mobile phone's network be as stable as that of a datacenter? Without that stability it's not going to be practically useful.


Personal sites don't really need the kind of network stability we've come to expect. It would also be nice to see a move to more offline-tolerant networks for a lot of things.


This is the kind of answer our industry needs more of. Chuck Moore would approve.

Q: "How will it handle scaling to millions of users?"

A: "By crashing. But it will handle 100s or maybe even 1000s, and you currently have 3, so chill."


It's not about scalability but about reliability.

If your 3 users cannot read your blog then good luck getting to 10 users.


Your 3 users are probably okay with 99% uptime. No need to sweat over 5 9's for a hobbyist/personal site.
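
For reference, the downtime budgets work out roughly like this:

    # Allowed downtime per day at a given availability level.
    for availability in (0.99, 0.999, 0.99999):
        minutes_per_day = (1 - availability) * 24 * 60
        print(f"{availability:.3%} uptime -> {minutes_per_day:.1f} min/day of downtime budget")
    # 99% is about 14.4 minutes a day; five nines is under a second a day.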


Five nines, huh? We've got six eights over here!


If your three friends cannot reach you 24/7 good luck getting to 10 friends.

We need less reliable systems, not more.


So are you saying the mobile network you use is absolutely terrible and you have a signal less than 99% of the time, or are you saying that if a blog you read fails to load one out of a hundred times you visit, you stop reading it?


Yeah my mobile network is less than 99% reliable. But there are other factors such as battery dying or just dropping the phone.

I mean, that's why a mobile phone is being used, no? Because otherwise, a Raspberry Pi would suffice and would be cheaper and more reliable.


> Yeah my mobile network is less than 99% reliable.

No signal for more than fifteen minutes every day? That must be frustrating.

> But there are other factors such as battery dying or just dropping the phone.

Neither of these has happened to me. If they did, the uptime of my personal blog would be a much lower priority than fixing that problem. It's a blog, not a business.


You know what is offline-tolerant? NNTP and email.


>> You know what is offline-tolerant? NNTP and email.

Imagine a distributed social network where updates come via email. Updates are identified by the email client (or whatever else receives them) and automatically handed to whatever program needs to process them. You end up with a local copy of everyone's stuff that is kept for a period of time.
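
A rough sketch of what the receiving side could look like, polling a mailbox and handing each update to whatever local program cares about it (the account details and the "X-Update-Type" header are made up for illustration):

    import email
    import imaplib

    # Hypothetical account dedicated to receiving "social" updates.
    HOST, USER, PASSWORD = "imap.example.net", "me@example.net", "app-password"

    # Local programs that know how to process each kind of update.
    HANDLERS = {
        "post": lambda msg: print("new post:", msg["Subject"]),
        "comment": lambda msg: print("new comment:", msg["Subject"]),
    }

    with imaplib.IMAP4_SSL(HOST) as imap:
        imap.login(USER, PASSWORD)
        imap.select("INBOX")
        _, data = imap.search(None, "UNSEEN")
        for num in data[0].split():
            _, parts = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(parts[0][1])
            kind = msg.get("X-Update-Type", "post")  # made-up header naming the update kind
            HANDLERS.get(kind, lambda m: None)(msg)  # dispatch to the matching handler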


I have been working on an idea for generic store-and-forward messaging. Basically, email for everything. The idea is that store-and-forward is more reliable when the network is limited or unavailable. It would also work well for interplanetary communication.

For some applications it is simple to fall back to it, or just use it all the time, and luckily the basics are already covered. Unfortunately, lots of applications won't work at all this way and many would have to be rewritten.
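
A minimal sketch of the store-and-forward core, assuming the outbox is just a local spool directory and delivery is attempted whenever a link happens to be up (the transport itself is a placeholder):

    import json
    import pathlib
    import time
    import uuid

    SPOOL = pathlib.Path("outbox")
    SPOOL.mkdir(exist_ok=True)

    def store(destination: str, payload: dict) -> None:
        """Always succeeds locally, even with no network at all."""
        (SPOOL / f"{uuid.uuid4()}.json").write_text(
            json.dumps({"to": destination, "payload": payload}))

    def try_deliver(msg: dict) -> bool:
        # Placeholder: real code would hand this to SMTP, NNCP, a radio link, etc.
        return False

    def forward_loop(poll_seconds: int = 60) -> None:
        """Retry pending messages forever; unsent ones simply stay in the spool."""
        while True:
            for path in SPOOL.glob("*.json"):
                msg = json.loads(path.read_text())
                if try_deliver(msg):
                    path.unlink()  # delivered, drop it from the spool
            time.sleep(poll_seconds)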


You're probably aware of http://www.nncpgo.org/ then


Why is this website hosted in Russia (given current geopolitics), and why does it have a certificate issued by ca.cypherpunks.ru instead of, say, Let's Encrypt?


What's the problem exactly?


Why not?


Also good old RSS.


Right, and now the site doesn't even load under the minor traffic from HN.


I guess a caching layer could help, like CloudFlare's DDoS protection that still tries to serve a page if a cached version exists.
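
Something in the spirit of this toy sketch: a proxy that refreshes its copy on every successful fetch and falls back to the last good copy when the origin is unreachable (single process, in-memory cache, origin URL is a placeholder):

    import urllib.error
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    ORIGIN = "http://localhost:8080"  # the phone/tunnel in this thread's scenario
    cache = {}                        # path -> last good response body

    class StaleIfErrorProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            try:
                with urllib.request.urlopen(ORIGIN + self.path, timeout=3) as resp:
                    body = resp.read()
                cache[self.path] = body              # refresh the cached copy
            except (urllib.error.URLError, OSError):
                body = cache.get(self.path)          # origin down: fall back to cache
                if body is None:
                    self.send_error(502, "origin down and nothing cached")
                    return
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("0.0.0.0", 8081), StaleIfErrorProxy).serve_forever()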

At that point I'm not sure if it's functionally different from syncing markdown files to something like S3 or GitHub Pages.


> I guess a caching layer could help

In practice this works only for trivial apps: ones that have no dynamic content, don't serve large files, don't see a lot of traffic, and don't get visitors from all over the world (each PoP has a separate cache). CDN caching is opportunistic; most services assume server-grade hardware at the origin that can take some "warmup" load on its own.

Also if you're introducing a third-party CDN/cache, you're already throwing away a whole bunch of reasons for self-hosting in the first place.


> In practice this works only for trivial apps: ones that have no dynamic content, don't serve large files, don't see a lot of traffic, and don't get visitors from all over the world (each PoP has a separate cache).

So something exactly like a blog then.


True, but this means your solution is competing with GitHub Pages, Netlify, etc., and your visitors are still subject to the whims of the caching layer. I'm not aware of any classical CDN product that works great even when the origin server doesn't. Building a CDN for that purpose alone would be extremely niche - generic CDNs with hand-tuned caching policies are great at many other things, such as live video delivery, so you'd be building a non-generic CDN in a saturated market. Then there's the question of how the CDN makes itself aware of the content it needs to prefetch... once you get into that, you've reinvented "git push to Netlify", but with 10x the quirks, an oddball architecture, and less flexibility.


> apps

Make websites, not apps.


If you're building something as complex and convoluted as a blog hosted on a smartphone being tunnelled to the outside world and proxied through a purpose-built CDN, I don't think you can call it a website anymore.

I do agree, let's build websites instead.


Only works if the content is popular in your region and you are only half-self-hosting.


Please (everyone), stop pushing Cloudflare onto everyone.


I'm not pushing anything; I just haven't seen a pass-through proxy with a similar failure mechanism for when websites are down. They have enough market share without me promoting them anyway.


That's a feature of basically any CDN and some existed well before Cloudflare entered the market.

Nowadays you can find dozens of CDNs. A personal site may be fine without one. If your plan is to have an offline version ready to be served because you expect a lot of downtime, a distributed cache might not be the best architecture. Some CDNs offer a dedicated layer of cache in front of your origin, but that sounds like overkill for a personal blog.


I would envision something like this being used with an older phone permanently connected to a charger and Wi-Fi or possibly a USB docking station with Ethernet, not running on your personal phone over mobile data. The latter would be terrible for battery life.


I was kicking around a similar idea a few months ago but for a single-user Mastodon instance. Obviously a phone is a huge single point of failure, but there are likely dozens of hobbyist use-cases like this that don’t need strong reliability guarantees.


You know, I read this and got excited, thinking "I could build that!". Then I got to thinking about the security implications of running a webserver on a personal phone exposed to the internet.

I think I'll pass.


It doesn't need to be a personal phone. If you get a new phone, you could do a factory reset on your old phone and use it as a webserver in this way. And since you aren't taking it with you anywhere, you can leave it at home plugged in and on wifi, for better uptime than your regular phone.


What would be really cool is if we could easily share the hosting burden of m sites across n devices.

As an alternative to paying a subscription fee for some service, you could instead just have your hardware do enough compute/storage to offset your usage.


Also called "MaidSafe", but they've been at it for like 10 years now and still no product.


Very disingenuous, as they've released many products over the years. With each of these releases they've moved the development of the Safe Network forward. Simply take a look here - https://safenetforum.org/c/development/releases/76/ A test network was just closed, and they'll be fixing a couple of things and releasing another ASAP.

Good luck!


I've still got some learning to do, but the whole Kademlia DHT thing is a pretty interesting middle ground between the IPFS approach (not enough redundancy) and the blockchain approach (too much redundancy).

I'm not sure it's ideal because if the network partitions you're going to end up with a more or less random assortment of data in your partition. Better for the biology data to end up on the partition with the university that specializes in that and the physics data to end up nearest its users as well. (This may be handled at a higher level, but I haven't gotten that far yet.) Nonetheless, I'm glad somebody is exploring the design space.
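
For anyone who hasn't run into it: the core of the Kademlia idea is that node IDs and content keys share one ID space, and a piece of data lives on the k nodes whose IDs are closest to its key under XOR distance. A toy version:

    import hashlib

    def node_id(name: str) -> int:
        """Hash anything (a peer address, a content key) into the shared ID space."""
        return int.from_bytes(hashlib.sha256(name.encode()).digest(), "big")

    def closest_nodes(key: str, peers: list, k: int = 3) -> list:
        """The k peers responsible for storing 'key', by XOR distance."""
        target = node_id(key)
        return sorted(peers, key=lambda p: node_id(p) ^ target)[:k]

    peers = [f"peer-{i}" for i in range(20)]
    print(closest_nodes("physics-dataset-42", peers))

Which is also why a partition ends up holding whatever keys happen to hash near the surviving nodes' IDs rather than anything topical.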


What products?

I'm not saying they haven't been doing things, but the big decentralized Internet they touted from the beginning still doesn't exist AFAIK.

When I can host a website and have it stay up indefinitely with zero maintenance... then I think they have something.

I'm sure it's an enormous amount of work and I did buy some coin so I don't mean to be disparaging, but I stopped tracking updates after a couple years because it's all been technical mumbo jumbo and they don't seem to be moving the needle. They've had many "test networks" over the years.


Tailscale VPN + Tailscale Funnel + Termux?


That exists: IPFS.


I know that people can use IPFS to voluntarily pin data so that it is redundantly hosted, but supposing there's some paid service that goes with that data, I don't think IPFS helps identify the users that have been pinning the data so that you can offer them the service for free.

In theory the users could just mine filecoin and pay for the service in that, and then I could dispatch storage contracts to the users in ways that ensure that the entire dataset stays pinned, but so far as I know that's not built in. Besides, a spare phone isn't up to the task of mining filecoin (you need a pretty beefy rig).

I think it would be cheaper and simpler to keep the "have you pinned enough data to justify a free premium account?" logic all together in one place and not introduce the mining process as an intermediary - though not so cheap and simple that every app doing this should have to implement it itself.
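
Something like the sketch below, living in one shared service (the function, the threshold, and the CIDs are all just illustrative):

    # Hypothetical accounting for "pin our data, get a free premium account".
    GIB = 1024 ** 3
    REQUIRED_PINNED_BYTES = 1 * GIB  # illustrative threshold

    def qualifies_for_free_premium(pinned_bytes_by_cid: dict) -> bool:
        """pinned_bytes_by_cid: sizes of the service's objects this user provably pins."""
        return sum(pinned_bytes_by_cid.values()) >= REQUIRED_PINNED_BYTES

    print(qualifies_for_free_premium({"cid-a": 700 * 1024**2, "cid-b": 400 * 1024**2}))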


You could, but if it's in the app store for the masses, most of them won't do that, and that's sort of the point of the suggestion here, in my mind: easy self-hosting for the masses with the device they already have.


There are lots of old phones out there, and rather than getting binned it's nice to think of them being put to use. With the exception of the rather young, nearly everyone's had more than one phone in their lifetime, and even an old smartphone is more than powerful enough to run a simple webserver.


I agree with you, but I think you're missing my point. If I go and put something like this in the Play/iOS stores, what percentage of users looking to run a small site do you suppose will run it on the phone they use every day vs. on a wiped old device they keep plugged in in the corner? If you think most will take the proper precautions, you have more faith in humanity than I do. Personally, I think even with the somewhat nerdy nature of the app, >50% will be running it on whatever is in their pocket, no matter what kind of warning banner I might put somewhere. And I'm noping out of handing them that foot-gun.


If the app is sandboxed, then it's the same level of risk as browsing the web on your phone. Sandbox escapes are rare enough that the people who find them aren't deploying them against randos.


Being accessible 24/7 would burn battery like a mofo, though.


> This is essentially my vision of the future of selfhosting

What if you lose connectivity? What if you run out of battery? What if your phone breaks for some reason?

For a blog I really don't see the advantage compared to a free static host.


99% of the challenge in running a blog is authoring posts. Serving the content is trivial. If using an app makes writing stuff easier, it's a win.
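
"Trivial" is barely an exaggeration; for a static blog the serving half is a few lines of standard library (assuming a directory of already-rendered HTML):

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # Serves the current directory (your rendered posts) on port 8080;
    # a tunnel like the one in the article is what makes it reachable from outside.
    HTTPServer(("0.0.0.0", 8080), SimpleHTTPRequestHandler).serve_forever()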



