The scale of the Bitcoin mining buildout suggests that grid computing may indeed be feasible after all, given the appropriate compensation, and potentially competitive with or better than cloud computing for some applications.
Now that we have a quick way to set up a fine-grained bitcoin micropayments channel between any two machines, an obvious next step is to start generalizing this to the renting out of other compute resources beyond just storage. [1]
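The channel pattern being described can be sketched in miniature. This is a toy model only, not 21's actual API: the class, the per-call price, and all names here are invented for illustration. The key idea is that each metered request updates an off-chain running balance, and only the final balance would be settled on-chain.

```python
from dataclasses import dataclass

@dataclass
class Channel:
    """Toy model of a micropayment channel: the payer funds a deposit up
    front, then signs incremental balance updates off-chain; only the
    final split is settled on-chain when the channel closes."""
    deposit_satoshis: int
    spent_satoshis: int = 0

    def pay(self, amount: int) -> None:
        # Refuse payments beyond the funded deposit
        if self.spent_satoshis + amount > self.deposit_satoshis:
            raise ValueError("channel exhausted")
        self.spent_satoshis += amount

PRICE_PER_CALL = 10  # satoshis per metered request; invented number

def rent_compute(channel: Channel, task):
    """Charge the channel one unit, then run the rented computation."""
    channel.pay(PRICE_PER_CALL)
    return task()

ch = Channel(deposit_satoshis=100)
result = rent_compute(ch, lambda: 2 ** 10)
```

The point of the construction is that only channel open and close touch the blockchain, so per-request fees can be arbitrarily small.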
Can anyone provide an example of a problem that would be better solved with grid computing/micropayments instead of cloud computing?
Assuming grid computing ends up cheaper, any problem that doesn't depend on high availability.
Why grid computing might be cheaper: people could rent out excess capacity that would otherwise go to waste, so their marginal cost of computing is zero or close to it. The open question is how much the overhead would be, but building on a marginal cost of zero is a better cost basis than any dedicated solution.
It's not obvious that those considerations outweigh the cost efficiencies of centralization, but it's by no means obvious that they wouldn't or can't.
Think of the power grid as an example. If power weren't fungible in close to real time (like cloud computing today, unless you specifically set it up that way), our prices would be higher, because fewer participants would be producing energy. The fact that anyone can, in theory, produce energy and sell it on a market that accepts the best bid improves the market for everyone. This seems similar.
> any problem that doesn't depend on high availability
Do you have any specific examples of resource-intensive computing problems (other than SETI@home) that do not depend on high availability? Seems a bit like a solution in search of a problem.
I like your example of the power grid as an explanation for how owners of small amounts of computing power might interact with the larger system and have an impact on market prices. But if the analogy truly holds, it would be a good indication of how negligible rewards would be to smaller entities (individuals running computers or smartphones).
Anything that would currently use AWS spot instances. Amazon created it, so we can assume there's a market for "cheaper, but lower availability". Google has something similar https://cloud.google.com/preemptible-vms/
>I like your example of the power grid as an explanation for how owners of small amounts of computing power might interact with the larger system and have an impact on market prices. But if the analogy truly holds, it would be a good indication of how negligible rewards would be to smaller entities (individuals running computers or smartphones).
I'm not thinking of individuals making a living off of their one computer. But surely there's a significant amount of computing power in datacenters that goes unused, or not optimally used. Now, they could manually partner with other datacenters to absorb excess capacity, and bargain a price individually. But if grid computing becomes a thing, all they'd have to do is hook up a bidder to their excess capacity.
Admittedly, focusing on high capacity entities makes it harder to see micropayments as helpful. Even if the minimum transfer amount was $100, it would still be worth it for datacenter operators to trade on such a market.
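The "hook up a bidder to their excess capacity" idea amounts to a simple price-priority auction. A minimal sketch, assuming integer satoshi prices and a single sell-side seller; the bidder names and numbers are made up:

```python
from dataclasses import dataclass

@dataclass
class Bid:
    bidder: str
    price_sat_per_cpu_hour: int  # bid price, in satoshis per CPU-hour
    cpu_hours: int               # amount of capacity requested

def allocate(capacity_cpu_hours: int, bids: list[Bid]):
    """Fill spare capacity from the highest-paying bids first.

    Returns (bidder, cpu_hours_granted, total_satoshis) tuples.
    """
    allocations = []
    for bid in sorted(bids, key=lambda b: b.price_sat_per_cpu_hour,
                      reverse=True):
        if capacity_cpu_hours == 0:
            break
        granted = min(bid.cpu_hours, capacity_cpu_hours)
        allocations.append(
            (bid.bidder, granted, granted * bid.price_sat_per_cpu_hour))
        capacity_cpu_hours -= granted
    return allocations
```

A real market would add reserve prices, preemption, and settlement, but the core matching loop is this small; the datacenter operator only has to publish capacity and accept the result.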
BoilerGrid has been in use for years. I believe they have an automatic checkpoint system, so your job/program can get evicted (either forced, as when a human logs into the node, or via node failure), find a new node, and resume its progress.
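That eviction/resume pattern can be sketched generically. This is an illustration of the technique, not BoilerGrid's actual mechanism; the checkpoint filename, state layout, and choice of SIGTERM as the eviction signal are all assumptions:

```python
import json
import os
import signal
import sys

CHECKPOINT = "checkpoint.json"  # hypothetical checkpoint path

def load_state():
    # Resume from the last checkpoint if one exists, else start fresh
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"i": 0, "total": 0}

def save_state(state):
    # Write to a temp file and rename so an eviction mid-write
    # can't leave a corrupt checkpoint behind
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, CHECKPOINT)

def main():
    state = load_state()
    # Checkpoint once more on SIGTERM, a common eviction signal
    signal.signal(signal.SIGTERM,
                  lambda *_: (save_state(state), sys.exit(0)))
    for i in range(state["i"], 1000):
        state["total"] += i        # the "work": summing 0..999
        state["i"] = i + 1         # next iteration to resume from
        if i % 100 == 0:
            save_state(state)      # periodic checkpoint
    save_state(state)
    print(state["total"])

if __name__ == "__main__":
    main()
```

If the process is killed partway through, rerunning it picks up from the last saved iteration instead of starting over, which is what makes low-availability nodes usable.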
We've had micropayment channels since 2012. You can even do micropayment channels with regular money [1]. All that's happened is that 21 put one in their API. I doubt that anything is going to change that didn't change when they were invented in 2012.
I think equivalence at least with cloud is possible for most tasks except those that require vast amounts of local data.
... for now. Gigabit endpoint links and super cheap storage could close that gap.
Mainframe/PC is one of the great oscillating "cycles of reincarnation" in computing. It's been mainframe for a while now but I get the sense it's ready to swing.
The problem with that is privacy and security. Any machine you trust with the computation can both see the data and can compute dishonestly, unless you use (extremely!) expensive forms of encryption and verification.
It's not micropayments holding back decentralized compute, it's the fact that you are dealing with untrusted counterparties.
[1] https://21.co/learn/grid-computing-with-bitcoin-micropayment...