
> The industrial revolution didn't really change anything about land.

I didn't say otherwise.

I said the industrial revolution changed what wealth meant. We don't pay for rents with the productive yield of vegetable gardens, and a lawn is no longer a symbol of conspicuous consumption due to signifying that the owner/tenant is so rich they don't need all their land to be productive.

And indeed, while land is foundational, it's fine to just rent that land in many parts of the world. Even businesses do that.

I still expect us to have money after AI does whatever it does (unless that thing is "kill everyone"), I simply also expect that money to be an irrelevant part of how we measure the wealth of the world.

(If "world" is even the right term at that point).

> Arguably, it's much more important, and even more relevant today given how our land use policy is disastrous for our species and climate.

Not so; land use policy today is not a disaster for our species, though specific disasters have happened, on the scale of the Depression-era Dust Bowl or, more recently, in Zimbabwe. For our climate, while we need to do better, land use is not the primary issue: it accounts for about 18.4% of the problem versus 73.2% for energy.

> So, yes. It is important to ask how consumers will pay for all these robots if they don't have any sort of income that would make using robots economical.

With a two-year-old laptop and model, generating a picture with Stable Diffusion in a place where energy costs $0.10/kWh costs about the same as paying a human at the UN abject-poverty threshold for enough food not to starve for 4.43 seconds.
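A rough sketch of the arithmetic behind that figure. The power draw and generation time are my assumptions, not numbers from the comment, chosen only to show that the comparison is plausible:

```python
# Back-of-envelope check of the "4.43 seconds" comparison.
# Assumed values (not from the original comment): a ~100 W laptop
# generating one image in ~35 s, and the ~$1.90/day UN extreme-poverty line.
POWER_W = 100               # assumed laptop power draw while generating
GEN_TIME_S = 35             # assumed seconds per Stable Diffusion image
PRICE_PER_KWH = 0.10        # energy price from the comment
POVERTY_USD_PER_DAY = 1.90  # approximate UN extreme-poverty threshold

energy_kwh = POWER_W * GEN_TIME_S / 3600 / 1000
image_cost_usd = energy_kwh * PRICE_PER_KWH
poverty_usd_per_s = POVERTY_USD_PER_DAY / 86400
equivalent_seconds = image_cost_usd / poverty_usd_per_s
print(f"{equivalent_seconds:.2f} s")  # roughly 4.4 s under these assumptions
```

Different hardware or model assumptions shift the result, but the order of magnitude (single-digit seconds) is robust.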

"How will we pay for it" doesn't mean the humans get to keep their jobs. It can be a rallying call for UBI, if that's what you want?

But robots-with-AI that can do anything a human can do don't need humans to supply money.



> enough food to not starve for 4.43 seconds

I'm having real difficulty reading this unit of measurement. Let me see if I can get this right: a typical person can survive indefinitely on 1,600 calories a day. Let's say these are provided by rice (which isn't sufficient for a long-term diet, but is good enough for a while). 1,600 calories of rice is about 8 cups per 24 h, and there are about 10,000 grains in a cup, so is it that an image can be generated at the same cost as:

  4.43s/86400s*8cups*10000 grains/cup
Being about 4 grains of rice?
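Sanity-checking that conversion in code, using the figures above:

```python
# Grains of rice equivalent to 4.43 seconds of a day's 8-cup ration,
# at 10,000 grains per cup (figures from the comment above).
grains = 4.43 / 86400 * 8 * 10000
print(round(grains, 1))  # about 4.1 grains
```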


Sounds about right, but I haven't checked the unit conversions, and I'd count anything less than lifetime-sustainable as gradual starvation.


Nope. Land is important because everything rests on it. Even radio spectrum and orbits can be regarded as a form of 'land'.

Georgism doesn't exist in a vacuum. It wasn't formulated back when wealth 'meant' land; it emerged during the industrial revolution, likely as a response to the problems its proponents saw in their society, problems we're still dealing with today.

Land no longer merely means the plot where a vegetable garden's productive yield comes from. Anything that capital sits on is land. That includes your factories and your datacenters. Yes, that includes renting land from someone else. That's land policy.

Housing? Land policy. Pollution? Land policy. Transportation? Land policy. Can't afford to live? Likely your biggest ticket items include transportation and housing. Land is more important than ever.

Now, what does this have to do with AI? I would caution against treating money or capital as irrelevant, or making any definitive prediction about the impact of AI or when and how it will arrive.

Edit: I see that you added stuff, but you have a narrow conception of land policy.


> Nope. Land is important because everything rest on it. Even radio spectrum and orbitals can be regarded as a form of 'land'.

Then you define land so broadly that it includes the empty vacuum of space, which robots are much better suited to than we are and can exploit trivially where we cannot.

If you want to, that's fine, but it still doesn't need humans to be able to pay for anything.


Orbits are literally a scarce resource, as is radio spectrum. If you have people just doing whatever, you'll get Kessler syndrome, especially as our orbits fill with more satellites each year. Similarly, you can't just have random folks blasting out radio signals at random.

Yes, satellites are robots. However, they have no agency. Incentive structures decide whether we get Kessler syndrome, and those structures in turn direct humans to solve problems with robots.

So, yes, they are either directly analogous to land or a literal form of it.


Space is much more than circular orbits around Earth, and it is not a scarce resource: it's big enough that you could disassemble the Earth, all the planets, all the stars, all the galaxies into atoms and give each so much padding that the result would still be considered an extraordinarily hard vacuum. Something like 3.5 cubic metres per atom, though at that scale "size" becomes a non-trivial question because space is expanding.
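That per-atom figure can be sanity-checked from two rough cosmological numbers, both of which are my assumed inputs rather than anything stated above: an observable-universe radius of ~4.4e26 m and ~1e80 atoms of ordinary matter:

```python
import math

R_M = 4.4e26     # assumed comoving radius of the observable universe, metres
N_ATOMS = 1e80   # assumed atoms of ordinary matter in the observable universe

volume_m3 = 4 / 3 * math.pi * R_M ** 3
m3_per_atom = volume_m3 / N_ATOMS
print(f"{m3_per_atom:.1f} m^3 per atom")  # ~3.6, close to the 3.5 quoted
```

Both inputs are uncertain to tens of percent, which comfortably covers the gap between 3.5 and 3.6.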

Which reminds me of a blog post I want to write.

> Similarly you just can't have random folks blasting out radio signals at random.

That's literally what the universe as a whole does.

You may not want it, but you can definitely do it.

> Yes, satellites are robots. However, they have no agency.

Given this context is "AI", define "agency" in a way that doesn't exclude the people making the robots and the AI.

> Incentive structure decides if we have kessler syndrome, which then direct humans to solve problems with robots.

Human general problem-solving capacities do not extend to "small" numbers such as a mere 7.8e20.

For example, consider the previous example of the moon: if its entire mass is converted into personal robots and we all try to land them, the oceans boil from the heat of all of them performing atmospheric braking.

And then we all get buried under a several mile thick layer of robots.
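An order-of-magnitude check of the boiling claim, with every input an assumption of mine (standard textbook values, not figures from the thread): compare the kinetic energy of the Moon's mass arriving at lunar-return re-entry speed with the energy needed to heat and vaporize the oceans:

```python
# All values assumed for this sketch; none appear in the original comment.
MOON_MASS_KG = 7.35e22
REENTRY_SPEED_M_S = 1.1e4      # ~11 km/s, typical lunar-return re-entry speed
OCEAN_MASS_KG = 1.4e21
SPECIFIC_HEAT = 4186           # J/(kg*K), liquid water
HEAT_OF_VAPORIZATION = 2.26e6  # J/kg
DELTA_T = 85                   # warming from ~15 C to 100 C

kinetic_j = 0.5 * MOON_MASS_KG * REENTRY_SPEED_M_S ** 2
boil_j = OCEAN_MASS_KG * (SPECIFIC_HEAT * DELTA_T + HEAT_OF_VAPORIZATION)
print(f"{kinetic_j:.1e} J vs {boil_j:.1e} J")  # ~4.4e30 J vs ~3.7e27 J
```

The re-entry energy exceeds the boil-the-oceans energy by roughly three orders of magnitude, so even if only a small fraction ends up in the atmosphere and oceans the claim holds.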

This doesn't prevent people from building them. The incentive structures as they currently exist point in that direction, of a Nash equilibrium that sucks.

Humans do not even know how to create an incentive structure sufficient to prevent each other from trading in known carcinogens for personal consumption, even when the packaging is labelled with explicit images of traumatic surgical interventions and the words "THIS CAUSES CANCER" in big bold capital letters on the outside.

If anyone knew how to do so for AI, the entire question of AI alignment would already be solved.

(Solved at one level, at least: we're still going to have to care about mesa-optimisers because alignment is a game of telephone).



