Hacker News

How is "converting the entire mass of the solar system into dollar bills" conceptually different from "converting the habitability of Earth into dollar bills" or "converting the health problems of human beings into a maximal amount of dollar bills", which is precisely what many corporations _actually do_?

The "corporation as AI" metaphor isn't about some abstract future possibility, it's an explanatory mechanism for how the world is so thoroughly messed up _right now_.




Yeah, I agree. My original post was bringing up what I thought was an interesting comparison between hypothetical software AI and the corporations-as-AI metaphor presented in the talk.

Corporations _are_ acting as misaligned optimizers. Solving that problem is hugely important. However, the "AI" comparison breaks down somewhat when you start thinking about how we might actually fix the problem. With corporations, we (i.e., states) have tools that we can use to regulate bad actors. Software AI, however hypothetical at the moment, seems likely to be a different game altogether.


I don't know that we won't have tools, albeit different ones, to regulate a bad AI. AIs need more than just intelligence and agency. They also need effective ways to interact with and affect their environment. That boundary is where we are likely to develop tools to limit and regulate them.

If they are truly general AIs, then their reaction to that limitation and regulation will likely be not dissimilar to a person's, but I see no reason to assume that limiting them will be impossible.


Sure! I don't see any reason why it would be impossible either, but the (hypothetical) problems are very interesting. Starting with the most basic problem of all: how do we even specify what we want the AI to do? The whole field of AI safety is trying to figure out a way to write rules that an agent wouldn't instantly try to circumvent, and to find some way to provide basic guarantees about the behavior of a system that is incentivized to do bad things (just like corporations are incentivized to find loopholes in the law, hide their misdeeds, and maximize profits at the expense of the common good).


That assumes an ideal corporation, in the physics sense - real-world ones have "corruption" in the form of employees pursuing their own agendas. Take sexual harassment, passing up talent out of bigotry, and office politics. They actively harm the profitability and yet they exist at all levels.

The incentives are fundamentally what shape the systems, including the corporations. Blaming corporations alone is a simplification - the same incentives converge to the same outcomes, akin to how power vacuums are filled by warlords.


> They actively harm the profitability and yet they exist at all levels.

> The incentives are fundamentally what shape the systems including the corporations.

These are key observations, and sadly I suspect there's a Prisoner's Dilemma at work that makes corruption, sexual harassment, bigotry, and office politics somehow "rational", individually maximising behaviours (for corporations, as well as the individuals they're composed of) whenever any of the competing corporations is known or suspected to be behaving corruptly.

Combine that with the combative nature of thinking/reporting about corporate results - where, for example, FAANG stock price performances are compared against each other, assigning winners and losers, with no incentive or accolade for the entire tech sector having grown in value. A corporation with a 15% YoY increase is deemed a "loser" if one of its competitors manages 20%.

And we've spent well over half a decade demonising anyone who criticises Capitalism - thus deeply entrenching incentives that are poor for society as a whole, but which have "less poor" outcomes for the corporations prepared to be most ethically barren.


Being generous and ethical is also a form of human corruption harming the corporate entity. From the corporation's hypothetical point of view, any time a person doesn't do exactly what the corporation needs, isn't the perfect mindless drone, (or isn't creative in just the right inoffensive way), they're a cancer cell in the corpus. But I think this goes for red tribe, as well as blue tribe thinking, as you posit.

After all, you'd be hard-pressed to argue that corporations, especially silicon valley corps, give more lip service to red tribesmen than blue. Maybe that's just because the blue tribe is the more powerful, and the corps are rightfully saying the magic words that allow them to keep their profits.


The fact that you're alive proves there is a pretty big difference, whatever it is.


Or perhaps that your particular perspective is blinkered to seeing too short a timeframe?

There's a very good argument to be made, I think, that "capitalism" is the biggest and most dangerous ponzi scheme ever invented. Most people will happily participate, thinking "this is fine .gif", while untapped suckers/resources keep delivering enough "return" to early "investors"; but when the house of cards collapses, there will be no underlying foundation for the vast majority, and only the people at the very top of the pyramid scheme will actually have benefited at all.

Ever stopped to wonder why Musk and Bezos are so interested in going to Mars?



