Stop asking the market to look after the collective interest. This is the job of the government. The ultimate effect of virtue-wanking CEOs and corporate governance is to deceive people into thinking democracy is something that can be achieved by for-profit organizations, and that they can forsake the formal binding of collective interest through law.
It's nice if people are nice, but it is not a bulwark of the collective good. It is a temporary social convenience. The higher that niceness exists in the social order, the greater its contemporary benefit, but also the more it masks the vulnerability of that social benefit.
It matters if Sam Altman is Gandhi or Genghis Khan in a concrete way, but you, as a citizen, must act as if it doesn't matter.
If AI poses a danger to the social good, no amount of good guy CEOs will protect us. The only thing that will is organization and direct action.
It's not the job of the government, it's the job of citizens. Citizens MAY outsource parts of this to governments, but we primarily need labor and the working class to have the majority of economic power.
> It's not the job of the government, it's the job of citizens
If only there was a system of government designed from first principles to reflect the will of the people...
This is just libertarian tautology. It's not wrong, but any "citizen" organization you can imagine to implement whatever policy it is you want[1] is going to be isomorphic to a democratic government.
The government looks after its own interests. That is what an institution does. A government isn't different than a corporation or any other organization this way.
Delegating powers to a government for the common good does not change any of this. The people still have the job of ensuring the government carries that out properly because its priority is and always will be foremost itself, so the people have to ensure the government's interests coincide with their own. A government is not just some will-of-the-people-reflecting automaton that you can put in place then wash your hands of.
The private sector's interest aligns with the market, at least on the average, as long as they are not allowed to illegally obtain a monopoly.
The market, then, is assumed to be rational on average, and thus all participants in the market would make decisions in their own best interest, and thus obtain the best outcome for themselves that is possible. Those who do not choose optimally are then selected out by Darwinian economic competition.
The gov't has a role to play, but it does not follow that _more_ gov't regulation makes for a better society.
This may be true in theory, but we are seeing a rather different development in the West.
1) power seems to have a tendency to concentrate, which ultimately leads to corporate oligarchy.
2) most markets have high barriers to entry, which breaks the alignment of market and interests
3) the concept of rational markets is not as clear cut as it is often perceived, and most studies have failed to incorporate findings related to bounded rationality
However, the government's incentive is to stay in power and not let that happen. The tool that government has is regulation.
This is because the private sector can't (for the most part) coerce you to deal with them. If the government can keep a lid on external costs, the private sector has to create value people want or they lose money and go away.
The government can force you to give them your money and obey their laws. This is what makes them uniquely powerful and dangerous. In a democracy they do have to get your vote once in a while, but that's within a framework they mostly create and police, with extremely high barriers to entry and very little reward for doing "the right thing".
It baffles and saddens me that otherwise thinking people can be so eager to hand power and authority to the government. I understand that -- after a lot of thought and reading and with much reluctance -- one could conclude that the government should be given a particular power because the alternatives are worse.
We have to remember that "we" have selected the best of bad options. This system is not good and will get worse over time. People have to remember that we must not let power accumulate in specific groups, be it the private or public sector. Unfortunately the tendency seems to be towards the opposite of what is "good" for average individuals. If the government has too much power it cannot be held accountable, and its interests become misaligned with ours.
>The private sector's interest aligns with the market, at least on the average, as long as they are not allowed to illegally obtain a monopoly
This is the basic capitalist lie. Do you think Norfolk Southern's interests align with the "market" of East Palestine? On what planet do Amazon drivers have the same incentive alignment as Amazon shareholders? No, the argument is "tough shit," go find some place that does, it's a "free market," just move your entire life and abandon your community to follow around making money for a private dictator.
Unfortunately the entire profit-or-bust ethos isn't constrained to one company - any company that has profit as its primary driver, aka any public company or VC-backed company, MUST pursue profits over all things OR THEY WILL BE KILLED BY THEIR OWN INVESTORS. Further, now it's every individual that is required to pursue profit at all costs or - go broke and die - without healthcare, because the "market" of healthcare isn't aligned with you, the customer.
This lie is so pervasive that the entire field of Economics ASSUMES capitalism as the only possible economic system today (Economics was my field of study!) and anything other than it is "heterodox."
So no, you're simply describing the devastatingly broken system as though it's the only suitable option
In fact, all of the economists that wrote originally about the "market" described precisely how it would be corrupted into what it is today
Go read Veblen from 1899, The Theory of the Leisure Class, then read chapter three of The Wealth of Nations, and then come talk with me.
The assumption of a benevolent government of experts and moral leaders making decisions for the greater good has killed literally millions of people and stolen trillions of dollars in wars since WWII, and that's just counting western liberal democracies.
If you don't implicitly trust "the market", and you shouldn't, then you can't trust the government. What's it going to take to convince you that handing over more of your rights and money won't enable the government to solve problems it previously could not?
"The government should fix that" is no less ridiculous than "the free market could fix that". Which is to say it can occasionally be true with a lot of caveats, and often wrong.
It's super interesting that you immediately started railing against the government, which I didn't even mention. I'm critical of currently-popular economic dogma, and I don't understand why that should imply that I'm some kind of authoritarian socialist.
That's a nice bumper sticker, but managed markets and other non-laissez-faire economies, like the ones in India and China, could make a similar claim, even though I wouldn't want to live under them.
There are more than two ways to organize human labor.
This false and defeatist dichotomy between Government dictatorships or Private dictatorships lays bare how truly depraved most thinking is around what constitutes human flourishing and liberty.
> It's not wrong, but any "citizen" organization you can imagine to implement whatever policy it is you want[1] is going to be isomorphic to a democratic government.
I agree government is useful and needed sometimes. But laws are slow, blunt instruments. Governments can’t micromanage every decision companies make. And if they tried, they would hobble the companies involved.
The government moves slowly. When AGI is invented (and I’m increasingly convinced it’ll happen in the next decade or two), what comes next will not be decided by a creaky federal government full of old people who don’t understand the technology. The immediate implications will be decided by Sam Altman and his team. I hope on behalf of us all that they’re up to the challenge.
> And if they tried, they would hobble the companies involved.
Well, yeah, that's the point of regulation: to limit corporate behavior. There are plenty of other highly-regulated industries in the US; why shouldn't AI be one of them?
The emergent behavior from this however is not that we get healthy competition, it's that the big guys have the money, connections, and understanding of the process to still do everything they want, while the little guys can't get off the ground and compete.
This is a necessary evil in the case of an industry like aviation where there's a massive and immediate risk to human life if you get it wrong.
If you see AGI as life threatening (in a physical sense) it's an understandable stance to take. However if AGI is threatening moreso in a social sense I don't think concentrating the power behind it is going to be a societal good.
I genuinely believe we as a society will get fucked by corporations for everything we can give if we don't either radically socially reform for incoming AGI, or have competitive open developments in the area not beholden to corporate interests.
I don't know, power concentrates plenty in unregulated industries. Google has owned search for over two decades, and it's not like there's a ton of government regulation preventing anyone from starting a search engine. People have tried, they just haven't succeeded, because Google was that much better and that much farther ahead, and they used their advantages to widen the gap. The same is true for advertising: it's not like the government was stopping anyone from competing with Google and Facebook, but other companies just couldn't get there.
I'm not sure that AI is going to look that different. Is there any reason to believe that if OpenAI wins the majority share, that any competition is going to be able to dethrone them?
We could imagine a world where the US government broke up big tech companies in the late 2000s or early 2010s, and it sure seems like that world would be in a much better position today, due exactly to increased competition. Instead we have a bunch of big companies that don't take risks or release ambitious new products because they're too busy trying to defend their moats.
Broad enough laws don't have loopholes. GDPR for example had the wisdom not to define specific types of personal information (such as name, or IP address) but instead any data that could be correlated to identify a person. That puts the ball in the corporations' court to make sure they stay well clear of the dividing line.
Delegated councils (like the FDA, FTC, FCC, etc.) and courts can be used to make decisions about edge and individual cases, and do so much closer to real time than Congress can act.
Governments regularly successfully define things much more vague than "is this AI?"
>The immediate implications will be decided by Sam Altman and his team. I hope on behalf of us all that they’re up to the challenge.
Will they really be determined by OpenAI? So far what Altman has accomplished with OpenAI is to push a lot of existing research tech out into the open world (with improvements, of course.) This has in turn forced the hands of the original developers at Google and Meta to push their own research out into the open world and further step up their internal efforts. And that in turn creates fierce pressure on OpenAI to move even faster, and take even fewer precautions.
Metaphorically, there was a huge pile of rocks perched at the top of a hill. Altman's push got the rocks rolling, but there's really no guarantee that anyone will have much say in where they end up.
What if I told you, there is this startup that is working on really effective ways of proliferating plutonium. And that it is under pressure to keep the hype, so there isn’t much time to organize the storage of proliferated materials. And the thinking is that even if it explodes, since startup is not building a bomb, it can’t really make that much damage.
Oh, and also imagine that nobody really knows what plutonium is, there is just one startup that had figured out how to turn uranium to plutonium.
Also, the understanding of uranium for now is at the level of when it was just discovered. So people are still handling it with bare hands. And trying to sell shiny uranium toys.
> existing research tech out into the open world (with improvements, of course.)
That's quite the understatement, isn't it? Neither Google nor Meta have been able to demonstrate something to the public that resonates even close to the altitude that OpenAI has.
I'm not sure if this is research quality or product tuning. I tried using Bing's chatbot and I found the experience to be vastly worse than OpenAI's version, yet Bing is ostensibly running on exactly the same platform (GPT-4). I think OpenAI turned the dial to "loquacious and friendly" and we humans are interpreting some pretty-similar underlying tech as having vastly different underlying quality, because we're impressed by that sort of thing.
Once you're down to user interface, you're no longer in "insurmountable technical advantage" territory.
But more critically, chat.openai.com is more of a tech demo than a serious product. So one can optimize the UX for "chatty and engaging" without being too concerned for accuracy. Bing, on the other hand, is intended to be a search product. It's noteworthy that adjusting the same model to provide a more accurate search experience seems to have snuffed out a lot of the magic.
They'll be decided by AI, not by governments, corporations, or individuals.
There's still a kind of wilful blindness about what AI really means. Essentially it's a machine that can mimic human behaviours more convincingly than humans can.
This seems like a paradox, but it really isn't. It's the inevitable end point of automatons like Eliza, chess, go, and LLMs.
Once you have a machine that can automate and mimic social, political, cultural, and personal interactions, that's it - that's the political singularity.
And that's true even if the machine isn't completely reliable and bug free.
Because neither are humans. In fact humans seem predisposed to follow flawed sociopathic charismatic leaders, as long as they trigger the right kinds of emotional reactions.
Automate that, and you have a serious problem.
And of course you don't need sentience or intent for this. Emergent programmed automated behaviour will do the job just fine.
> Essentially it's a machine that can mimic human behaviours more convincingly than humans can.
Isn't this impossible by definition? Nothing can be more convincingly human than humans, or else it would be something else.
Perhaps you mean mimic charismatic or intelligent humans more than most, or the average, human?
> Once you have a machine that can automate and mimic social, political, cultural, and personal interactions, that's it - that's the political singularity.
Do you mean online only? Because I imagine we're still quite a ways from physical machines convincingly mimicking humans IRL.
> And that's true even if the machine isn't completely reliable and bug free.
If the machine makes inhuman mistakes the humans will likely notice and adapt.
> Isn't this impossible by definition? Nothing can be more convincingly human than humans, or else it would be something else.
Why isn’t it theoretically possible for an AI to pass the Turing test so hard that more than 50% of the time humans think the AI is the real human? That would effectively be more convincingly human (to humans) than humans are.
> Once you have a machine that can automate and mimic social, political, cultural, and personal interactions, that's it - that's the political singularity.
The machine can mimic, but it is still completely reactive in its nature. ChatGPT doesn't do anything of its own accord, it merely responds. It has no opinion, no agenda, and no real knowledge or understanding. All it can do is attempt to respond like a human would, but it cannot reason on its own. Ask it about its views on a political issue, and it won't think about its stance and the reasons for taking it; it will just produce what its training says an answer to that question ought to look like.
The panic about the machine takeover is completely overblown, driven by people who don't really understand what these machines are and how they work. We are still far, far away from the point where AI would be capable of actually making political decisions.
> When AGI is invented (and I’m increasingly convinced it’ll happen in the next decade or two)
Aside from the issue of AGI being extremely difficult to capture in a proper definition, what leads you to believe this? Recent developments like LLMs and stable diffusion are impressive technology but I don't view them as remotely relevant to achieving AGI.
This is not a good idea. There is a conflict of interest here. The fundamental goal of government is regulation, the fundamental goal of a business is profit.
Both the morals of government and corporations are therefore very different. I call them Guardian and Commercial Syndromes respectively. When you combine these two syndromes you create monstrous hybrids.
Allow me to elucidate:
An entity in the Guardian syndrome values honor and uses forceful tactics to maintain order and regulation.
An entity in the Commercial syndrome values money and tries to use whatever means possible to gain profits through trade agreements.
You don't want a commercial entity to have the power to enforce things and you don't want a government entity to desire profit. Both hybrids lead to bad things and it's the root of corruption in the world today.
You know how the medieval English Navy was formed? So many pirates lurked off the coasts of medieval England that London merchants went to the expense of financing a fleet of fighting ships and gave it to the Crown.
Why give it to the Crown? Because you don't want some commercial entity running the Navy using methodologies that circumvent trade agreements and violently funnel trade and profit to a singular commercial entity. You need an entity outside of the commercial sphere; An entity in the Guardian Syndrome. If a commercial entity controlled the Navy you get something similar to the Mafia or Yakuza; Basically organized crime.
Only the government should enforce the laws of AI because there is in theory no conflict of interest. But practically speaking the US government is pretty heavily hybridized already (See military industrial complex). Still it's the best option.
I don't know, isn't history really just a series of specific people doing concrete actions?
Do you think on some level the idea of some abstract 'government' taking care of things is just a narrative we apply to make ourselves feel better?
Sure individual decision makers in that government can concretely affect reality, but beyond that are we just telling a story and really nobody is 'in control'?
The more we remember the possibility of true collective self determination, the more likely we are to survive all this mess we're making.
These days we are constantly bombarded by this contradiction of individualism being primary and desirable, but at the same time impotent in the face of the world this individualism has wrought. And it's all a convenient way to demoralize us and let us forget how effective motivated collective interest is. Real history begins and ends with the collective!
Yes. There is a reason that when Britain felt threatened by the turmoil in France they didn't just bar unions or political clubs. They banned "combination" almost entirely in general.
tl;dr: Nobody is steering the ship because they don't know they're on a ship. Or that the ocean exists.
It's hard without doxxing myself or calling out specific people and organizations, which I'd rather not because I'm a nobody and can't afford lawsuits, but for various reasons I ended up in political education and marketing for civics advocacy. Ish. To be semi on topic, I know some people who are published in the WSJ (as well as the people who actually wrote the pieces). I'm also a 3rd generation tech nerd in my mid 30s, so I'm very comfortable with the digital world - easily the most so outside of the actual software engineering team.
I've spoken with and to a lot of politicians and candidates from across the US - mostly on the local and state level but some nationally. And journalists from publications that are high profile, professors of legal studies, heads of think tanks, etc.
My read of the situation is that our political class is entangled in a snare of perverse disincentives for action while also being so disassociated from the world outside of their bubble that they've functionally no idea what's going on. Our systems (cultural, political, economic, etc.) have grown exponentially more complex in the past 30 years, and those of us on HN (myself included) understand this and why this happened. I'm a 3rd generation tech nerd, I can explain pretty easily how we got here and why things are different. The political class, on the other hand, has had enough power to not need to adapt and to force other people to do things their way. If your 8500-year-old senator wants payment by check and to send physical mail, you do it. (Politicians and candidates that would not use the internet were enough of a problem in 2020 that we had to account for it in our data + analyses and do specific no-tech outreach.) Since they didn't know how the world is changing, they also haven't been considering the effects of the changes at all.
Furthermore, even those of them that have some idea still don't know how to problem solve systems instead of relationships. Complex systems thinking is the key skill needed to navigate these waters, and none of them have it. It's fucking terrifying. At best, they can conceive of systems where everything about them is known and their outputs can be precisely predicted. At best. Complex systems are beyond them.
Add to this that we have a system which has slowly ground itself to a deadlocked halt. Congress has functionally abandoned most of its actual legislative duties because it's way better for sitting congresspeople to not pass any bills - if you don't do anything, then you don't piss any of your constituents off. Or make mistakes. And you can spend more time campaigning.
I left and became a hedonist with a drug problem after a very frank conversation with a colleague who was my political opposite at the time. I'm always open to being wrong, and hearing that they didn't have any answer either was a very 'welp, we're fucked' moment. I'm getting better.
As a software developer who found myself elected to state level public office and had to spin-up my education around the legislative process and all of politics, I concur.
There are only a couple of things I'd add.
As much knowledge as I brought in about technology and the idea of being aware of systems thinking, I also brought in a great amount of ignorance about all the other areas that are legislated (healthcare, interplay between local, state, and federal issues, budgetary concerns, tax policy, banking, etc.). Good legislation is truly collaborative.
Sadly, for the second part, good legislation is rarer than it should be as much of legislation is about politics and perception of the voters. And voter perceptions are not necessarily logical or reasoned.
This makes it all the more important, IMHO, that everyone who is reasonable, logical, and educated spend their precious, valuable time involving themselves to advocate for elected officials who behave similarly in what is essentially a zero-sum game.
p.s. Have faith. I saw enough during my time that gave me reason for that faith. (But that faith requires time and effort -- we don't get good government or democracy for free.) I'm glad to hear you're getting better.
I'd say much good legislation is collaborative, but some necessary legislation is not. FDR's changes for instance. Industry did not want it. Arguably health care in the US needs this too.
Isn't history full of examples of governments being slow and seemingly incompetent? Standard Oil was broken up many years after everyone knew that it was a ruthless monopoly which made too much profit. Note also that it didn't take senators to figure out the monopoly; it was everyone, including the voters, who did.
There is one big benefit that democratic governments have though. They have a monopoly on physical force.
Government does not actually “do” the executive action in everything. But government is a rough and messy consensus on the set of rules and constraints within which the “specific people doing concrete actions” act. Large scope actions that impact the public need to be within such rules and constraints. Historically CEO’s of large corporations have quite often acted so as to ignore said rules and constraints primarily for rapid aggregation of monopolistic power and concomitant profit. There is harm from monopolistic power in new and emerging industries hence government action via enforcement of rules and constraints is important. Individuals eg in the office of the AG may be the “specific people doing concrete actions” but they do it on behalf and with the full power and authority of the US Govt which acts through “specific people doing concrete actions”. It’s not an either or.
> I don't know, isn't history really just a series of specific people doing concrete actions?
That is like saying: "Aren't human brains just a series of neurons, firing at specific moments?"
History is as much—if not more—about interactions between people, feedback loops, collective actions, collective reactions, environmental changes, etc. I would argue that the individual is really really insignificant next to the system this individual resides under, and interacts with.
I think you're agreeing while arguing against my point :-)
Like what is collective action, really? It's as you say - a series of individuals interacting, setting a course in a feedback loop, and then more people getting on board with that. It doesn't just happen, it takes individual instigators. It also doesn't just self-propagate. You need individuals, typically the same individuals, to show up regularly and consistently and make sure stuff happens and people are doing what they need to be doing.
What you are arguing for is a philosophical position called atomism or reductionism (which contrasts with holism). It is a rather old-school philosophy, honestly (with holism also being old school, but not quite as much), as we are learning more and more how important interactions really are to the study of anything.
Modern philosophy of science kind of rejects the notion that you can study anything by only looking at its atomic structure. This is to say, you can't really study history by only looking at the actions of individual actors. Even in particle physics you have the 4 fundamental interactions, you have virtual particles, etc., not just the particles themselves. This isn't to say that fermions and bosons aren't important; it is just that it is hard to describe any physical phenomenon without looking at the interactions between them. And in fact, by studying those interactions, you can derive certain laws and behaviors. History is no different, except the complexity is many many orders of magnitude greater.
Is one difference in the analogy that individual humans have independent agency and do concrete things in the world, where particles don't have agency, and don't really 'do' actions in a sense that is meaningful outside of the system they're in?
I guess by questioning the analogy I get back to my point. Things don't happen in history because of (truly) random behaviours converging on some emergent effect, like in a system of particles. They happen because specific unique (wrt the system) individuals make decidedly non-random decisons to affect reality on purpose (even if cause and effect are not that predictable it still holds that the actions are purposeful and do affect reality) in some way.
I question your assumption that "things don't happen in history because of (truly) random behaviors converging on some emergent effect."
Firstly, there is currently a debate among quantum physicists as to the true randomness of what we observe[1][2]. It turns out we don't actually need true randomness in our models, they just need to appear as if they are random; in other words, the chaotic nature of the system is more important than true randomness.
Secondly, society is an emergent effect of individuals behaving in a chaotic manner. So is zeitgeist. To study history without looking at societal changes over time, and without accounting for zeitgeist, is bound to yield pretty limited insights.
I am aware that my analogy between quantum mechanics and the study of history is flawed. The latter is infinitely more complex than the former, and deriving laws and creating models is a good fit for studying the former but extremely difficult for the latter. However, my point is merely that atomism (and holism, for that matter) is an incomplete philosophy of science that doesn't even work at our most fundamental scope. One should be cautious when applying it to the study of history.
Or to put it in other words: while ρ(λ | ab) ≠ ρ(λ) is a real possibility in quantum physics, it is likewise highly likely that the probability of an individual acting in society is not independent from the probability of the same individual acting outside of the influence of that society. In your original point, government may very well be like the ab in this famous conjecture. I would be careful when removing its influence.
PS: Sorry to cite youtube videos, but I’m not a physicist and these videos are the only way for me to understand the science. Otherwise I would be citing something I don’t understand, which I don’t want to do.
I think the parent point is that democracy is an illusion in that only few people have any real power, and the masses choosing government representatives is very different from the masses choosing policy.
Did the population at large want National Health, sanitation, vaccination and the EPA? Or did a small group of people in government decide that was best for everyone. I suspect the latter, especially when looking at the National Health system that doesn’t seem to help as well as other systems. You’re right it’s not all “the feels”, but a lot is about making the population feel like they are looked after without doing much to actually look out for their best interests.
I’m not saying they weren’t popular, but that if you had actually implemented what the public wanted then you would have ended up with universal healthcare.
Same with the EPA. It’s an improvement, so people like it, but it’s not what people actually want.
The actions that individuals take on behalf of government are a direct reflection of the "abstract" policies and laws of that government. If you cannot discern this from 20th century history, I don't know what to tell you.
Policies and laws aren't abstract; they're a good example, in fact, of what I said - things that have a concrete effect on reality, typically authored by a small number of specific individuals within a government.
Yeah I think so, I also misinterpreted your quoted 'abstract' I think :-)
Basically my main point (or question really, I am not sure of it) is that we should resist thinking about government as an abstract entity different in character from any org - really what it does or looks after is just, in the end, some small set of humans doing some actions that have some effect.
They're democratically elected, yes, but that is a bit meaningless in the practical detail of any one given situation. In a sense it's not different, safer, or better than some company led by some small set of individuals also making concrete decisions, for any one concrete decision.
Maybe democracy and policy have some aggregate influence over all decisions, making them lean in a certain way. But it's not like 'the government' as an entity is one thing led by a concrete consciousness or plan. Does that make more sense?
The government can't even execute on a bipartisan motivation to ban Tik-Tok. They get greedy and draft the bill to strip away all rights of citizens to have any digital privacy or VPNs, and to give themselves the power to declare any app illegal at any time. The market can't save us, and the federal government definitely can't.
It is difficult to shake the suspicion that the advocates of banning TikTok are using it as pretext for their real goals.
It wouldn't be so terrible if their real goals were limited to just restricting social media in general, but beyond that...
The real benefit of having a Congress for broader society is that it forces at least partial exposure of these unmentioned goals.
In that sense the federal government is a wonderful invention, but the drawbacks are so many that it doesn't seem all that wonderful overall.
The government can't even move on the fact that we unnecessarily change the clocks twice a year. It's the 21st century and we're still "motivated" by a 19th-century concept.
Sure, but in my country (USA) the government is hopelessly inept at regulating technology. We still don't have privacy regulations, and now to work around this they're trying to ban specific foreign apps instead of protecting us from all apps! I'd honestly be horrified if they tried to regulate AI. They would be in bed with Facebook and Microsoft, and they'd somehow write legislation that only serves to insulate those companies from legal repercussions instead of doing anything to protect regular people. As far as I can tell it is the view of Congress that big tech can do whatever they want to us as long as the government gets a piece.
Congress is already in bed with Meta, who is driving legislation to ban their competition (TikTok) or force a takeover of its US version. Political donations aside, it should be illegal for Congress to do insider trading or invest in companies.
Agreed. The US has backslid since the 20th century towards an elitist Republic and away from democracy. But even in the US, collective action has a better track record than "altruism".
Sometimes I wonder if the backslide narrative is really accurate, or if we're looking back at the myth of history rather than the facts. When the country was founded, only white men could vote and people of color were legally property with no rights. That's obviously not democracy, so I question at what point after that but before today we really had democracy to have slid back from.
Think of democracy as multivariable. One variable is the percentage of the population that are enfranchised. The other is how responsive the political, legal and economic systems are to the needs of everyday people.
America started as an elitist Republic. Slowly, variable #1 grew and shrank in fits and starts, but variable #2 changed very little. Until we get to the 1930s, and then variable #2 explodes wide open. Then in the 1960s variable #1 breaks wide open as well.
By the 70s we are probably at the high point of both. Since then #2 has significantly eroded. And now, with the Court's punches at the Voting Rights Act, #1 is under threat as well.
I don't mean it in the usual internet-guy sense. I don't see America as having a pure past. I believe America was an elitist Republic with very slow steps towards democracy. To me America only turned the corner towards becoming a democracy in the 1930s. It was a bumpy up and down from there, with a slump in the last 40 years.
When we realize it's really only from about the 1970s that we had full enfranchisement and political participation of all citizens, this becomes more obvious. "Coincidentally," this enfranchisement was followed by the Volcker shock and then the Reagan administration, both of which led to the decimation of labor's political power and share of the economic pie.
> This is the job of the government. The ultimate effect of virtue wanking CEOs and corporate governance is to deceive people into thinking democracy is something that can be achieved by for profit organizations
Sam Altman says the opposite of what you're insinuating, if by "virtue wanking CEO deceiving people" you are referring to the subject of the article, Sam Altman. He says he wants a global regulatory framework enacted by government and decided upon democratically.
CEOs and companies can, and should, act ethically. Not just because it's the "right thing to do", but because it's the best way to guarantee the integrity of the brand in the long term.
CEOs and companies can act ethically as long as that aligns with the interests of the shareholders. The reality is that at some point this becomes impossible, even for those with the best intentions. "Don't be evil" ring any bells?
To be fair, Google has done a lot of shitty things, annoying things, short-sighted things, even unfair things. But I can't, offhand, think of anything "evil." Like, we're purposely going to fuck with this guy (or group). Examples?
> CEOs and companies can, and should, act ethically.
I can and should always drive the speed limit. But that doesn't mean I do, which is why highway patrol exists, to keep people in check. "Should" is such a worthless word when it comes to these discussions because if you believe that an executive needs to act a certain way but you don't believe it enough that some sort of check is placed on them, then you must not believe in importance of their good behavior that strongly.
I feel I made it clear in my post that individual integrity matters and has real consequences. But you as a citizen have (1) no way of validating a CEO's real intentions and (2) no recourse when that CEO fails to live up to those intentions. If you only fight for the protections you want once you need them, you will be at a serious disadvantage to win them.
I'm not sure you could ever say that a company can act ethically. People within the company may act ethically, but the company itself is just a legal entity that represents a group of people. The company has no consideration of ethics to act ethically.
A company that is composed of 100% ethical actors may one day have all their employees quit and replaced with 100% unethical actors. Yet the fundamental things that make the company that company would not have changed.
The third way is to build AI technology that empowers the individual against state and corporate power alike. Democracy got us here. It cannot get us out.
Given that we don't have such AI technology at present, would it not be prudent for us to assume that it may not be available imminently and plan for how we can address the problem without it?
AI tools are built and controlled by corporations with a profit motive. They aren't in the business of empowering anyone. If they do then it's just a side effect.
That's not a stable equilibrium. Blogs gave individuals asymmetric control over disseminating information - it didn't last. If you don't create institutions and power structures that cement and defend some ability of individuals, it will decay as that power is usurped by whatever institutions and power structures benefit from doing so.
> The third way is to build AI technology that empowers the individual against state and corporate power alike
Ha, I'm OK with that as long as I get to pick the individual!
I mean, an AGI under the control of some individual could indeed make them more powerful than a corporation or even a state, but whether it increases average individual freedom is another question.
I agree that it's the government's role but I think you can look a bit beyond the law itself, which is often hard to get right, especially in very fresh new domains. Some nice behaviors can be induced by mere fear of government intervention and fear of future laws, and I think we're seeing some of that now.
It's all about risk and reward; nobody who owns OpenAI is going to say "let's pause." It never stopped the country that first invented the nuclear bomb either; sure, they could have paused, and then Russia would have done it and had it first, and then said "thanks for pausing."
I don't know why people think nukes are a good example here. Nukes were outright birthed within the government, at the height of its intervention into the market, at the height of its reach into the daily lives of every American, at the height of American civic engagement.
Policy makers spent a huge amount of time creating a framework for them. Specifically there was a huge debate about whether they should be under the direct control of the military. The careful decision to place them in civilian control under the Department of Energy is probably part of the reason they haven't been unleashed since.
This is kinda weird. It's not illegal to be an asshole. You also don't want to live in a country where it's illegal to be an asshole. You also don't want to live in a country where everyone is constantly an asshole.
What are you actually asking for here? Have you thought through the implications of what you're saying here?
The way we normally deal with assholes is we shame them. But then there's people like you saying we can't expect anything better from certain kinds of assholes. Well, yeah, when you tell an asshole to keep on being an asshole, what do you expect.
They're saying that companies are much more likely to be consistently ethical if they're forced to be by the government, assuming the regulations the government makes are themselves ethical. Even if you accept that Sam Altman, or any CEO, has the people's best interests at heart, it doesn't matter; the people shouldn't have to depend on the CEOs of powerful companies being nice people to ensure they're not harmed. They're not saying it should be illegal to be an asshole, they're saying it should be illegal to run your company unethically.
Sounds like they're asking for government to take the reins w.r.t world impacting decisions rather than relying on the fortune of the big people in tech to make kind, thoughtful and well-planned decisions for our collective humanities' future.
>The way we normally deal with assholes is we shame them.
No, I don't, I avoid them -- and I find that the tactic you mentioned is embraced more and more commonly, and even purported to be the preferred method, as if that's just the common-sense reaction. It's not, and I disagree with that behavior vehemently.
I think society has to figure out how much power you give one person or a small group of people, whether it's an emperor, a chairman, a priest, a CEO, a guru, etc.
>> The higher that niceness exists in the social order, the greater its contemporary benefit
Could also be said that the more niceness, the more likely that not-nice actors will violently overthrow the nice ones.
As an example, we have had numerous bee hives at our place. The bees that never mess with us when we come near their hives are more likely to get robbed by another hive and destroyed. The bees that would sting us when we came near were rock solid - they never got robbed or collapsed.
Seriously? You can't think of how it was useful that the government mandated that factory doors must remain open to help prevent human deaths in case of a fire? You can't think of the advantages of governmental food and medicine safety obligations?
The government has had some wins, and it would be naive to say they haven’t.
They have also had many catastrophic failures.
It’s unclear whether in the end regulations trend towards net benefit, but it does seem likely that the more nebulous a problem, the harder it is for government to get it right. Or anyone, for that matter. But especially government because the feedback loop is so slow and bad.
The problem is that I don’t believe we have any organization in government currently staffed and active that I trust to take any action that will benefit the public at large.
The problem space is too confusing, and the people making decisions are too incompetent. It’s a huge skills and knowledge gap.
And that’s without factoring in corruption and bad intentions.
That's completely misguided.
Consumers purchase from companies they like.
Why do you think companies are all woke and virtue signalling? Because they interpreted the vocal woke minority as the voice of the country and they want to capture that market.
Corporations will absolutely try to do good in order to maximise their profit.
And this is ignoring private charities which get more done than any government has ever done.
Collective interest is a nice concept but the government, like all large organizations, is not capable of moving in any direction.
Whatever you need done, chances are someone's cousin will get a job, the job will be done poorly and the taxpayers will pay more in taxes to fix the problem again and again and again.
A government can't fail and it's therefore inefficient.
Your post ignores the existence of democracies. Governments fail all the time. In a democracy failures of government will often yield a total collapse and complete replacement. If a failure is spectacular enough, these failure often come with constitutional reforms or even revolutions.
In addition, democratic governments (and even many autocratic governments) have some level of distribution of power. Your small municipal government may very well end up being absorbed into your neighboring municipality because it is more efficient. Maybe an intermediate judicial step is introduced at a county, or even country, level.
Governments do try, and often succeed in, making your freedom and your interaction with society at large as efficient as possible, while trying to maximize your happiness. (Although I'll admit that way too often democratic governments act in a way that maximizes the profits of the wealthy class more than your happiness.)
The market will follow its incentives. You shape those incentives with laws. If you want a market that allows people to take risks, you do that by inventing the limited liability corporation, not by telling people to be nice and not to pursue their debts into their debtors' personal property. If you want a market that discourages monopoly, you do that by regulating combination, not by writing articles about how "good businessmen" don't act anticompetitively.
If you're talking about the US (Fed) government then please don't hold your breath. They're not interested in the best interests of the many. They're too busy rubbing each other, counting their money and deflecting the blame elsewhere.
Will they act on AI? Yeah, probably. Will it smell of crony capitalism? Yes, absolutely.
While I don't disagree that having mostly good people generally isn't effective long-term security against all the evil people, I also dispute the main claim somewhat.
> Stop asking the market to look after collective interest. This is the job of the government.
That's the thing: our government, society, is simply an emergent, collective agent of individuals. If most people don't actually have in mind the importance of altruism, then the whole thing doesn't work:
(a) The government can't watch our every little move (as-is);
(b) The more the government watches us, the more risks we take with authoritarianism (even totalitarianism);
(c) Having everyone extremely social-good disinterested is extremely inefficient: you increase policing costs, legal costs, regulatory costs, it increases complexity arbitrarily and makes the whole thing grind to a halt (see highly corrupt cultures);
(d) Moreover, the disinterest becomes dangerous because even if the mission of the government were to steer social-good disinterested parties to somehow achieve collective good societies (whose success I question), a disinterested population doesn't care to vote for well-meaning politicians, and doesn't have the ethical and social understanding to make good choices in this area. And then when the whole culture is social-good disinterested, it's hard to imagine somehow all government workers would be magically pro-social angels, and not as corrupt as the rest of the population.
In conclusion, the entire backbone of a society is ethical enlightenment (of course, alongside other things like effectiveness, and ethical effectiveness). If individuals themselves are not ethically enlightened, no social good outcomes are possible, be it from governments, markets, or various organizations like non-profits and coops.
I think the market would be looking after our collective interest if this enlightenment was more widespread. And then we could better discuss mechanisms (and general structural innovations -- which I think are very important) to keep the few inevitable problem cases and unenlightened psychopaths from ruining it for the rest of us :)
In a way, societies are made of goodness (cooperation).
This is socialist nonsense. The government won't protect anything outside of their interests[0]. Free markets are good and necessary for human flourishing[1].
In general governments have done far more harm to individuals than anyone in the market. The job of the government is to protect your individual rights not the collective interest.
That's just quibbling over what "your individual rights" are; where does the line get drawn between "exercising my right" and "having my rights infringed on by the actions of another." There is no shortage of harm done by "anyone in the market" today, whether it's currently illegal and we call it "crime" instead of just a person exercising their freedom, or whether it's harm that isn't regulated today.
At very least they both share immense responsibility for causing individual harm. Sure the government may start a war, but that war can’t happen without bombs and bullets, and in America at least those factories aren’t run by the government. There is an intermediate step oftentimes, but I don’t think that necessarily disconnects companies from responsibility.
If you work at a guided bomb factory you may not be the person dropping it, but you are responsible for the destruction it causes in a small way.
Also, if global warming kills us all then it is likely that the oil companies bear some responsibility for it right?
Government sucks - I agree with that statement, but we shouldn’t act like corporations are appreciably less responsible.
They are worse together. Achieving some fine balance of corporations may seem somewhat utopian but we are pretty far from utopia in the current day.
Building a mega corporation without big government/s I would argue is basically impossible. And local level governance is more likely and potent without big government. Again though, all of that is quite hard to achieve / see how to achieve when people with existing power enjoy the status quo control more of the levers than the masses, including the ones used to influence the masses.
If nobody were able to socialise the cost of going into a foreign country and killing people, there would be no war.
If the government didn't steal my money against my will on threat of incarceration, there is no way in hell I'd spend my money on bullets to kill someone's son in another country.
In the 20th century governments slaughtered wholesale around 100 million of their own citizens (China, Russia, Cambodia), let alone those of other countries they killed in war. There's no measure by which the market comes anywhere near that amount of murder.
The market (german industrialists in case of Hitler, military-industrial complex in case of USA, East India Company in case of GB if you want to go deeper in history) has its quite significant share. Ignoring that is being willfully ideologically blind.
The absence of government just leads to a situation in which some group takes control of a given area. In effect government will then exist again. During the absence of government there will be chaos and rampant crime.
>Besides, there is an alternative model where there are competing groups of people and I can pick the best among them based on price and services.
In the absence of government, what's stopping these groups from simply joining forces into a cartel, getting some armed thugs and making you an offer you can't refuse? History suggests that to be a far more likely scenario. Oligarchy, rather than competition, is the natural state of capitalism.
The Constitution is just a piece of paper. For every thousand steps the Congress, regulators, and state assemblies take in the direction of tyranny, the courts claw back one or two.