The market is different, and so is the supply. The market for artisanal cutlery is basically an art market. The programmer supply today is closer to a factory worker approaching standardization. There IS an art market for software, in the indie gaming space, so perhaps that will survive (and AI could actually help individual creators tremendously). But the workaday enterprise developer's days are numbered. The great irony being that all the work we've done to standardize, framework-ize the work makes us more fungible and replaceable by AI.
The result I foresee is a further concentration of power into the hands of those with capital enough to own data centers with AI-capable hardware; the petite bourgeoisie will shrink to those able to maintain that hardware and (perhaps) to act as a finishing interface between the AI's output and the human controlling the capital that placed the order. It definitely harms the value proposition of people whose main talent is understanding computers well enough to make useful software with them. THAT is rapidly commoditizing.
> The great irony being that all the work we've done to standardize, framework-ize the work makes us more fungible and replaceable by AI.
I mean, at some level, this is what frameworks were meant to do: give you a loose outline and do all that messy design stuff for you. In other words: commodify some amount of software design skill. And I’m not saying that’s bad.
Definitely puts a different spin on the people who get mad at you in the comment section when you suggest it's possible to build something without a framework, though!
Since AI has been trained on the generous gifts of the collective (books, code repos, art, ..), it raises the question of why normal societies would not start to regulate it as a collective good. I can foresee two forces working against society reclaiming it:
- The dominance of neoliberal thought, with its strong belief that for any disease markets will be the cure.
- Strong lobbying from big corporations.
You don't want to intervene too early, but you have to make sure you have at least some limits in place before you let the winners do too much damage. The EU has to be applauded for taking a critical look at what effects these developments might have, for instance which sectors will face unemployment.
That is in the interest of both people and business, because the winner takes it all means economic and scientific stagnation. I fear that 90% of the world's data is already in the hands of just a few behemoths, so there is already no level playing field (which is btw caused by the aforementioned dominance of neoliberalism).
The sectors of work that have been largely pushed out of the economy in recent decades have not been defended by serious state policy. In fact, there are whole groups of crucial workers, like teachers or nurses, who are kept around, barely surviving, in many countries. The groups protected by the state tend to be heavily organized and directly related to the exploitation of strategic natural resources, like farmers or miners.
There is no particular sympathy towards programmers in society, I don't think. Based on what I observe, calling the mood neutral would be fair, and this is mostly because the group has expanded and far more people have someone in their family benefiting from IT. I don't see why there would be a big intervention for programmers. Artists maybe, but they are proverbially poor anyway, and the ones with popular clout tended to somehow get rich despite the business models of culture changing.
I am all for copyright reform etc., but I don't see making culture a public good, in a way that directly leads to more artisanal creators, as anything straightforward. It would have to entail some heavier and non-obvious (even if desirable) changes to the economic system. It's debatable whether code is culture anyway, though I could see an argument for software like Linux and other tools.
> I fear that 90% of the world's data
Don't wanna go off on a tangent in this already long post, but I'd dispute whether that data really reflects all the knowledge we've accumulated in books (particularly non-English ones) and otherwise not put into reachable and digestible formats. Meaning, sure, they have the data, they can target individual people with the private information they hold on them, but this isn't the full accumulation of human knowledge that is objectively useful.
> There is no particular sympathy towards programmers in society, I don't think.
The concern policymakers have is not about programmers, but about boatloads of other people having no time to adapt to the massive wave these policymakers see coming.
There are strong signals that anyone who produces text, speech, pictures, or whatever is going to be affected by it. If the value of labor goes down, if a large part of humanity can no longer reach a level where they can meaningfully contribute, if productivity eclipses demand growth, you will simply see lots of people left behind.
Strong societies depend on strong middle classes. If the middle class slips, so will the economy, so that's no good news for blue-collar workers either. AI has the potential to suffocate the organism that created it.
> AI has been trained on the generous gifts of the collective
Will be interesting to see how the various copyright lawsuits pan out. In some ways I hope they succeed, as it would mean clawing back those gifts from an amorphous entity that would displace us (all?). In some ways I hope we can resolve the gift problem by giving every human equity in the products built from the collective value of the training data they contributed.
> winner takes it all means economic and scientific stagnation
Given the apparent lack of awareness or knowledge of philosophy, history, or current events, it seems like a tough row to hoe getting the general public on board with this (correct) idea. Heck, we can't even pass a law overturning Citizens United, the importance of which is arguably even less abstract.
When the tide of stupidity grows insurmountable, and The People cannot be stopped from self-harm, you get collapse, and the only way to survive it is to live within a pocket of reason, to carry the torch of civilization forward as best you can.
> When the tide of stupidity grows insurmountable, and The People cannot be stopped from self-harm, you get collapse,
Yes, people are unfortunately highly unaware of the societal ecosystem they depend on, and so cannot prioritize what is important. These topics don't sell in the media.