I have felt the same since Stable Diffusion came out.
The thing is, things have value in society partly because human effort was involved in their making. It's not just about the end result; people still go to concerts on top of listening to studio recordings, for example, and people still watch humans play chess even though it's clear that good enough algorithms can beat the best humans easily. Technologies like these, which take away too much immediate effort (the hours needed to create the product) and long-term effort (decades of training), inherently lack the underlying value I spoke of. Of course, if a person is only interested in consumption, it matters not how the "thing" is created.
Much of the sense of doom I have comes from the inherent erosion of this human-effort element in the creative process. Whether we like it or not, the availability of mass-produced content naturally threatens the crafts themselves. After all, nobody wants to spend a few decades honing their skills only to have their creation compared to an AI-generated image produced in a few seconds.
I understand there is a lot of hype about what these technologies will do for "humanity," but I have yet to see it. It just feels like more power consolidation for billionaires (especially when done as ClosedAI). There are artists who have tried to incorporate these tools, but they have always felt the need to deliberately not label their work as AI-generated or AI-assisted in order to sell (while still leaving in enough details for keen observers to tell it's AI-touched).
As a whole, it just feels wrong. The most optimistic (and reasonable) take I have seen is "just wait and see." It might feel like a non-argument, but it's the only realistic position between the hyped-up techbros and the doomer cult (admittedly, I might belong to the latter group).
I think one of the most worrying things for me is that regardless of how this plays out, this technology has only added more complexity to our society. That people are divided into camps over how they feel about it is simply a symptom of how much uncertainty there is about the future. This last bit is a personal quarrel, but I have lost any remaining desire to have children seeing the pace of AI advancement. It's not right to create sentient life in an age where, every year, people have to play a lottery to see whether technological advancement has deemed their lifelong effort unworthy.
I think you're right. A large part of the joy of creative endeavours is actually getting good at something, and having other people enjoy your work. In the face of instant, high-quality generative AI placating the entertainment needs of the masses, we are creating a society where most people are unable to enjoy human creative expression, in part because human artists are just too slow. Attention spans are already shrinking, and after getting used to generative AI, few people will have the patience to wait for an author to write the second part of their magnum opus.