I don't think AI trained on my creative output (not just code, mind you) is a problem per se.
What is a problem, in my opinion, is the tendency of large corporations, and the small circles atop them, to monopolize access to these models. And when some of the functionality does become available to the public, it arrives through a very paternalistic, corporate, puritan censorship pipeline.
If you train artificial intelligence on our hard-won data, the resulting artifact should be available to us. Stable Diffusion executed this very well.
The problem is, there may be no stability.ai for the general-purpose multimodal AIs coming this decade, and this technology is dystopia fuel when it's owned by a select few.
AI should be democratized.