
I don't think AI trained on my creative output (not just code, mind you) is a problem per se.

What is a problem, in my opinion, is the tendency of large corporations, and the small circles atop them, to monopolize access to these models; and when some of the functionality does become available to the public, it goes through a very paternalistic, corporate, puritan censorship pipeline.

If you train artificial intelligence on our hard-won data, the resulting artifact should be available to us. StableDiffusion executed very well here.

The problem is, there may be no stability.ai for the general-purpose multimodal AIs coming this decade, and this technology is dystopia fuel when it's owned by a select few.

AI should be democratized.




This reminds me of Lex Fridman talking with Jaron Lanier about Data Dignity: https://youtu.be/Fx0G6DHMfXM?t=2609



