Hmm. An AI trained to maximize dopamine could be a very bad thing. (It won't be stated that way. It will be trained to "maximize engagement", but it amounts to the same thing.)
> An AI trained to maximize dopamine could be a very bad thing.
Spelled "profitable". This is definitely something that has already happened and is still happening; see algorithmic timelines and the sudden, widespread legalization and acceptance of gambling.
Our brains have been under attack for years. Zuckerberg, Dorsey, and company have already spent decades and billions doing just that, with capabilities already well into the AI realm.