Automation: the bare minimum would be to scan uploads against known child sexual abuse material (CSAM) hashes. If you're not doing that, opening up anonymous uploads is very risky, because for CSAM (unlike most other things) you may be personally liable even if it's distributed there without your knowledge. Cloudflare's CSAM scanning tool is one option that may help; there are others.
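As a minimal sketch of the pattern, the check belongs in front of durable storage, not after it. The file name `known_hashes.txt` and the function names below are hypothetical, and a plain cryptographic hash only catches byte-identical files; real deployments rely on perceptual-hash matching through vetted services (e.g. Cloudflare's tool, or PhotoDNA via NCMEC) rather than anything you roll yourself.

```python
import hashlib
from pathlib import Path

def load_denylist(path: str = "known_hashes.txt") -> set[str]:
    """Load known-bad SHA-256 digests, one hex string per line (hypothetical source)."""
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}

def screen_upload(data: bytes, denylist: set[str]) -> bool:
    """Return True if the upload may be stored, False if it must be
    blocked and escalated per local reporting requirements."""
    return hashlib.sha256(data).hexdigest() not in denylist

# Usage: reject before the bytes ever reach durable storage.
# denylist = load_denylist()
# if not screen_upload(uploaded_bytes, denylist):
#     reject_and_report()   # hypothetical handler
```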
You can't rely on the good faith of users; if your service is easily usable for crime, it will be used for it.
"You can't rely on the good faith of users; if your service is easily usable for crime, it will be used for it." - should be on every developer's login screen
And every developer needs to explain this to clients.
I had a client who wanted to defer identity validation on a two-sided marketplace. I had to explain how it would be used for money laundering; it had never occurred to them.