This concept is already enshrined in law: safe harbor. So long as a service provider doesn't do its own curation, it is not held responsible for the content that is posted. However, if it does curate, then it is responsible.
Applying this to Twitter, Facebook et al. is not that big of a leap.
> completely destroys the business model of several hundred billion dollar businesses
They are not entitled to their business model, especially not at the price of trampling on something broadly considered to be an inherent human right.
>So long as a service provider doesn't do their own curation, they are not held responsible for the content that is posted.
Except they are held responsible if they don't curate. Look at laws like SESTA to see how platforms that don't self-curate content that could sexualize minors are legally liable.
I'm not saying SESTA is bad, I'm saying this idea that platforms need to be hands-off towards curation to maintain safe harbor protection is not true.
Which is exactly the point: platforms that do not self-curate some types of user-generated content are not protected by safe harbor laws.
Your idea that safe harbor laws only apply to platforms that don't self-curate is absurd precisely because there is illegal content for which the platforms, rather than the users, can be held liable.