They DO take the same care, if not more. The problem is that, just as with copyrighted content, things slip through because stochastic text generation is impossible to control 100%.
I've had the most innocuous queries trigger it to switch into crisis-counseling mode and give me numbers for help lines. Indeed, the original NYT article mentions that this man's final interactions with ChatGPT did prompt it to offer those same mental health resources:
> “You are not alone,” ChatGPT responded empathetically, and offered crisis counseling resources.
The same care could equally have been taken to avoid triggering or exacerbating adverse mental health conditions.
The fact that they've not done this speaks volumes about their priorities.