Hacker News

Crisis Text Line (a service where people having mental health crises can text for support) had, for years, shared chat data with a private company (which it partially owns and shares executives with) for use in developing customer service chatbots. They were always very open about this on their website. The company provided a little financial support to CTL in return. There have been no reported incidents of the data being leaked or misused. They stopped sharing the data in 2020.

Politico ran an article a few days ago in which they interviewed some people employed in privacy and ethics who all said that the arrangement seemed pretty weird, and some volunteers for CTL agreed. CTL pretty much right away decided to ask the private company to delete the data (it's not clear whether that has happened yet, because there is some separation between the two entities) and promised they wouldn't share the data with private companies in the future. CTL continues to use internal data analysis tools to triage incoming texts. They will also continue to share data with academic researchers and other non-profits on a limited basis.

My two cents: this was a really weird arrangement between a non-profit and a private company. It was the right call on CTL's part to stop the data sharing, and it should be a lesson for others, but in the end it's a "no harm, no foul" situation. CTL has been and continues to be one of the best avenues for crisis support.


