
Been seeing this argument more and more. Is there a name for it?

“Bad thing X has been happening forever.

AI _guarantees_ X will get exponentially worse, but it lets me do (arguably) good thing Y, so it's okay."




