Hacker News

That is certainly a very plausible and worrisome risk, and I agree that facial recognition doesn't require strong AI.

But I think there are other, possibly larger risks that result from strong AI in weapons, and plausible incentives to put it there.

Can you give some examples of those risks?


Like biological weapons, their collateral damage might be a lot less predictable, because it's hard to predict what someone who's smarter than you will do.
