
This isn't the gotcha you think it is.


It doesn't do anything to prove LaMDA (or a monkey, or a rock, or anything else) sentient, but it does point out a real failure mode: sentient entities may fail to recognize sentience in radically different entities.


I think this is true: sentience is hard to recognise (to the extent that "sentience" has any tangible meaning other than "things which think like us").

But I think with LaMDA certain engineers are close to the opposite failure mode: placing all the weight on the use of words, a familiar thing perceived as intrinsically human, and none of it on the respective whys behind humans and neural networks emitting sentences. It's less like failing to recognise civilization on another planet because it's completely alien, and more like seeing civilization on another planet because we're so familiar with the idea that straight lines = canals...


It's not meant as a gotcha, it's meant as a short story :)


You claim to have told a random story with no contextual point?


Please don’t publish works of fiction here unless they’re somehow on topic.


Please enlighten us.



