Not disagreeing with you that LLMs are probably not sentient, but that's neither here nor there, since LaMDA is more than a simple LLM. There are significant differences between GPT-3 and LaMDA, and we've got to stop making these false equivalences. LaMDA is fundamentally more dynamic in the ways it interacts with the world: it constructs its own queries to ground-truth sources to check its facts and then updates based on that (among many other differences). While it does incorporate LLMs, people seem to be in denial about the complexity and data access that LaMDA has relative to GPT-3. In Google's own paper about LaMDA, they demonstrated how it sometimes showed a rudimentary theory of mind by reasoning about others' perceptions.
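To make the distinction concrete, here's a minimal sketch of the kind of grounding loop described above: draft an answer, emit a query to an external toolset, and revise the draft against what comes back. This is a toy illustration under my own assumptions, not Google's actual LaMDA code; every function and the fact table are hypothetical stand-ins.

```python
# Hypothetical toolset standing in for LaMDA's external sources
# (IR system, calculator, etc.). Contents are made up for illustration.
TOOLSET = {
    "height of Mount Everest": "8,849 m",
}

def draft_response(prompt: str) -> str:
    # Stand-in for the base LLM's ungrounded first draft.
    return "Mount Everest is about 8,000 m tall."

def generate_query(draft: str) -> str:
    # Stand-in for the model emitting a research query about its own claim.
    return "height of Mount Everest"

def grounded_response(prompt: str) -> str:
    draft = draft_response(prompt)
    fact = TOOLSET.get(generate_query(draft))
    if fact is not None:
        # Revise the draft so it agrees with the retrieved fact.
        return f"Mount Everest is {fact} tall."
    return draft

print(grounded_response("How tall is Mount Everest?"))
```

The point isn't the toy logic, it's the shape: a plain LLM stops at the first draft, while a grounded system loops through an external lookup before answering.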
It's a fundamental question of sentience that folks are commenting on. I agree LaMDA has a better knowledge base and open-domain information-retrieval method.
In the words of Robert Heinlein, "One man's magic is another man's engineering" :)