Honestly, an idea I've had with the recent surge of interest in LLMs is some type of robotic pet. I feel like LLMs can understand human language and images (hopefully soon) to the extent that a pet should be able to. Some kind of robotic dog that can not only express emotions but also understand what you say to it seems like it would be pretty successful if done correctly.
The nice thing is that it doesn't even have to understand perfectly, a GPT-4 level of vision/language understanding would be perfectly fine. It doesn't have to hold a conversation, just be able to be happy or sad when it should.
I feel like the big complication at the moment would be getting it to navigate outside of controlled environments. Mainstream tech is getting there (iPhone LIDAR mapping, Meta Quest room scanning), but there are still a lot of dots to connect for a robot far less inherently steady than a Roomba.
What I would bet we'll see at first is some of these little guys in Disney parks, in fenced-off areas or small stages that are kept meticulously clean by staff, where they can interact with visitors without needing to handle all the complications of the real world yet.
You'd be surprised how much a pet can understand. It's more like we can't understand them. They know all our tics and can sense emotions before we even realize them ourselves.