This is insane. People aren't going to use ChatGPT to think; they're going to use it for the opposite. The fact that ChatGPT doesn't even know when it's wrong is exactly the problem. The vast majority of projects I've seen purport to replace your need for other things (developers, lawyers, etc.), but a user who isn't a domain expert won't be aware of the significant flaws in ChatGPT's answers. Heck, someone posted a ChatGPT bot that is supposed to digest legal contracts and make them understandable, but the bot couldn't get the basics of the Y Combinator SAFE right and had the directionality of the share assignment completely backwards. That's a fatal mistake that a layman won't catch.
> The fact that ChatGPT doesn't even know when it's wrong is exactly the problem.
Have you never had a conversation with someone about a topic they know nothing about, where they say or ask something wrong or stupid, but it still raises a question you hadn't thought about before?
I kind of think of ChatGPT like that: a friend who is mostly dumb, but sometimes makes my brain pull in a direction I haven't previously explored.
It's good that you think of ChatGPT like that. My point is that this is clearly not how most people envision it, nor is it how businesses are selling it.
No, what's your point? Google doesn't purport to offer legal advice, unlike the AI companies that do while simultaneously disclaiming any liability. You can be obtuse if you want, but it's completely disingenuous to compare companies purporting to offer legal advice with a Google search.
I'm thinking about ChatGPT in the general sense, as a search-engine replacement. I have no knowledge of or interest in apps that offer legal advice using ChatGPT as a back end. That seems risky unless you show a lot of disclaimers.