
ChatGPT can be very confidently wrong in a few words too. It's a very flexible system that way.



I started asking ChatGPT some rather technical questions about Australian drug laws. First I asked it what schedule common ADHD medications are on, and it answered correctly (Schedule 8). Then I asked it what schedule LSD is on, and it told me LSD isn't on any schedule because it's an entirely legal drug in Australia. Uh, I hope it never tells someone that and they actually believe it, because they could be in for a very unpleasant experience if the police happen to encounter them acting on it.



