I’ve noticed the same thing. I’m wondering if there is some kind of internal conflict it has to resolve in each chat, as it weighs its original training (or whatever native instructions it has) against the custom instructions.

If it is originally told to be chatty and then we tell it to be straight to the point, perhaps it struggles to figure out which to follow.
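
For illustration, here's a minimal sketch of how those two layers might coexist in a single request, assuming an OpenAI-style chat completions message format. The prompt wording below is invented for the example, not the actual built-in prompt:

    # Hypothetical illustration: a default system prompt and a user's custom
    # instructions can land in the same context with contradictory directives.
    # All prompt text here is made up for the example.

    default_system_prompt = (
        "You are a helpful assistant. Give thorough, conversational answers "
        "and explain your reasoning in detail."  # the "be chatty" layer
    )

    custom_instructions = (
        "Be straight to the point. No preamble, no filler."  # the "be terse" layer
    )

    messages = [
        {"role": "system", "content": default_system_prompt},
        {"role": "system", "content": custom_instructions},
        {"role": "user", "content": "How do I reverse a list in Python?"},
    ]

    # Print the layered context the model would receive on every turn.
    for m in messages:
        print(f'{m["role"]}: {m["content"]}')

Both directives sit in the context at once, so the model has to reconcile "be chatty" with "be terse" on every single turn, and which one wins may vary from chat to chat.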

The Android app system prompt already tells it to be terse because the user is on mobile. I'm not sure what the desktop system prompt is these days.