Yes. I asked it to write a 50-word description of some text I gave it, and it wrote a single 10-word sentence. I told it that was wrong and to do it again, this time writing a 50-word description; it failed again. After the fifth attempt, I did it myself. This is a basic example. I've been using ChatGPT for over a year and have loved it up until now, but it feels like it was utterly lobotomized. Half the response is it repeating your question or prompt back to you.