A practical definition of "creativity" is "can create interesting things." It's pretty clear that machines have become more "creative" in that sense over the last few years.
I have yet to see ChatGPT or anything similar ask a follow-up to clarify the question. They just give you a "solution". That's the equivalent of a really bad junior dev who will cause more trouble than they solve.
That being said, I think we could make such a system. It just needs training data that demonstrates that kind of competence...
Tell them to ask you follow up questions and they will.
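As a concrete sketch of this, the instruction can go in a system prompt. This assumes the common chat-API message format (role/content dicts); the model behavior, wording, and function name are all illustrative, not a specific recommendation:

```python
# Minimal sketch: wrap a user question with a system prompt that tells the
# model to ask a clarifying follow-up instead of guessing. build_messages
# is a hypothetical helper; the message shape follows the usual chat APIs.

def build_messages(question: str) -> list[dict]:
    """Return a messages list that nudges the model toward follow-ups."""
    system_prompt = (
        "Before answering, check whether the question is ambiguous or "
        "underspecified. If it is, ask one short clarifying question "
        "instead of answering."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": question},
    ]

messages = build_messages("Why doesn't my code work?")
```

The resulting list gets passed to whatever chat-completion endpoint you're using; without that system line, most models just answer directly.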
Some systems built on top of LLMs have this built in - Perplexity searches, for example, usually ask a follow-up before running the search. I find it a bit annoying because about half the time the follow-up it asks isn't necessary to answer my original question.
> Tell them to ask you follow up questions and they will.
That's rather missing the point. If your question makes no sense, it will not ask a follow-up; it will spit out garbage. That's pretty bad. And if you're competent enough to tell it to ask follow-ups, you're probably already competent enough to either not need the tool or to ask a good question in the first place.
I have had ChatGPT suggest that I give it more data/information pretty regularly. It's not technically a question, but it accomplishes essentially the same thing: "If you give me this..." vs. "Can you give me this?"