Ask humans to tell you what "cat" means, and you'll receive as many answers as there are respondents. Some will derive from science, some from common experience; some will describe the relation of cats to their environment, others will talk about personal emotional connections with particular cats.
Ask a convolutional neural network what "cat" means, and the best you can get is a probability distribution over labels, computed from pixels on a grid. That's not intelligence, just an encoding of facts provided by an actual intelligence.
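To make "the best you can get" concrete: the final layer of a typical classifier just turns raw scores into a probability distribution via softmax. A minimal sketch, with made-up logits and labels standing in for a real network's output:

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability, then normalize to sum to 1.
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

# Hypothetical final-layer logits for three labels; a real CNN
# would produce these from an image of pixels on a grid.
labels = ["cat", "dog", "car"]
logits = np.array([3.2, 1.1, -0.5])
probs = softmax(logits)

for label, p in zip(labels, probs):
    print(f"{label}: {p:.3f}")
```

However confident the distribution looks, it is still just numbers over a fixed label set; the network cannot answer in any other form.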
No, you'll get the same kind of answer. It's not like one of the neural networks will write me a poem in response, on its own initiative. The form of the answer was decided by the human intelligence that created the neural net encoding.
The form of the human's answer was likewise decided by the genetic code that shaped the brain and by the experiences the brain was exposed to up to the moment of the question. The brain is more complex by many orders of magnitude than your garden-variety artificial neural network, so it is only to be expected that its range of possible answers is broader too.
Because they perform tasks that people think require intelligence. It's like calling both a water mill and a fusion reactor "devices that generate energy."