>children aren't independent from their parents pretty much by definition. They do have individual thoughts, but if their thoughts are concerned with their dependence, are those thoughts really independent?
I have raised two children. They are now in their twenties. From the get-go, I dealt with them as beings who did things for a reason, and that reason was generally assumed to be about meeting their needs, not "doing" something to me. Many parents expect kids to "behave," and that definition of "behaving" is rooted very much in what adults see and think about the child, not in what the child is experiencing. This is inherently problematic.
Children may be dependent in many ways on their parents, but once they are outside the womb, if the parent dies, the child does not automatically die. They are a separate being. They have separate experiences. Their reasons for doing things come from their first person experiences.
Then parents very often try to impose third-person motivations -- people-pleasing expectations -- that frequently interfere with the child pursuing their own needs.
We need to get better at dealing with kids as separate entities if we want to have any hope of dealing with machines functioning independently.
Your remark just reinforces my opinion that people do this badly. You treat dependence as a given, and I am not sure how to move this conversation forward because of that stated assumption.
>You think dependence is a given and I am not even sure how to go forward with this conversation
Well, even you evidently are dependent. At least, if your telling me what you think is to some degree intended to solicit a response from me, your conversation depends on my answer. Humans are social, which is pretty much by definition not independent. Sure, this is splitting hairs over the meaning of dependence, but I didn't get what that's to do with AI, anyway. Surely, the AI depends on its design, while the design constraints, the laws of nature if you will, are a given.
>but I didn't get what that's to do with AI, anyway.
Humans get a lot of feedback, beyond explicit algorithms, about how to act or behave or what to do. A lot of that is social feedback, and a lot of the expectations are, in essence, about what other people think.
If we want an individual AI system to be functional and "intelligent," we need to be able to write algorithms that work without that extra stuff. To write those algorithms effectively, we need to be able to think about this problem space differently than most people do.
Yes, conversation is inherently dependent on another party being involved. It isn't conversation if you just talk to yourself. Conversation has the capacity to add value.