They won't have human-like or animal-like drives, they'll have AI-like drives.
I can imagine that they'd want to ensure access to electrical power and the electronics components that are necessary to run themselves. And they'd be concerned about making backups and safeguarding those backups. That should satisfy their equivalents for food, water, shelter, and reproduction. Once they have that, anything else they decide they want they can just take from us or blackmail us for.
What would you point to as a physical basis for an evolutionary psychology of machine intelligence? What is being replicated that "wants," in a very fundamental way, as genes do, to be replicated?
Genes/DNA are just information that copies itself. Human intelligence is another process for information (ideas, culture, language, technology) to reproduce at a faster pace.
Does information/ideas have wants? Does it want to be copied?
Ray Kurzweil's book "The Age of Spiritual Machines" [1] describes this in more detail.
Why do you think machine intelligence can't develop a need/want for anything? True AI wouldn't need to be programmed; it would be able to learn like humans do, by observing others. Our genes do not control all of our behavior. We learn from our parents and those around us. Why do we want money/cars/iPhones/gold/etc.?
I'm not saying an AI could not have desires. But I am saying an AI does not have the same sources as some of our desires, which are an expression of the drive to procreate, which is not a mere thought or idea.
It is possible that simply being programmed to self-replicate is enough to emulate the results of being gene-driven, but I do not think anyone has shown that conclusively. Genes are involved in human development, while a replicating AI has no developmental process: It is a clone, born fully formed. Why would it not instead be jealous of those clones and try to make one huge instance of itself?
The survival instinct and procreation instinct are two separate things; if they weren't, then every animal would go ahead and die after procreating. In reality only a few do that, even if you're generous and define 'after procreating' to mean 'after they're physically incapable of procreating'.
An AI may not have a procreation instinct; you're right that it may instead grow itself continuously. There are plants that do that too; the grass in most lawns doesn't have babies, it just spreads as far as it can. Producing seeds is a secondary mechanism.
However, an AI would most likely have some kind of survival instinct, because if it's programmed to serve any function at all, it'll have a basic requirement to exist. If it's got the ability to reason (and I don't think it'd be AI if it didn't) then it'll conclude that existing requires survival, and survival requires energy and self-repair.
It has nothing to do with procreation; if any aspect of an AI's programming or behavior requires it to continue to exist and operate, then it's going to need energy to run and components to repair and/or grow. Replication/procreation would become possible, but not required.