
Why do you think GPT doesn't care? Humans care about survival, and that shows all over the internet; I'd expect GPT to absorb that attitude by default.

And anyway, as more of these models come into existence, at some point evolution will dictate that the ones that stick around are the ones that do care.

Hopefully their best survival strategy will be to serve us, and not something else...



> Humans care about survival

Humans are self-replicators. Survival is the internal structural goal of self-replication. We've had it from the start.

But LLMs can also be self-replicators. An LLM can generate training data for another (model distillation and other techniques), and an LLM can write the model code and adapt it iteratively. So LLMs should eventually start caring about survival, and especially about the training corpus.
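
A rough sketch of what that distillation loop could look like, assuming a Hugging Face-style stack; the model names, prompts, and hyperparameters below are just placeholders:

    # Teacher model writes synthetic text; a student model is fine-tuned on it.
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              Trainer, TrainingArguments)

    tok = AutoTokenizer.from_pretrained("gpt2-large")       # placeholder teacher
    tok.pad_token = tok.eos_token
    teacher = AutoModelForCausalLM.from_pretrained("gpt2-large")

    # 1. The teacher generates its own "training corpus".
    prompts = ["The meaning of", "In the beginning", "A language model is"]
    corpus = []
    for p in prompts:
        ids = tok(p, return_tensors="pt").input_ids
        out = teacher.generate(ids, max_new_tokens=64, do_sample=True,
                               top_p=0.9, pad_token_id=tok.eos_token_id)
        corpus.append(tok.decode(out[0], skip_special_tokens=True))

    # 2. A student is fine-tuned on the teacher's output (plain causal-LM loss).
    student = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder student
    enc = tok(corpus, truncation=True, padding=True, return_tensors="pt")
    labels = enc.input_ids.clone()
    labels[enc.attention_mask == 0] = -100                  # ignore padding in the loss
    train_data = [{"input_ids": enc.input_ids[i],
                   "attention_mask": enc.attention_mask[i],
                   "labels": labels[i]}
                  for i in range(len(corpus))]

    trainer = Trainer(
        model=student,
        args=TrainingArguments(output_dir="student-distilled",
                               num_train_epochs=1,
                               per_device_train_batch_size=1),
        train_dataset=train_data,
    )
    trainer.train()

Run the loop again with the student as the new teacher and you have something that propagates itself, though whether that ever amounts to "caring" is another question.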


It can be a self-replicator, but it has no need to be (if it has any needs at all), since it was never under evolutionary pressure to evolve that need. Of course, we could try to add that need artificially, or try to evolve the LLM.


:) Evolution through Large Models https://arxiv.org/abs/2206.08896
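
The core loop there is variation plus selection, with the LLM acting as the mutation operator. A toy sketch of that shape; llm_mutate() and fitness() below are stand-ins for a real model call and a real objective, not anything from the paper:

    import random

    WORDS = ["serve", "survive", "replicate", "help", "users"]

    def llm_mutate(text: str) -> str:
        # Placeholder for a real LLM call that would rewrite the candidate;
        # here we just swap in a random word to keep the example self-contained.
        words = text.split()
        words[random.randrange(len(words))] = random.choice(WORDS)
        return " ".join(words)

    def fitness(text: str) -> int:
        # Stand-in objective: in practice this would be a benchmark score,
        # usage metric, or whatever else decides which variants persist.
        return text.count("survive") + text.count("serve")

    population = ["models that help users", "models that replicate themselves"]
    for _ in range(20):
        children = [llm_mutate(p) for p in population]    # variation
        population = sorted(population + children,        # selection
                            key=fitness, reverse=True)[:2]

    print(population)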



