The idea that intelligence is disconnected from "baser" instincts like emotion, the need to eat, the need to reproduce, the need for social recognition, etc. is probably just false. It's akin to believing there's a useful measure, IQ, that predicts the ability to solve the world's problems (hint: nope). Our story-telling mind can construct all kinds of intelligent hypotheses, but it probably evolved to make us appear rational to our fellow humans and to attribute agency wherever possible. Our wander-through-the-woods mind can visualize and hypothesize about spatial relations, transformations, etc.

There's much left to do for AGI, but I believe motivation engineering will be the hardest part. Morality is intrinsically connected to our role as sorta-hive-minded monkeys.

Caveat: all of the above is poorly presented opinion drawn from the following resources:

- Learning how to learn on coursera

- Buddhism and modern psychology on coursera

- The Righteous Mind by Haidt

- The Happiness Hypothesis by Haidt

- Bullshit as an honest indicator of intelligence

Building a machine using our evolutionary history as the design prior is the only way we know of to produce general intelligence, but all the strange, varied, "emotional" baggage that comes with it means we probably never would want to. Why would a computer need to be afraid of snakes? If what you want is a computer that can come up with solutions you wouldn't have imagined, then you need clever search, good problem specification, and significant computation, not general intelligence. If you want to automate something, you may need learning, but you don't need intelligence.