As you approach the limit of intelligence, the danger that you'll realise altruism works for the whole but not for the individual must increase, right?

Are you saying that as people get smarter, they're less likely to think satisfaction in life comes from being someone who makes the world better? That they're more likely to think satisfaction in life comes from getting more stuff than the next guy?


I think if you engineer for maximum intelligence, yes.

I want to make the world better, but it's a bad use of my time from an individual perspective.