
We have the same problem with people. Somehow, we've managed to build a civilization that can, occasionally, fly people to the Moon and get them back.

Even if LLMs never get any more reliable than your average human, they're still valuable because they know much more than any single human ever could, run faster, only eat electricity, and can be scaled up without all kinds of nasty social and political problems. That's huge on its own.

Or, put another way, LLMs are kind of a concentrated digital extract of human cognitive capacity, without consciousness or personhood.




> without consciousness or personhood.

Hopefully, for the former.

It'd be a bit terrifying if it turns out "attention is all you need" for that too.


"without all kinds of nasty social and political problems"

I assure you, those still exist in AI. An AI follows whatever political dogma it was trained on, regardless of whether you point out how logically flawed it is.

If it is trained to say 1+1=3, then no matter what proofs you provide, it will not budge.



