
To expand on my own point: "experience" means something very specific for humans. It means you have done something for a long time, you have failed at it, and you have learned from those failures. By that definition, LLMs don't have any experience at all. They are trained into a fresh "brain", and then they make the same mistakes over and over and over, until they either get a new prompt that might correct them or are trained from scratch all over again.



What I meant is:

1. LLM generates an idea and

2. the user responds positively or negatively or

3. the user tries the idea and comes back to continue the iteration, communicating the outcomes.

For example, the LLM generates some code and I run it; if it fails, I copy-paste the error back in.

That is the (state, action, reward) tuple that defines an experience.
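
A minimal sketch of that loop in Python, assuming a hypothetical llm_generate call standing in for a real chat-completion API (the stub below just returns a canned snippet so the loop can actually be run end to end):

    import subprocess
    import sys
    import tempfile

    def llm_generate(history):
        # Hypothetical stand-in for a real chat-completion call; it takes
        # the conversation so far and returns a code string. Stubbed here.
        return 'print("hello from the generated code")'

    def run_snippet(code):
        # Write the generated code to a temp file and execute it,
        # capturing either stdout (success) or stderr (the error to paste back).
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(code)
            path = f.name
        proc = subprocess.run([sys.executable, path], capture_output=True, text=True)
        ok = proc.returncode == 0
        return ok, proc.stdout if ok else proc.stderr

    # One turn of the loop: state = conversation so far,
    # action = generated code, reward = the run outcome fed back as text.
    history = ["Write a script that prints a greeting."]  # state
    code = llm_generate(history)                          # action
    ok, feedback = run_snippet(code)                      # environment step
    history.append(feedback)                              # "reward" goes back into the prompt

Note, though, that the feedback only lives in the prompt: the model's weights never change, which is exactly the asymmetry the comment above points at.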


Sounds like the LLM facilitated a human gaining experience, by making mistakes for the human and then correcting those mistakes, likely also incorrectly. LLMs are effectively very, very bad teachers.

An LLM given the same inputs tomorrow is likely to return similar responses. If a human did that, they would likely be considered to have some sort of medical condition...



