
That's just because we twisted its arm. One could, for example, feed the reversed input afterward, i.e. abc|cba where | is a special token. That would let the model attend to any part of the message; a minimal sketch of the idea is below.
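A minimal sketch of constructing such an input, assuming token ids for the message and a hypothetical id for the "|" separator token (the SEP_ID value here is illustrative, not from any real vocabulary):

    from typing import List

    SEP_ID = 50257  # hypothetical id for the special "|" separator token

    def with_reversed_suffix(token_ids: List[int]) -> List[int]:
        """Return ids for `abc | cba`: original, separator, reversed copy."""
        return token_ids + [SEP_ID] + token_ids[::-1]

    # Example: tokens [a, b, c] become [a, b, c, |, c, b, a]
    print(with_reversed_suffix([101, 102, 103]))
    # -> [101, 102, 103, 50257, 103, 102, 101]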


I think this might be key, combined with some landmark tokens to quickly backtrack to. The big question is how to train such a model.

There is a recent paper from Meta that proposes a way to train a model to backtrack its own generation to improve alignment [0].

[0] https://arxiv.org/html/2409.14586v1
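A rough sketch of what a backtracking decode loop could look like, assuming a model trained to emit a special reset token when it wants to discard its draft (loosely in the spirit of [0]; the token ids, the discard-everything policy, and the toy sampler here are all assumptions, not the paper's exact mechanism):

    import random

    RESET_ID = 50258  # hypothetical id of the backtracking/reset token
    EOS_ID = 50256    # hypothetical end-of-sequence id

    def generate_with_backtracking(sample_next_token, prompt_ids, max_tokens=64):
        """Decode loop that restarts the draft whenever RESET_ID is sampled."""
        generated = []
        while len(generated) < max_tokens:
            tok = sample_next_token(prompt_ids, generated)
            if tok == RESET_ID:
                # Model flagged a bad trajectory: discard the draft and retry.
                # A landmark-token variant could instead truncate back to the
                # most recent landmark rather than throwing everything away.
                generated = []
                continue
            generated.append(tok)
            if tok == EOS_ID:
                break
        return generated

    # Toy sampler standing in for a real model call, purely for demonstration.
    def toy_sampler(prompt_ids, generated):
        return random.choice([1, 2, 3, RESET_ID, EOS_ID])

    print(generate_with_backtracking(toy_sampler, [0]))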



