
It's pure speculation, but the article embeddings are computed over 512 tokens, which is roughly equivalent to 400 words. I think a single-word query doesn't give the model enough context to produce a meaningful embedding.
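A minimal sketch of the idea, using made-up vectors in place of a real embedding model (a real system would encode up to ~512 tokens of article text into each vector and compare queries against them with cosine similarity):

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
dim = 384  # a typical sentence-embedding dimension (assumption)

# Hypothetical embeddings: in practice these would come from a model
# that encodes up to 512 tokens (~400 words) of article text.
article_embedding = rng.normal(size=dim)

# A single-word query gives the model almost no context; its embedding
# can land far from the article's vector even when the topics are related.
one_word_query_embedding = rng.normal(size=dim)

print(cosine(article_embedding, one_word_query_embedding))
```

The similarity score always falls in [-1, 1]; retrieval ranks articles by this score, so a context-poor query embedding degrades the whole ranking.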


