Hacker News
DeepSeek's Multi-Head Latent Attention (liorsinai.github.io)
4 points by the_origami_fox 5 months ago | 1 comment


Matrix absorption is unnecessary. What is needed is for the order of multiplication to associate toward the direction of the absorption: rather than precomputing the merged matrix, re-associate the products so the query is projected into the latent space and scored directly against the cached latents. This, together with the modified RoPE, is what makes the caching work.
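
A minimal NumPy sketch of the associativity point (the shapes and weight names here are illustrative, not taken from the paper or the post): the attention score against each cached latent comes out the same whether you materialize the absorbed matrix up front or simply reorder the per-token products, so only the latents ever need to be cached.

    import numpy as np

    # Hypothetical shapes for a single attention head.
    d_latent, d_head, n_cached = 16, 32, 8
    rng = np.random.default_rng(0)

    W_UQ = rng.normal(size=(d_head, d_latent))   # up-projects the query latent
    W_UK = rng.normal(size=(d_head, d_latent))   # up-projects the cached KV latent

    c_q  = rng.normal(size=(d_latent,))          # query latent, current token
    C_kv = rng.normal(size=(n_cached, d_latent)) # cached latents, one per past token

    # Absorbed version: materialize W_UQ^T W_UK once, score against the latents.
    W_abs = W_UQ.T @ W_UK                        # (d_latent, d_latent)
    scores_absorbed = (c_q @ W_abs) @ C_kv.T

    # No absorption: associate the products toward the cache instead.
    # q^T k = (W_UQ c_q)^T (W_UK c_kv) = (W_UK^T (W_UQ c_q))^T c_kv
    q_latent = W_UK.T @ (W_UQ @ c_q)             # pull the query into latent space
    scores_assoc = C_kv @ q_latent

    assert np.allclose(scores_absorbed, scores_assoc)

Either way the full keys are never reconstructed per cached token; the trade-off is a one-time d_latent x d_latent precompute versus one extra small matrix-vector product per query.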



