EM-LLM: Human-Inspired Episodic Memory for Infinite Context LLMs (github.com/em-llm)
37 points by jbotz 2 days ago | 2 comments
mountainriver 2 days ago
TTT, Canon layers, and Titans seem like a stronger approach IMO. Information needs to be compressed into latent space, or it becomes computationally intractable.
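A toy sketch of that compression argument (the sizes and names below are made up for illustration, not taken from TTT, Titans, or EM-LLM): full attention has to keep every past token's KV state, so per-token cost grows with history, while a fixed-size latent state keeps per-token cost constant no matter how long the stream gets.

    import numpy as np

    # Toy fixed-size latent memory: folds an unbounded token stream into a
    # constant-size state, so per-step cost is O(m*d) regardless of how many
    # tokens have been seen. Contrast with full attention, where the KV
    # cache (and per-token compute) grows with history length.
    d, m = 64, 256                               # token dim, latent size (arbitrary)
    rng = np.random.default_rng(0)
    W_in = rng.normal(scale=0.02, size=(m, d))   # write projection (toy)
    W_out = rng.normal(scale=0.02, size=(d, m))  # read projection (toy)

    state = np.zeros(m)                          # memory never grows

    def step(token_vec, state, decay=0.99):
        """Fold one token into the latent state and read it back out."""
        state = decay * state + W_in @ token_vec
        return W_out @ state, state

    for _ in range(10_000):                      # arbitrarily long stream
        x = rng.normal(size=d)
        y, state = step(x, state)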
MacsHeadroom 2 days ago
So: infinite context length by making it compute-bound instead of memory-bound. Curious how much longer this takes to run, and when it makes sense to use vs. RAG.
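For intuition on that trade-off, a loose sketch of the episodic-memory loop (my simplification; the threshold, shapes, and function names are illustrative, not the repo's API): segment the stream where per-token surprise spikes, store each event's embedding, and retrieve the most similar events per query. Resident memory stays bounded to the retrieved events, but retrieval compute grows with the size of the store, hence compute-bound rather than memory-bound.

    import numpy as np

    def segment_by_surprise(nll, threshold=2.0):
        """Cut event boundaries where token-level surprise (negative
        log-likelihood) exceeds a threshold."""
        boundaries = [0]
        for i, s in enumerate(nll):
            if s > threshold:
                boundaries.append(i)
        boundaries.append(len(nll))
        return [(a, b) for a, b in zip(boundaries, boundaries[1:]) if b > a]

    def retrieve(query_emb, event_embs, k=4):
        """Return indices of the k stored events most similar to the query
        (cosine similarity); cost grows with the size of the event store."""
        sims = event_embs @ query_emb / (
            np.linalg.norm(event_embs, axis=1) * np.linalg.norm(query_emb) + 1e-8)
        return np.argsort(-sims)[:k]

    # Toy usage: 100 tokens of fake surprise, 32-dim fake event embeddings.
    rng = np.random.default_rng(0)
    events = segment_by_surprise(np.abs(rng.normal(size=100)) * 1.5)
    embs = rng.normal(size=(len(events), 32))
    top = retrieve(rng.normal(size=32), embs)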