Ask HN: Is the LLM token window the limitation for replacing our jobs?
1 point by nik736 on March 14, 2024
Hey all! I obviously saw the Devin demo, and while I think it's just vaporware at this point[0], it got me thinking about what is actually missing to make something like it work. Such a tool should probably have access to most if not all of a codebase during a conversation to make good choices: the directory tree available at all times, knowledge of what is actually in the relevant files (not necessarily all of them), and a sense of how to add features efficiently. All of this boils down to a huge token window. Currently we are fairly limited: with GPT-4 through the UI it's still only 4K tokens, and the API apparently supports 8K and 32K. But for a genuinely useful "AI software engineer", what do we need at minimum, more than 10M tokens?
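
For a sense of scale, here is a rough sketch of counting a repo's tokens with OpenAI's tiktoken library (the repo path, file filter, and encoding choice are my own assumptions, nothing from the Devin demo):

    # Rough sketch: how many tokens would a codebase occupy in a prompt?
    # Assumes the tiktoken package and the cl100k_base encoding used by GPT-4;
    # "my-repo" and the *.py filter are placeholders.
    import pathlib
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    total = 0
    for path in pathlib.Path("my-repo").rglob("*.py"):
        text = path.read_text(errors="ignore")
        total += len(enc.encode(text))
    print(f"~{total:,} tokens")  # even mid-sized repos can land in the millions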

What are the actual technical limitations to holding 10M tokens in a single conversation? Is there a hardware limitation that we are 5+ years away from overcoming, or is it a matter of getting as many current-generation GPUs as possible and solving the rest on the software side?
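
My rough understanding is that the KV cache is one concrete hardware bottleneck: its memory grows linearly with context length (and attention compute grows quadratically). A back-of-envelope sketch, where the model dimensions are my assumptions (roughly a GPT-3-class configuration, not GPT-4's, which isn't public):

    # Back-of-envelope: KV cache memory for a 10M-token context.
    layers     = 96          # assumed transformer depth
    kv_heads   = 96          # assumed number of key/value heads
    head_dim   = 128         # assumed per-head dimension
    bytes_fp16 = 2           # fp16 storage
    tokens     = 10_000_000

    # 2x for keys and values, per layer, per head, per head dimension.
    kv_cache_bytes = 2 * layers * kv_heads * head_dim * bytes_fp16 * tokens
    print(f"{kv_cache_bytes / 1e12:.0f} TB of KV cache")          # ~47 TB
    print(f"~{kv_cache_bytes / 80e9:.0f} H100s just for the cache")  # ~590, at 80 GB each

So even before compute costs, just holding the cache for a 10M-token conversation under these assumptions would take hundreds of today's GPUs, which is presumably why long-context work leans on retrieval, sparse attention, and similar software-side tricks rather than brute force.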

[0]: https://www.reddit.com/r/cscareerquestions/comments/1bd12gc/comment/kujyidr/



