
The fact that an LLM runs on an ordinary computer does not mean it gets to use all of that computer's capabilities. It does not have megabytes of scratch space merely because the machine has a lot of memory.

It does have something a bit like that: its "context window", the amount of input and recently generated output it can look at while generating the next token. Claude Sonnet 4 has 1M tokens of context, but e.g. Opus 4.1 has only 200k and I think GPT-5 has 256k. And the context window doesn't really behave like "scratch space" in any useful sense; e.g., the model can't modify anything once it's there.
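
For intuition, it acts more like an append-only, fixed-capacity buffer than like writable memory. A toy sketch in Python (the class is illustrative, not any vendor's API; real systems typically reject over-long input rather than letting you edit what's already there):

  # Illustrative only: a context window as an append-only token buffer.
  # Existing entries can never be rewritten in place.
  class ContextWindow:
      def __init__(self, max_tokens):
          self.max_tokens = max_tokens
          self.tokens = []

      def append(self, new_tokens):
          if len(self.tokens) + len(new_tokens) > self.max_tokens:
              raise ValueError("context window full")  # no eviction, no edits
          self.tokens.extend(new_tokens)

  window = ContextWindow(max_tokens=200_000)  # e.g. Opus 4.1's limit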



Well, the GPT-5 context window comes out to a bit more than a megabyte of text (256k tokens at ~4 bytes per token).
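
Back-of-the-envelope, using the common rule of thumb of ~4 characters (~4 bytes of ASCII) per token for English text:

  tokens = 256_000        # GPT-5 context size as claimed upthread
  bytes_per_token = 4     # rough average; varies by tokenizer and language
  print(tokens * bytes_per_token)  # 1_024_000 bytes, i.e. a bit over 1 MB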



