
That's exactly what ChatGPT is. It's specifically trained as a next-word (well, token) predictor. It has no long-term plan, only a local window of context. And inference is purely next-token prediction, using a large-scale statistical prediction model.
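For what it's worth, the decoding loop being described can be sketched in a few lines. This is a toy illustration, not OpenAI's actual implementation: `model` here is a stand-in for anything that maps a token sequence to next-token probabilities, and the greedy pick is just one sampling strategy.

```python
def generate(model, prompt_tokens, max_new_tokens, window=32768):
    """Toy autoregressive decoding: repeatedly predict one token at a time."""
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        context = tokens[-window:]   # only the last `window` tokens are visible
        probs = model(context)       # distribution over the vocabulary
        # greedy decoding: take the highest-probability token
        next_token = max(range(len(probs)), key=probs.__getitem__)
        tokens.append(next_token)
    return tokens
```

The point is that every output token is produced by the same one-step prediction, conditioned only on what fits in the window.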



The long term plan is what generates the next word.


The window is a sliding local window, not a global one. There is no long-term plan. It does not include context past the window. The GPT-4 page states the max context is 32768 tokens.

Thus it is local. It does not see anything past this.

https://platform.openai.com/docs/models/overview
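To make the "local, not global" point concrete, here's a trivial sketch of what a fixed window means (the function name is mine; 32768 is the documented GPT-4 maximum):

```python
def visible_context(tokens, window=32768):
    """Everything older than the last `window` tokens is simply not seen."""
    return tokens[-window:]
```

A conversation of 40,000 tokens would have its first ~7,000 tokens silently dropped from the model's view.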


The long term plan stays internal to the network, only the next word is outputted.


If you want to call a local plan long term go ahead. People will continue to correct you.

It has no "plan" other than local context. I posted their own statements on it.

I don't know how much simpler or clearer I can make it.

A human can make a long term plan, and write unlimited words about it, and still leverage any part of unlimited context. This is a long term plan. A human can reference 1 token back, 10000 tokens back, 1 million tokens back, any number of tokens back.

No LLM can do this, since they have limited context. The GPT-4 page I just linked, from OpenAI themselves, gives the context length.

Have you ever coded an LLM? Ever read one of their papers and understood it? Do you understand what the word "context" means?

We're done.


Tokens in GPT are basically short-term memory, just like people have short-term memory. You can't reference a million tokens back either; it's been compressed into your long-term memory. For GPT, the long-term memory is the trained network itself, which contains vastly more knowledge than any of us alone.

I will say that writing a perfectly working application from a prompt requires a plan: even if the output is emitted one word at a time, there is a higher-level plan you're not seeing in the output. This isn't sophisticated Markov chains; it's understanding within the network that generates the output.



