A plane, are you crazy? It's just a metal tube with sheets on both sides, a TUBE! Claiming it can fly like a bird is just anthropomorphizing.



I don't think that's a good analogy. We're talking about innate traits, not coarse functionality.

A plane and a bird can both fly, but a plane has no innate desire to do so, whether to take advantage of good soaring conditions, or to escape ground-based predators, etc.

An LLM and a human can both generate words, but the LLM is just minimizing the statistical errors it made during pre-training. A human's actions, including speech, are adaptive behaviors that keep it alive, driven by innate traits discovered by evolution. There's a massive difference.


  >"innate desire" 
"Innate" implies purpose, which is a human trait.

Humans built the plane to fly.

> There's a massive difference.

There is 0 difference.

We built the machine to work; we built prior machines that did not work (as well), so we built more.

We are the selective pressure that your predisposition-from-nature argument hinges on.

And we won't stop till it stops us.


No - "innate" just means built-in.

An innate trait/behavior for an animal is something defined by its DNA that all individuals will have, as opposed to learned behaviors, which are individual-specific.

An AI could easily be built to have innate curiosity. This boils down to predicting something, getting feedback that the prediction is wrong, and using that prediction failure (aka surprise) as a trigger to focus on whatever is being observed or interacted with, in order to learn more about it.
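
A rough sketch of that mechanism, with hypothetical model/world objects standing in for a real implementation:

    # Sketch of prediction-error ("surprise") driven curiosity.
    # `model` and `world` are hypothetical stand-ins, not a real API.
    def curiosity_step(model, world, surprise_threshold=0.5):
        observation = world.observe()
        prediction = model.predict(observation)
        outcome = world.step()
        surprise = model.error(prediction, outcome)  # prediction failure
        if surprise > surprise_threshold:
            # Surprise triggers focus: extra learning on whatever
            # the model just got wrong.
            model.update(observation, outcome)
        return surprise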


> An innate trait/behavior for an animal is something defined by its DNA that all individuals will have, as opposed to learned behaviors, which are individual-specific.

In that sense, most airplanes have an innate desire to stay in the air once aloft. As opposed to helicopters, which very much want to do the opposite. Contrast with modern fighters, which have an innate desire to rapidly fly themselves apart.

Then, consider the autopilot. It's defined "by their DNA" (it's right there in the plane's spec!), it's the same (more or less) among many individual airplanes of a given model family, and it's not learning anything. A hardcoded instinct to take off and land without destroying itself.

> An AI could easily be built to have innate curiosity. This boils down to predicting something, getting feedback that the prediction is wrong, and using that prediction failure (aka surprise) as a trigger to focus on whatever is being observed or interacted with, in order to learn more about it.

It's trivial to emulate this with an LLM explicitly. But it's also a clear, generic pattern, easily expressed in text, and LLMs excel at picking up such patterns during training.


> It's trivial to emulate this with an LLM explicitly. But it's also a clear, generic pattern, easily expressed in text, and LLMs excel at picking up such patterns during training.

So try adding "you are a curious, question-asking assistant" to the beginning of your prompt, and see if it starts asking you questions before responding, or when it doesn't know something ...

Tell it to stop hallucinating when it doesn't know something too, and just ask a question instead!
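
That experiment is easy to run; a minimal sketch with the OpenAI Python client (the model name is just an example):

    # Sketch of the "curious assistant" prompt experiment.
    # Assumes the openai package (>= 1.0) and an API key in the environment.
    from openai import OpenAI

    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o",  # example model name
        messages=[
            {"role": "system", "content":
                "You are a curious, question-asking assistant. When you "
                "are unsure or missing information, ask a clarifying "
                "question instead of guessing."},
            {"role": "user", "content": "Fix the bug in my code."},
        ],
    )
    print(response.choices[0].message.content)  # ideally, a clarifying question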


> which is a human trait.

Many animals share that trait.


Honestly, I don't really care what current LLMs can do; I'm more interested in the fundamental limitations of AI. I think the "it's just a file" argument is nonsense, and the analogy makes sense in that regard.


I think you're focusing on the wrong part of his/her "it's just a file" argument. The actual point wasn't about the file packaging/form, but about the fact that it's just passive - just a function sitting there waiting to be called, not an active agent able to act on its curiosity.

Curiosity is a trait of an agentic system, where curiosity drives exploration, which leads to learning.
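
Concretely, the passive/agentic distinction is roughly this (llm, environment, and done are hypothetical stand-ins):

    # Passive: the model is a pure function; it only runs when called.
    answer = llm("What is the capital of France?")

    # Agentic: a loop around the same function chooses actions, observes
    # results, and feeds them back in - this is where curiosity (choosing
    # what to explore next) can actually drive behavior.
    context = []
    while not done(context):
        action = llm(context)  # e.g. "ask a question", "read a doc"
        observation = environment.run(action)
        context = context + [action, observation]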


I'm focusing on what they said. They said "an LLM, fundamentally, is a file".

Which is true, but the implication is that LLMs can't be agentic, which may or may not be true.



