Hacker News

> Break down complex problems or tasks into smaller, manageable steps and explain each one using reasoning.

Can't help but notice that a few of these instructions are what we wish these LLMs were capable of, or worryingly, what we assume these LLMs are capable of.

Feeling better about the output from such prompts borders on Gell-Mann Amnesia.

  "Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them. In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate ... than the baloney you just read. You turn the page, and forget what you know." -Michael Crichton
  
  from: https://news.ycombinator.com/item?id=13155538



Including that language might improve performance on certain tasks, even if “reasoning” isn’t something LLMs are capable of. Heck, they’ve even been shown to sometimes perform better when you tell them to “Take a deep breath”: https://arstechnica.com/information-technology/2023/09/telli...
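Phrases like these usually just get prepended to the task before it is sent to the model. A minimal sketch of that pattern (the phrase list and function name here are illustrative, not any real library's API):

```python
# Illustrative only: prefixing a task with a "motivational" phrase,
# as in the "Take a deep breath" experiments mentioned above.
BOOST_PHRASES = {
    "breath": "Take a deep breath and work on this problem step by step.",
    "steps": "Break the problem into smaller steps and explain each one.",
}

def build_prompt(task: str, style: str = "breath") -> str:
    """Prepend a phrase that has been reported to nudge some models
    toward better answers, then the actual task."""
    prefix = BOOST_PHRASES.get(style, "")
    return f"{prefix}\n\n{task}".strip()

print(build_prompt("What is the 10th Fibonacci number?"))
```

Whether any given phrase helps is an empirical question per model and per task, which is exactly the commenter's point: it can work without the model "reasoning" in any human sense.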

As the old saying goes, “If it’s stupid and it works, it’s not stupid”


That saying must predate computer science, though. These systems will keep being changed, deliberately, in ways that steer them away from nonsense happening to work.

It's the same reason people still think you have to treat batteries in certain ways that were only accurate for a particular chemistry 50 years ago: we are actively creating "old wives' tales".



