PromptArray: A Prompting Language for Neural Text Generators (github.com/jeffbinder)
83 points by kelseyfrog on Jan 9, 2023 | hide | past | favorite | 15 comments


Dijkstra said that if we ever had the misfortune of having machines programmable in natural language, sooner or later someone would construct a formal language over it.

Playing out in real time. We're living in legendary times.


> Dijkstra said that if we ever had the misfortune of having machines programmable in natural language

We have humans, and we spend a lot of time constructing formal specialized languages for specifying instructions, expected behavior, etc., for them (ironically, many of them, in the history of computing, specifically targeting the people whose job is to program computers), so that's... a pretty obvious prediction.


From the abundance of people who are still after this "programming in natural language" nonsense, it seems to me like repetita iuvant ("repetition helps").


Love this idea, would like to read more. Got a source?


> It may be illuminating to try to imagine what would have happened if, right from the start our native tongue would have been the only vehicle for the input into and the output from our information processing equipment. My considered guess is that history would, in a sense, have repeated itself, and that computer science would consist mainly of the indeed black art how to bootstrap from there to a sufficiently well-defined formal system.

https://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/E...


thank you!


https://www.cs.utexas.edu/users/EWD/ewd06xx/EWD667.PDF

> From one gut feeling I derive much consolation: I suspect that machines to be programmed in our native tongues—be it Dutch, English, American, French, German, or Swahili—are as damned difficult to make as they would be to use.


This is interesting, and it would be neat to see it used with ChatGPT. On the other hand, this should be benchmarked against multiple ChatGPT interactions, or against people who already know how to write prompts well. Having the computer enumerate possible inputs could be implemented as a layer between the user and the underlying neural text generator.


This is exactly the same sort of "wasted effort" that folks argue against when discussing space exploration, lunar or Martian colonization: It looks like an utter waste of people's time and effort... until they succeed.

I wish folks trying to formalize prompt generation godspeed.


I wouldn't call it wasted effort (nor would I call space exploration and lunar or Martian colonization wasted effort) - it's not the kind of bullshit "prompt engineering" that tries to find magic phrases and sentence structures and gets invalidated every time the model is updated. If I understood the article correctly, this language implements its operators by stepping the language model directly and applying the logical operations to token probability distributions. That actually makes sense as a generic and reliable method for these kinds of models, as it works below the prompt level, directly on the underlying machinery.
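The operators-on-distributions idea can be illustrated with a toy sketch (plain Python; this is not PromptArray's actual code, and the AND-as-product / OR-as-average semantics are my assumptions about how such operators could work): run the model once per sub-prompt, then combine the resulting next-token distributions elementwise and renormalize.

```python
def normalize(dist):
    """Rescale a token -> probability dict so it sums to 1."""
    total = sum(dist.values())
    return {tok: p / total for tok, p in dist.items()}

def combine_and(p, q):
    """AND: a token must be probable under BOTH prompts -> elementwise product."""
    toks = set(p) | set(q)
    return normalize({t: p.get(t, 0.0) * q.get(t, 0.0) for t in toks})

def combine_or(p, q):
    """OR: a token may be probable under EITHER prompt -> elementwise average."""
    toks = set(p) | set(q)
    return normalize({t: 0.5 * (p.get(t, 0.0) + q.get(t, 0.0)) for t in toks})

# Toy next-token distributions, standing in for one model step per sub-prompt.
p = {"dog": 0.6, "cat": 0.3, "fish": 0.1}
q = {"dog": 0.2, "cat": 0.7, "bird": 0.1}

and_dist = combine_and(p, q)  # "fish" and "bird" get zero weight
or_dist = combine_or(p, q)    # all four tokens keep some weight
```

In a real system the distributions would come from the model's softmax output at each decoding step, so the combination happens below the prompt level, exactly as described above.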


We agree! But now I understand the work itself better thanks to your summary, so thank you.


This is really interesting. A friend of mine had a similar idea for an intermediate language between natural language and SQL that he'd let people build apps on top of. This was about 10 years ago, so I thought it was a ridiculous notion, but time proved me wrong.


Next step is training an AI to generate prompts.


I thought this would be a language for sampling from the LM. Why has no one created one, btw?


Looks like spintax



