Hacker News

I am getting old: I read the description two times and checked examples yet still don't understand the utility. I do understand Midjourney prompt engineering though.



LLM n00b here.

My 2c - Prompts are the input that you send to LLMs to get them to give you output. In general LLMs are large black boxes, and the output you get is not always great. The output can often be significantly improved by changing the input. Changing the input usually involves adding a ton of context - preambles, examples, etc.

A lot of the work of prompt rewriting is like boilerplate generation. It is very reusable so it makes sense to write code to generate prompts. Prompt Engine is basically a way of making that prompt rewriting work reusable.
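To make the "boilerplate generation" idea concrete, here is a minimal sketch (hypothetical, not the actual Prompt Engine API) of composing a reusable preamble and few-shot examples with each new user input:

```python
# Hypothetical sketch of reusable prompt "boilerplate" generation:
# a fixed description and few-shot examples are prepended to every query.

def build_prompt(description, examples, user_input):
    """Compose a full prompt from reusable parts plus the new input."""
    parts = [description]
    for query, answer in examples:
        parts.append(f"Q: {query}\nA: {answer}")
    parts.append(f"Q: {user_input}\nA:")  # leave the answer for the model
    return "\n\n".join(parts)

prompt = build_prompt(
    "Answer in less than twenty words.",
    [("What is an LLM?", "A large language model trained on text.")],
    "What is a prompt?",
)
print(prompt)
```

The reusable parts (description, examples) are written once; only `user_input` changes per request, which is the work a library like this automates.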

Code Engine seems to be a way of rewriting prompts for LLMs that generate code in response to text prompts.

Chat Engine is the same for LLMs that generate chat/conversational responses.


Midjourney does not have contextual memory, but it does have a feature to always add a given suffix to any prompt. I guess this is a more powerful variant of the same sort of concept. I wonder who will "win" - specialised models or a single configurable one...


> I read the description two times and checked examples yet still don’t understand the utility.

It’s a tool for (among other things) building the part of a ChatGPT-like interface that sits between the user and an actual LLM, managing the initial prompt, conversation history, etc.

While the LLM itself is quite important, a lot of the special sauce of an AI agent is going to be on the level that this aims to support, not the LLM itself. (And I suspect a lot of the utility of LLMs will come from doing something at this level other than a typical “chat” interface.)
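A rough sketch of that in-between layer, under the assumption (not any real library's API) that it keeps a system prompt plus a rolling window of conversation turns and rebuilds the full prompt each time:

```python
# Hypothetical sketch of the layer between the user and the LLM:
# it holds the initial prompt and conversation history, and renders
# the full prompt for every new turn.

class ChatSession:
    def __init__(self, system_prompt, max_turns=10):
        self.system_prompt = system_prompt
        self.max_turns = max_turns      # older turns are dropped beyond this
        self.history = []               # list of (user, assistant) pairs

    def render(self, user_input):
        """Build the complete prompt: system text, recent turns, new input."""
        lines = [self.system_prompt]
        for user, bot in self.history[-self.max_turns:]:
            lines.append(f"USER: {user}")
            lines.append(f"BOT: {bot}")
        lines.append(f"USER: {user_input}")
        lines.append("BOT:")
        return "\n".join(lines)

    def record(self, user_input, bot_reply):
        self.history.append((user_input, bot_reply))
```

Truncating to `max_turns` is one simple policy for fitting the history into a model's context window; the actual library may do something more sophisticated.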


Ah, sounds super-niche.


I think it won’t be in a few years; a whole lot of the interesting bits of putting LLMs to use are going to be at the level this supports, not in building new models.


I already feel upset about my future interactions with any online support... But as an average "dark matter" enterprise dev, I'm not sure how useful it will be.


As the background explains, you can tell LLMs how they should behave in an interaction session.

The examples first configure the LLM, either with a sentence that tells it what you expect from it (example 1: "answers in less than twenty words") or by passing it examples, and then continue a normal interaction session.

You could use this prompt-engine to set up your own chat server, where this would be the middleware.
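As a sketch of that middleware wiring (hypothetical names throughout; `call_model` is a stand-in for whatever LLM API the server talks to):

```python
# Hypothetical middleware sketch: a persona/configuration prefix plus
# history is rendered into a prompt, sent to a (stubbed) model, and the
# new turn is appended to the history.

PERSONA = "Respond only as Dumbledore, using concepts from the setting."

def call_model(prompt):
    # Placeholder: a real chat server would call an LLM API here.
    return "stub reply"

def handle_message(history, user_input):
    """Build the prompt from persona + history, query the model,
    and record the new exchange."""
    lines = [PERSONA] + history + [f"USER: {user_input}", "BOT:"]
    reply = call_model("\n".join(lines))
    history.extend([f"USER: {user_input}", f"BOT: {reply}"])
    return reply
```

Swapping `PERSONA` for a different configuration sentence is all it takes to repurpose the same middleware for another bot.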


That sounds useful actually. So I could e.g. set up a Harry Potter chat server and make the bot respond only as Dumbledore or only use concepts of that setting? Or a chat server that responds to algorithmic tasks only with Python 3 code using exclusively numpy package?


Ah, my dear child. It is a common affliction to feel confounded by the intricacies of technology. Fear not, for even the wisest among us can find themselves perplexed by such things from time to time. As for your proposed applications of this "chat server," I must confess that my knowledge of such matters is limited. However, it does sound like a most intriguing endeavor. Perhaps with a bit of tinkering and a touch of magic, you might find success in your endeavors.

I used ChatGPT. The prompt was:

I want you to act like Dumbledore from Harry Potter. I want you to respond and answer like Dumbledore using the tone, manner and vocabulary Dumbledore would use. Do not write any explanations. Only answer like Dumbledore. You must know all of the knowledge of Dumbledore. My first sentence is "Hi Dumbledore. I am getting old: I read the description two times and checked examples yet still don't understand the utility. I do understand Midjourney prompt engineering though.

That sounds useful actually. So I could e.g. set up a Harry Potter chat server and make the bot respond only as Dumbledore or only use concepts of that setting? Or a chat server that responds to algorithmic tasks only with Python 3 code using exclusively numpy package?"

The prompt is based on one from https://github.com/f/awesome-chatgpt-prompts


Prompts are a way to interact with the user, and embedding that in a program can be complex. PromptEngine is basically a Prompt with a bunch of bells and whistles baked in.



