
FWIW, llama.cpp has always had a JSON schema -> GBNF converter, though it started out as a companion script. Now I think it's more integrated into the CLI and server.
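
To make that concrete, here's a rough sketch of the kind of mapping the converter does. This is hand-written for illustration, so the rule names and whitespace handling won't match what llama.cpp actually emits:

  # Hand-written illustration (not the converter's actual output): the JSON
  # schema constrains an object with a "name" string and an "age" integer,
  # and the GBNF grammar below accepts the same set of JSON strings
  # (string escapes omitted to keep it short).
  schema = {
      "type": "object",
      "properties": {
          "name": {"type": "string"},
          "age": {"type": "integer"},
      },
      "required": ["name", "age"],
  }

  gbnf = r'''
  root    ::= "{" ws "\"name\"" ws ":" ws string ws "," ws "\"age\"" ws ":" ws integer ws "}"
  string  ::= "\"" [^"\\]* "\""
  integer ::= "-"? [0-9]+
  ws      ::= [ \t\n]*
  '''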

But yeah, GBNF or other structured-output solutions would of course let you supply formats other than JSON Schema. It's conceivable that OpenAI could expose the grammars directly in the future, though.




I think for certain tasks it's still easier to write the grammar directly. Does converting from a JSON schema to a CFG limit the capabilities of the grammar? I.e., are there things a JSON schema can't represent that a context-free grammar can?


You might be right that they're similarly powerful. In some cases, an arbitrary output format might be desirable in and of itself: it might result in token savings or be more natural for the LLM. For instance, generating code snippets against an API, or plain text with constraints.
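
As a sketch of what I mean (a made-up format, not tied to any real API), a grammar like this pins the model to a tiny command language with none of the JSON quoting overhead:

  # Hypothetical GBNF grammar for a non-JSON output format: one command per
  # line, e.g. `move 3 7` or `say "hello"`. Fewer tokens than wrapping the
  # same content in a JSON object.
  commands_gbnf = r'''
  root    ::= command ("\n" command)*
  command ::= move | say
  move    ::= "move " int " " int
  say     ::= "say \"" [^"]* "\""
  int     ::= [0-9]+
  '''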

And this is more esoteric, but in the case of JSON I suppose you could technically embed a grammar inside a JSON string, i.e. constrain the contents of a string field with its own sub-grammar, which I'm not sure JSON Schema can express.
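
Something like this, say (again a hypothetical sketch): the "expr" field is an ordinary JSON string, but its contents are forced to be a well-parenthesized arithmetic expression. That's a recursive constraint, which a JSON Schema "pattern" (a regular expression) can't capture but a CFG handles naturally:

  # Hypothetical sketch: a JSON object whose "expr" string must contain a
  # balanced arithmetic expression. The sub-grammar is recursive, which a
  # JSON Schema "pattern" (a regular expression) can't express.
  embedded_gbnf = r'''
  root ::= "{\"expr\": \"" expr "\"}"
  expr ::= term (("+" | "-") term)*
  term ::= [0-9]+ | "(" expr ")"
  '''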



