
Large language models are all the rage, it seems.

Can someone name a successful (not necessarily profitable) concrete application of these things, other than "GPT-3 wrote an article in the Guardian and said it wouldn't kill us"?




I used largish (GPT-2 and similar) models to build an app that discovers Category Entry Points (a marketing concept: the things people have in mind when they decide they need to buy a particular kind of product) for specific product categories.

It was very successful.
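Roughly, the idea can be sketched like this (a minimal illustration with an off-the-shelf Hugging Face GPT-2 checkpoint; the prompt and product category are invented here, not taken from the actual app):

    from transformers import pipeline

    # Ask a GPT-2-style model to complete sentences about when people
    # reach for a product, then mine the completions for entry points.
    generator = pipeline("text-generation", model="gpt2")
    prompt = "People decide they need new running shoes when"
    for out in generator(prompt, max_length=40, num_return_sequences=5, do_sample=True):
        print(out["generated_text"])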


We use specifically prompted GPT-3 to generate synthetic training examples (e.g. paraphrases, summaries, etc.). We fine-tune other (much smaller than GPT-3, but still large-ish) language models for controllable language generation, often augmented with that synthetic data from GPT-3. As a comparison point, we did try GPT-Neo and it did not produce sufficiently high-quality synthetic data.
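The synthetic-data step follows roughly this pattern (a minimal sketch against the legacy, pre-1.0 OpenAI Python client from that era; the prompt wording, engine name, and parsing are illustrative, not our actual pipeline):

    import os
    import openai  # legacy (pre-1.0) OpenAI Python client

    openai.api_key = os.environ["OPENAI_API_KEY"]

    def paraphrase(sentence, n=5):
        # Ask GPT-3 for n paraphrases to use as synthetic training examples.
        prompt = (f"Paraphrase the following sentence in {n} different ways, "
                  f"one per numbered line:\n\n{sentence}\n\n1.")
        resp = openai.Completion.create(
            engine="davinci",
            prompt=prompt,
            max_tokens=256,
            temperature=0.8,
        )
        lines = ("1." + resp.choices[0].text).splitlines()
        # Keep numbered lines and strip the "N." prefix.
        return [l.split(".", 1)[1].strip() for l in lines
                if l.strip() and l.strip()[0].isdigit() and "." in l]

    print(paraphrase("The battery easily lasts a full day of heavy use."))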

Transformers in general have lots of applications (machine translation, information retrieval/reranking, NER, etc.), a couple of which are shown below.
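For instance, NER and machine translation are one-liners with off-the-shelf checkpoints (the models these pipelines download are just the library defaults, not anything specific from this thread):

    from transformers import pipeline

    # Named-entity recognition with the default checkpoint for the task.
    ner = pipeline("ner", aggregation_strategy="simple")
    print(ner("Hugging Face is based in New York City."))

    # English-to-German machine translation with the default T5 checkpoint.
    translator = pipeline("translation_en_to_de")
    print(translator("Large language models are all the rage."))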


During the narrow access window I was given to Philosopher AI, I found it incredibly helpful for brainstorming and bouncing ideas around. It helped me organize my project, and I even included the conversations in the repo.


AI Dungeon?


GitHub Copilot?



