
We use a specifically prompted GPT-3 to generate synthetic training examples (e.g. paraphrases, summaries, etc.). We fine-tune other language models (much smaller than GPT-3, but still large-ish) for controllable language generation, often augmenting them with synthetic data from GPT-3. As a comparison point, we tried GPT-Neo, and it did not produce sufficiently high-quality synthetic data.
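A minimal sketch of what that synthetic-data step might look like, assuming a paraphrasing task. The prompt template, helper names, and numbered-list output format are my illustrative assumptions, not the actual setup; the real pipeline would send the prompt to GPT-3 and feed the parsed outputs into the smaller model's fine-tuning set.

```python
# Sketch of prompting a large LM for synthetic paraphrases.
# Prompt wording and parsing format are illustrative assumptions.

def build_paraphrase_prompt(sentence: str, n: int = 3) -> str:
    """Build an instruction prompt asking for n paraphrases as a numbered list."""
    return (
        f"Rewrite the sentence below in {n} different ways, "
        "keeping the meaning identical.\n\n"
        f"Sentence: {sentence}\n"
        "Paraphrases:\n1."
    )

def parse_paraphrases(completion: str) -> list[str]:
    """Split a numbered-list completion back into individual paraphrases."""
    # The prompt ends with "1.", so prepend it to re-align the first item.
    out = []
    for line in ("1." + completion).splitlines():
        line = line.strip()
        if line and line[0].isdigit():
            out.append(line.split(".", 1)[1].strip())
    return out
```

In practice the prompt from `build_paraphrase_prompt` would go to the completions API, and the parsed paraphrases (after quality filtering) would become input/target pairs for fine-tuning the smaller generation model.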

Transformers in general have lots of applications: machine translation, information retrieval/reranking, NER, and so on.



