
> AFAIK, distillation typically refers to tuning on the logits of the larger model

I think this is called “logit distillation”, which is one particular form of distillation but not the only one.
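For concreteness, here's a minimal sketch of a logit-distillation loss in PyTorch (names and the temperature value are illustrative, not from the post): the student is trained to match the teacher's softened output distribution instead of hard labels.

    import torch.nn.functional as F

    def logit_distillation_loss(student_logits, teacher_logits, T=2.0):
        # Soften both distributions with temperature T, then match them via KL.
        student_log_probs = F.log_softmax(student_logits / T, dim=-1)
        teacher_probs = F.softmax(teacher_logits / T, dim=-1)
        # The T^2 factor keeps gradient magnitudes comparable across temperatures.
        return F.kl_div(student_log_probs, teacher_probs,
                        reduction="batchmean") * T * T

This needs the teacher's raw logits, which is exactly what a hosted fine-tuning API won't give you.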

> so you wouldn't be able to do that with fine-tuning APIs (OpenAI + Google in our blog post)

Distillation from competitors' APIs is so common that it has been given a name: “distealing”.
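That variant only needs the teacher's sampled text, not its logits. A sketch, assuming the official openai Python client (the model name and helper function are illustrative): collect (prompt, completion) pairs from the teacher's API, then use them as supervised fine-tuning data for the student.

    import json
    from openai import OpenAI  # assumes the official openai package, v1+

    client = OpenAI()

    def build_distillation_set(prompts, teacher="gpt-4o", out_path="distill.jsonl"):
        # Hard-label distillation: record the teacher's sampled completions.
        with open(out_path, "w") as f:
            for prompt in prompts:
                resp = client.chat.completions.create(
                    model=teacher,
                    messages=[{"role": "user", "content": prompt}],
                )
                pair = {"prompt": prompt,
                        "completion": resp.choices[0].message.content}
                f.write(json.dumps(pair) + "\n")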
