
> "Mixture Of Experts": An "MoE" is an approach in AI that combines multiple specialized AI models, known as "experts," (...)

Wonder when that stopped being called just an "ensemble model", which is a term I recall from 10 years ago. Terminology churn?




Mixture of experts is different from an ensemble because the expert routing happens at every layer inside a single model, as opposed to running several complete models and joining their outputs once at the end.
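
Roughly, the difference looks like this (a minimal PyTorch sketch; the class and parameter names are made up for illustration, and real MoE layers add load balancing, capacity limits, etc.):

  import torch
  import torch.nn as nn
  import torch.nn.functional as F

  class SimpleMoELayer(nn.Module):
      """One feed-forward block where a learned router picks top-k
      experts per token, inside the forward pass of every layer."""
      def __init__(self, d_model=64, d_ff=256, n_experts=4, top_k=2):
          super().__init__()
          self.experts = nn.ModuleList([
              nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                            nn.Linear(d_ff, d_model))
              for _ in range(n_experts)
          ])
          self.router = nn.Linear(d_model, n_experts)  # gating network
          self.top_k = top_k

      def forward(self, x):                       # x: (batch, seq, d_model)
          gate_logits = self.router(x)            # (batch, seq, n_experts)
          weights, idx = gate_logits.topk(self.top_k, dim=-1)
          weights = F.softmax(weights, dim=-1)    # normalize over chosen experts
          out = torch.zeros_like(x)
          for slot in range(self.top_k):
              for e, expert in enumerate(self.experts):
                  mask = (idx[..., slot] == e)    # tokens routed to expert e
                  if mask.any():
                      out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
          return out

  # An ensemble, by contrast, runs whole independent models and
  # combines their outputs once, at the very end:
  def ensemble_predict(models, x):
      return torch.stack([m(x) for m in models]).mean(dim=0)

So in an MoE the "experts" are sub-networks selected per layer (and usually per token) by a trained router, not separately trained full models whose predictions get averaged.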


Thanks, that makes sense - and isn't obvious from the explanations I see people give.



