
remember when "ensembles" was the cool word, before it got erased from the collective consciousness and replaced with "deep" everything? it can't even have been a decade; was it 2012 or something?



They haven't been erased, more like subsumed? If you train your model with dropout, that is roughly equivalent to using an ensemble of deep neural networks.
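For what it's worth, that ensemble reading is usually cashed out as "MC dropout": keep dropout active at test time and average several stochastic forward passes, each of which samples a thinned sub-network. A minimal sketch, assuming PyTorch (the architecture and names here are made up for illustration):

    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(64, 128),
        nn.ReLU(),
        nn.Dropout(p=0.5),
        nn.Linear(128, 10),
    )

    def mc_dropout_predict(model, x, n_samples=20):
        model.train()  # keep dropout stochastic at inference time
        with torch.no_grad():
            probs = torch.stack(
                [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
            )
        return probs.mean(dim=0)  # average over sampled sub-networks

    avg_probs = mc_dropout_predict(model, torch.randn(8, 64))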


That is not even close to the same thing.

If you train an ensemble of models, each with random dropout, you have an actual ensemble on top of dropout: models trained with dropout still vary significantly from run to run.


> That is not even close to the same thing.

It's a common interpretation: https://arxiv.org/abs/1706.06859


There may be a paper on it, but it’s not a common view.

In particular, this paper neglected to do the obvious thing: ensemble networks that were themselves trained with dropout, which improves performance over dropout alone. Something like the sketch below.
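A hedged sketch of that explicit ensemble (PyTorch; the architecture, member count, and names are made up): each member gets its own random init and would be trained independently with dropout, and at test time dropout is switched off and the members' predicted probabilities are averaged.

    import torch
    import torch.nn as nn

    def make_member():
        # same architecture, fresh random init per ensemble member;
        # each member would be trained independently, with dropout active
        return nn.Sequential(
            nn.Linear(64, 128),
            nn.ReLU(),
            nn.Dropout(p=0.5),
            nn.Linear(128, 10),
        )

    members = [make_member() for _ in range(5)]  # training loop omitted

    def ensemble_predict(members, x):
        for m in members:
            m.eval()  # disable dropout: each member predicts deterministically
        with torch.no_grad():
            probs = torch.stack([torch.softmax(m(x), dim=-1) for m in members])
        return probs.mean(dim=0)  # simple unweighted average

    avg_probs = ensemble_predict(members, torch.randn(8, 64))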


Why shouldn't you employ an ensemble of deep neural networks?


Correlated errors. Naive averaging leads to overconfidence, and modeling the correlation is not trivial. Boosting is worth a shot, though.
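To put a toy number on that: if each member's error has variance s2 and pairwise correlation rho, the averaged error has variance s2*(1/n + (1 - 1/n)*rho), which floors at s2*rho instead of shrinking to zero. A quick check (NumPy, numbers made up):

    import numpy as np

    rng = np.random.default_rng(0)
    s2, rho, n = 1.0, 0.5, 25

    # analytic variance of the mean of n equicorrelated errors
    analytic = s2 * (1.0 / n + (1.0 - 1.0 / n) * rho)

    # Monte Carlo check: build correlation via a shared error component
    shared = rng.normal(size=100_000) * np.sqrt(rho * s2)
    private = rng.normal(size=(100_000, n)) * np.sqrt((1.0 - rho) * s2)
    empirical = (shared[:, None] + private).mean(axis=1).var()

    print(analytic, empirical)  # both near s2 * rho = 0.5, not near 0

So if you report confidence as if the variance were s2/n, the ensemble looks far more certain than it is.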


My point was that ensembles of deep neural networks are commonly used and yield higher accuracies.


More importantly, what happens if you put a radiologist's opinion (or several) into such an ensemble?



