
Conditional expectations are central to regression and to other scenarios where you might want to adjust a parameter estimate ("for every unit increase in x we get this much of a difference in y") for confounders. In machine learning, by contrast, parameter estimates are generally not the (exclusive) basis for prediction: you put data in, a prediction comes out, and what happens in between is something of a black box.
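A minimal sketch of the adjustment idea, with made-up simulated data: a confounder z drives both x and y, so the slope from regressing y on x alone is biased, while including z as a covariate recovers the true per-unit effect.

```python
import numpy as np

# Simulated data (illustrative only): z confounds the x -> y relationship.
rng = np.random.default_rng(0)
n = 10_000
z = rng.normal(size=n)                    # confounder
x = z + rng.normal(size=n)                # x depends on z
y = 2.0 * x + 3.0 * z + rng.normal(size=n)  # true effect of x is 2.0

# Naive regression of y on x alone (intercept + x).
X_naive = np.column_stack([np.ones(n), x])
b_naive = np.linalg.lstsq(X_naive, y, rcond=None)[0]

# Adjusted regression: include the confounder z as a covariate.
X_adj = np.column_stack([np.ones(n), x, z])
b_adj = np.linalg.lstsq(X_adj, y, rcond=None)[0]

print(b_naive[1])  # biased upward, well above 2.0
print(b_adj[1])    # close to the true effect 2.0
```

The adjusted coefficient is the conditional-expectation reading: the expected change in y per unit of x, holding z fixed.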


Oh cool, thanks for the explanation! And I take it what's in the black box is the key to figuring out how to make AI truly alive?


Well, technically we do know what's in the black box, of course; it's just that for many methods it's hard to summarize because so much is happening under the hood. Leo Breiman (who invented random forests) gives some examples of how to peek inside, though: https://projecteuclid.org/euclid.ss/1009213726
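One such example is permutation importance, the technique Breiman used with random forests: shuffle one feature, measure how much the model's error grows, and treat that increase as the feature's importance. A hedged sketch below, using a plain least-squares fit as the "model" and made-up data so it stays self-contained:

```python
import numpy as np

# Illustrative data: y depends strongly on x1, weakly on x2, not at all on noise.
rng = np.random.default_rng(1)
n = 5_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
noise_col = rng.normal(size=n)             # irrelevant feature
y = 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

X = np.column_stack([x1, x2, noise_col])
beta = np.linalg.lstsq(X, y, rcond=None)[0]  # stand-in for any fitted model

def mse(M):
    return np.mean((y - M @ beta) ** 2)

base = mse(X)
importances = {}
for j, name in enumerate(["x1", "x2", "noise"]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # break the feature-target link
    importances[name] = mse(Xp) - base     # importance = increase in error
    print(name, importances[name])
```

The same loop works unchanged for any black-box predictor: only the `mse` evaluation needs to call the model instead of the linear fit.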



