
Minsky and Papert showed that single-layer perceptrons scale exponentially badly on certain problems: for predicates such as parity, reaching a given accuracy requires resources that grow exponentially with input size.

Multi-layer networks substantially change that scaling.
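A minimal sketch of the idea, using XOR as the classic non-linearly-separable case (the weight values for the two-layer network below are hand-chosen for illustration, not from the original text): a single-layer perceptron can never fit XOR exactly, while one hidden layer suffices.

```python
import numpy as np

# XOR truth table: not linearly separable, so no single-layer
# perceptron can compute it exactly.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

def step(z):
    return (z > 0).astype(int)

# Single-layer perceptron trained with the classic perceptron rule.
# On non-separable data the weights oscillate and never reach 100%.
w, b = np.zeros(2), 0.0
for _ in range(1000):
    for xi, yi in zip(X, y):
        err = yi - step(xi @ w + b)
        w += err * xi
        b += err
single_layer_acc = (step(X @ w + b) == y).mean()  # at most 0.75

# Two-layer network with hand-chosen weights: one hidden unit fires
# when at least one input is on, the other when both are on, and the
# output computes OR-minus-AND, i.e. XOR.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
W2 = np.array([1.0, -1.0])
b2 = -0.5
hidden = step(X @ W1 + b1)
two_layer_out = step(hidden @ W2 + b2)

print(single_layer_acc)        # strictly below 1.0
print(two_layer_out.tolist())  # [0, 1, 1, 0]
```

The hidden layer here effectively re-represents the inputs so the final unit's problem becomes linearly separable, which is the qualitative change the comment is pointing at.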


