Hacker News

In ML: a model is nonparametric when it is not defined by a fixed set of parameters, but instead the number of "parameters" grows with the training data. For example, k-nearest-neighbor classification requires storing the entire training set in order to make predictions. Gaussian process regression and Dirichlet process based clustering (mixture fitting) are other examples. Linear regression, on the other hand, is parametric: the model is defined by a fixed set of coefficients whose count does not depend on the number of training examples/observations.
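A minimal sketch of the contrast described above, using toy 1-D data (all names and values here are hypothetical): 1-nearest-neighbor must keep every training point around, while linear regression compresses the data into a fixed-size coefficient vector.

```python
import numpy as np

# Hypothetical toy training set: for 1-nearest-neighbor, this whole
# array *is* the model, so the "model size" grows with the data.
X_train = np.array([[0.0], [1.0], [2.0], [3.0]])
y_train = np.array([0, 0, 1, 1])

def knn_predict(x, X, y):
    """1-NN: store all training points, predict by nearest lookup."""
    idx = np.argmin(np.abs(X[:, 0] - x))
    return y[idx]

# Linear regression, by contrast, summarizes the data as a fixed
# number of coefficients (slope + intercept), independent of n.
A = np.hstack([X_train, np.ones_like(X_train)])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

print(knn_predict(2.4, X_train, y_train))  # nearest point is 2.0 -> class 1
print(coef.shape)  # (2,) -- fixed size no matter how many rows X_train has
```

Adding more training rows enlarges the k-NN "model" but leaves `coef.shape` unchanged, which is exactly the parametric/nonparametric distinction.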



This is how I understand it too. But then I've heard statisticians describe neural networks as "nonparametric", even though they typically have a fixed number of parameters. (Millions of parameters! Arguably they are the MOST parametric.)


Neural networks in general are indeed nonparametric in that view, because the number of weights is not fixed in advance but chosen based on the data. If the number of parameters is considered fixed, as for logistic regression, then the model is considered parametric.


The number of parameters in a neural net as used today, specifically in computer vision, is basically never learned from the training data. I actually cannot recall any practically used method that does that.
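To illustrate the point above: once an architecture is fixed, its parameter count follows from the layer shapes alone, not from the dataset size. A small sketch with a hypothetical MLP (layer widths chosen arbitrarily for illustration):

```python
# Hypothetical fixed MLP architecture: input -> hidden -> output widths.
# The parameter count is a function of these widths only; feeding the
# network 1e3 or 1e6 training images does not change it.
layers = [784, 128, 10]

n_params = sum(
    layers[i] * layers[i + 1] + layers[i + 1]  # weight matrix + bias vector
    for i in range(len(layers) - 1)
)
print(n_params)  # 101770, regardless of how much training data exists
```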



