
Kevin Murphy's "Machine Learning: A Probabilistic Perspective" is a great (literally: 1000+ pages) textbook that is basically self-contained (prerequisites: some comfort with multivariable calculus, linear algebra, and basic computer science theory; convex optimization experience is a huge plus).



Yikes, quite the tome. It looks great though; I've been looking for something relatively self-contained. Does it have exercises for each chapter, and if so, are solutions also available?


Every chapter has exercises. One example from Ch. 14 - Kernels is this:

> Exercise 14.2 Linear separability

> (Source: Koller..) Consider fitting an SVM with C > 0 to a dataset that is linearly separable. Is the resulting decision boundary guaranteed to separate the classes?

etc. Many exercises are proofs or derivations, and the book is full of pragmatic material: algorithm definitions, optimization details, approximation bounds, and the like.
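
A minimal sketch for exploring that exercise empirically (not from the book; the toy dataset and the C value are my own illustrative assumptions): fit a linear soft-margin SVM with a small C to a linearly separable dataset and check whether the learned boundary still separates the training points.

  import numpy as np
  from sklearn.svm import SVC

  # Linearly separable 1-D data: one positive point far from the class boundary.
  X = np.array([[0.0], [1.0], [2.0], [3.0], [10.0]])
  y = np.array([-1, -1, -1, 1, 1])

  # Small C puts only a weak penalty on slack relative to the margin-width term.
  clf = SVC(kernel="linear", C=0.01)
  clf.fit(X, y)

  pred = clf.predict(X)
  print("training accuracy:", (pred == y).mean())
  # If the margin-width term dominates the slack penalty, the fitted boundary
  # need not separate the classes even though a separating boundary exists.

Running something like this with a small enough C typically shows training accuracy below 1.0, which is the intuition the exercise is probing.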



