From when I was taking machine learning courses and reading machine learning textbooks a few years ago, I have fond recollections of the derivations in Tom Mitchell's textbook.
Where other textbooks tended to jump two or three steps ahead with a comment about the steps being "obvious" or "trivial", Mitchell would just include each little step.
Yes, you could argue it was my responsibility to remember all of my calculus and linear algebra. But it is kind to the reader to spell out the little steps: for those of us who have forgotten some of our calculus tricks, for those who don't even have all of the expected prerequisites but are trying to press on anyway, and even for those who know perfectly well how to perform the steps but would otherwise have to stop and puzzle through which particular combination of steps is being called "obvious" in this particular instance.
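To give a flavor of what I mean, here is a sketch (from memory, so treat it as my reconstruction rather than a quote from the book) of the kind of derivation Mitchell spells out: the gradient of the squared training error E(\vec{w}) = \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2 for a linear unit with output o_d = \vec{w} \cdot \vec{x}_d, where each application of the sum rule and chain rule gets its own line:

\begin{align*}
\frac{\partial E}{\partial w_i}
  &= \frac{\partial}{\partial w_i} \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2 \\
  &= \frac{1}{2} \sum_{d \in D} \frac{\partial}{\partial w_i} (t_d - o_d)^2 \\
  &= \frac{1}{2} \sum_{d \in D} 2 (t_d - o_d) \, \frac{\partial}{\partial w_i} (t_d - o_d) \\
  &= \sum_{d \in D} (t_d - o_d) \, \frac{\partial}{\partial w_i} \left( t_d - \vec{w} \cdot \vec{x}_d \right) \\
  &= \sum_{d \in D} (t_d - o_d) (-x_{id})
\end{align*}

Nothing in there is hard, but having every line on the page means you never have to stop reading and rederive it yourself.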
I just remember how nice it was to have those extra steps spelled out, and how much more pleasant it made reading Tom's book.
http://www.cs.cmu.edu/~tom/mlbook.html
So thanks, Dr. Mitchell!