
Thanks so much for this. I have no interest in deep learning (at the moment), but I was working through some papers about the Lucas-Kanade tracker, and this paper explains some of the underlying math in just the right amount of detail. The authors usually show only the starting point and the end point and say something like "using the chain rule, we arrive at ...". It took me a while to understand what they were doing, and this paper helps a lot.

The math is super easy, but keeping all the notation and conventions in my head is hard. I've never seen it laid out this nicely before. Thanks!




Hiya. That's funny, because that's exactly what prompted us to write this article. Jeremy and I were working on an automatic differentiation tool and couldn't find any description of the relevant matrix calculus that explained the intermediate steps. Everything just presents the solution without showing the steps in between. We decided to write it all down so we never have to figure out the notation again. haha
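For readers curious what the chain-rule bookkeeping inside such a tool looks like, here is a minimal sketch of forward-mode automatic differentiation using dual numbers. This is purely illustrative, not the authors' actual implementation; the class and function names are made up.

```python
# Minimal forward-mode autodiff via dual numbers (illustrative sketch).
# Each Dual carries a value and its derivative; arithmetic applies the
# chain/product rules automatically, step by step.
class Dual:
    def __init__(self, val, deriv=0.0):
        self.val = val      # function value
        self.deriv = deriv  # derivative propagated alongside the value

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._coerce(other)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        # product rule: (u * v)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.deriv * other.val + self.val * other.deriv)

    __rmul__ = __mul__


def derivative(f, x):
    # Seed the derivative with 1.0 so df/dx flows through the chain rule.
    return f(Dual(x, 1.0)).deriv


# d/dx (x^2 + 3x) at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # → 7.0
```

The point is that each primitive operation only knows its local derivative rule; the chain rule composes them without anyone ever writing out the full symbolic derivation.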





