Hacker News

> I started tinygrad in Oct 2020. It started as a toy project to teach me about neural networks

Shows you what is possible in 2.5 years. Keeps me motivated to learn.




The math isn't super difficult. Some books will try to throw a mess of differential equations at you, but some simple calculus is all you need for backpropagation.
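To make that concrete, here's a minimal sketch (my own illustration, not anyone's library code): backprop through a one-neuron network y = sigmoid(w*x + b) with squared-error loss, using nothing but the chain rule, checked against a finite-difference gradient.

```python
import math

def forward(w, b, x, target):
    z = w * x + b
    y = 1.0 / (1.0 + math.exp(-z))   # sigmoid
    loss = (y - target) ** 2
    return z, y, loss

def backward(w, b, x, target):
    _, y, _ = forward(w, b, x, target)
    dloss_dy = 2.0 * (y - target)    # d/dy of (y - t)^2
    dy_dz = y * (1.0 - y)            # sigmoid'(z) = y * (1 - y)
    dz_dw, dz_db = x, 1.0
    # Chain rule: dL/dw = dL/dy * dy/dz * dz/dw (and likewise for b)
    return dloss_dy * dy_dz * dz_dw, dloss_dy * dy_dz * dz_db

# Sanity-check the analytic gradient against finite differences.
w, b, x, t = 0.5, -0.2, 1.5, 1.0
dw, db = backward(w, b, x, t)
eps = 1e-6
num_dw = (forward(w + eps, b, x, t)[2] - forward(w - eps, b, x, t)[2]) / (2 * eps)
assert abs(dw - num_dw) < 1e-6
```

That's the whole trick: every layer only needs its local derivative, and the chain rule stitches them together.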


I have been through the math thanks to the YouTube videos by A. Karpathy. Deriving some of the derivatives, e.g. for batchnorm, seems fairly hard (hard as in slogging through many steps where you can't afford a mistake at any one of them). But the principles are quite simple - I think by design. If they were hard to compute or reason about, the neural net wouldn't work very well!
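As a sketch of what that slog buys you: the many chain-rule steps for batchnorm (through the mean and the variance) collapse into one closed-form backward pass. This is my own simplified version (no scale/shift parameters, single feature), verified against a finite-difference gradient - not any particular library's implementation.

```python
import numpy as np

def bn_forward(x, eps=1e-5):
    # Normalize a batch to zero mean, unit variance.
    mu, var = x.mean(), x.var()
    return (x - mu) / np.sqrt(var + eps)

def bn_backward(x, dout, eps=1e-5):
    # Closed form you get after differentiating through mu and var:
    # dx = (n*dout - sum(dout) - xhat * sum(dout*xhat)) / (n * std)
    n = x.size
    var = x.var()
    xhat = bn_forward(x, eps)
    return (dout * n - dout.sum() - xhat * (dout * xhat).sum()) / (n * np.sqrt(var + eps))

rng = np.random.default_rng(0)
x = rng.normal(size=8)
dout = rng.normal(size=8)
dx = bn_backward(x, dout)

# Finite-difference check of dL/dx, where L = sum(bn_forward(x) * dout).
num = np.zeros_like(x)
h = 1e-6
for i in range(x.size):
    xp, xm = x.copy(), x.copy()
    xp[i] += h
    xm[i] -= h
    num[i] = ((bn_forward(xp) * dout).sum() - (bn_forward(xm) * dout).sum()) / (2 * h)
assert np.allclose(dx, num, atol=1e-5)
```

Each individual step is elementary; it's the bookkeeping across all of them that makes it feel hard.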


Doing the compute efficiently, especially from Python, is the tricky part.
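A rough illustration of why (using numpy, not tinygrad): a per-element Python loop pays an interpreter round-trip for every element, while one vectorized call runs the same loop in compiled code.

```python
import time

import numpy as np

x = np.random.default_rng(0).normal(size=200_000).astype(np.float32)

# ReLU the slow way: one interpreter iteration per element.
t0 = time.perf_counter()
slow = np.empty_like(x)
for i in range(x.size):
    slow[i] = max(x[i], 0.0)
t_loop = time.perf_counter() - t0

# ReLU the fast way: one call, the loop happens in C.
t0 = time.perf_counter()
fast = np.maximum(x, 0.0)
t_vec = time.perf_counter() - t0

assert np.allclose(slow, fast)
# On typical hardware the vectorized version wins by orders of magnitude.
```

Frameworks like tinygrad exist largely to keep you on the fast path while still letting you write Python.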



