
I remember reading something very similar to this (but I can't remember where, ha), where it said the important thing about reading is how it affects your general thinking, rather than the individual pieces of information that you're likely to remember (or not).

I spent several years reading a ton of different books on economics, and I can recall very few facts from those books, but they completely altered my world view on many things.

pg's analogy of a program where you've lost the source code doesn't feel quite right, because you can't modify a program without its source. Some sort of machine learning model seems more appropriate: you've lost the original training data but can still update the model later with fresh data (a new book), ending up with a better/different model, and then lose that training data again.



I think a machine learning model provides a nice version of Graham's "The same book would get compiled differently at different points in your life."

Using an artificial neural net analogy instead of a compilation analogy: "The same book would optimize your neural net towards a different local minimum at different points in your life."



