Honestly, I think Strang is overrated. Yeah, I know, on HN that's like criticizing Lisp or advocating homebrew cryptography or disagreeing that trains fix everything. But still.

I bought his 6th ed. Introduction to Linear Algebra textbook, and he doesn't get more than two pages into the preface before digressing into an unjustified ramble about something called "column spaces" that appears in no other reference I've seen. (And no, boldfacing every second phrase in a math book just clutters the text, it doesn't justify or explain anything.) Leafing through the first few chapters, it doesn't seem to get any better.

The lecture notes by Terence Tao that someone else mentioned look excellent, in comparison.




I definitely covered the column space and row space in my undergrad LA class, long before I had ever heard of Strang.

An exceptional minority of people can learn linear algebra in its full abstract generality as their first treatment of the material and come away with something resembling an understanding.

The rest of us dopey oafs must develop intuition carefully from specific concrete examples that extend gradually from algebra and geometry that we are familiar with already. Those of us in this sad deficient category must be led painstakingly over several weeks of course material to even the basic idea that a matrix is just a particular representation and special case of something called a linear transformation.

If you are one of the former type, you are blessed, but it's unfair to sneer at the latter, and it will only do your students a disservice.


In my experience, it's a little bit easier for new students to understand that the image of a matrix is the span of its columns, hence column space.
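To make that concrete, here's a quick sketch using numpy (my own toy example, not anything from Strang's book): A @ x is by definition a linear combination of the columns of A, so the set of all possible outputs, the image, is exactly the span of the columns.

    import numpy as np

    # A maps R^2 -> R^3; its columns are A @ e1 and A @ e2,
    # i.e. where the standard basis vectors land.
    A = np.array([[1.0, 2.0],
                  [0.0, 1.0],
                  [1.0, 3.0]])
    x = np.array([4.0, -2.0])

    # A @ x is a linear combination of the columns, weighted by the entries of x...
    via_matmul = A @ x
    via_columns = 4.0 * A[:, 0] + (-2.0) * A[:, 1]

    # ...so every output lies in the span of the columns: the column space.
    assert np.allclose(via_matmul, via_columns)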


Perhaps, but that's about as useful as pointing out that a monad is just a monoid in the category of endofunctors. What's the "image" of a matrix? Coming at LA from a 3D graphics background, I've never heard that term before. And what does the "span of its columns" mean?

To me, each column represents a different dimension of the basis vector space, so the notion that X, Y, and Z might form independent "column spaces" of their own is unintuitive at best.

These are all questions that can be Googled, of course, but in the context of a coherent, progressive pedagogical approach, they shouldn't need to be asked. And they certainly don't belong in the first chapter of any introductory linear algebra text, much less the preface.


> To me, each column represents a different dimension of the basis vector space, so the notion that X, Y, and Z might form independent "column spaces" of their own is unintuitive at best.

I can't help but feel a treatment of linear algebra that assumes all matrices are invertible by default isn't a very good treatment at all. Column spaces are exactly how you harness your (very useful!) intuition that the columns of a matrix are where the basis goes. I agree that it should be defined before use, but it is, in the textbook proper. The preface is for the author to express themselves!
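For what it's worth, here's a tiny numpy sketch of the non-invertible case (my own example, not from the book): when the columns are dependent, "where the basis goes" only covers part of the codomain, and the column space is exactly that part.

    import numpy as np

    # Singular 3x3 matrix: the third column is the sum of the first two.
    A = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 0.0]])

    # Rank 2: the column space is a plane inside R^3, not all of R^3.
    print(np.linalg.matrix_rank(A))  # 2

    # Every output A @ x lands in that plane (last coordinate is always 0).
    x = np.random.randn(3)
    print(A @ x)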

Now row spaces are an abomination, but that's because I'm not really a computation guy. I'm sure they're great if you get to know them.


In the context of linear algebra, a matrix is a linear map. A map is characterized by its domain and its image. These are very important characteristics.
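A small numpy illustration of that point (my own example, assuming numpy just for concreteness): the shape of the matrix tells you the domain and codomain, and the rank tells you how big the image actually is.

    import numpy as np

    # A 2x3 matrix is a linear map from R^3 (domain) to R^2 (codomain).
    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])

    # The second row is twice the first, so the rank is 1:
    # the image is only a line inside R^2.
    print(A.shape)                   # (2, 3)
    print(np.linalg.matrix_rank(A))  # 1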


His lectures are great but I definitely agree about the book. It reads like one of the TAs transcribed the lectures and added some exercises to the end.


I can guarantee you ChatGPT can explain column and row spaces to you, which suggests they are part of the common lexicon of linear algebra.


I'm not saying they don't belong in the book. I'm saying, evidently poorly, that they don't belong in the first chapter.

Strang has a bit of a "Next, draw the rest of the fucking owl" vibe going on. It wasn't what I'd been led to expect from the reviews.



