
Yes, they are intuitive, but it's pretty hard to define them rigorously in a way that captures that intuition. Also, every way I've seen to define them requires choosing a basis (or even an orthonormal basis), which isn't that nice. If you read the PDF, you'll notice that Axler's proofs completely avoid relying on the fact that every finite-dimensional vector space has a basis, or choosing one.



Defining the determinant definitely does not require choosing a basis (though I don't know how you'd define it without at least knowing that a basis exists, for reasons you can see below).

Let's say you've got a linear transformation T:V->V, and suppose V is n-dimensional. Consider exterior powers of V, Λ^k(V); for each one we naturally get a linear transformation Λ^k(T):Λ^k(V)->Λ^k(V). In particular take k=n, so we get a linear transformation Λ^n(T):Λ^n(V)->Λ^n(V). But Λ^n(V) is one-dimensional, so Λ^n(T) must be multiplication by some constant factor. That factor is the determinant.

That's basically the most "natural" definition of the determinant (notice how multiplicativity immediately falls out of it, assuming of course you already know that Λ^k is a functor). You need the idea of a basis in order to (a) make sense of the statement "V is n-dimensional" and (b) prove that Λ^n(V) is 1-dimensional, but that's it, and you certainly never need to choose one in order to define the determinant.
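If it helps, here's a minimal Python sketch of that definition (helper names are mine): coordinates enter only when you actually want to compute. Expanding T(e_1) ∧ ... ∧ T(e_n) multilinearly, wedges with a repeated basis vector vanish and sorting the survivors contributes a sign, which is exactly the Leibniz formula; multiplicativity then checks out on an example:

    from itertools import permutations
    from math import prod

    def sign(p):
        # Sign of a permutation, via inversion count.
        inv = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
        return -1 if inv % 2 else 1

    def det(M):
        # M[i][j] = coefficient of e_i in T(e_j). Expanding
        # T(e_1) ∧ ... ∧ T(e_n) multilinearly and using antisymmetry
        # leaves exactly the Leibniz formula.
        n = len(M)
        return sum(sign(p) * prod(M[p[j]][j] for j in range(n))
                   for p in permutations(range(n)))

    A = [[1, 2], [3, 4]]
    B = [[0, 1], [1, 1]]
    AB = [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
          for i in range(2)]
    print(det(AB) == det(A) * det(B))  # True, as functoriality promises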

Probably not a great definition for beginning students of linear algebra, but I do have to correct this idea that defining the determinant requires choosing a basis...


A completely different basis-free definition, which works whenever you are working over the complex numbers or any other algebraically closed field:

The determinant is uniquely determined by the following two axioms:

1. The determinant of the "multiply by lambda" operation on a one-dimensional vector space is lambda.

2. If you have a linear operator T on a vector space V, and a T-invariant subspace W, the determinant of T is the product of the determinant of T restricted to W and the determinant of the operator that T induces on the quotient space V / W.

This actually kinda intuitively meshes with the volume-stretching property: if you are stretching the volume by the factor lambda_1 along one subspace and by the factor lambda_2 along some complementary subspace, clearly the overall stretch factor is lambda_1 * lambda_2.
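You can see these axioms at work numerically (a sketch, assuming numpy/scipy are available): over C every operator admits a full flag of invariant subspaces, which is exactly what a Schur triangularization exhibits, and the axioms then force the determinant to be the product of the diagonal entries:

    import numpy as np
    from scipy.linalg import schur

    A = np.random.randn(4, 4)
    # A = Z T Z* with T upper triangular: the first k columns of Z span
    # a k-dimensional invariant subspace, and the diagonal entries of T
    # are the scalars induced on the 1-dimensional quotients (axiom 1).
    T, Z = schur(A, output='complex')
    print(np.prod(np.diag(T)))  # axiom 2 applied up the flag
    print(np.linalg.det(A))     # agrees, up to rounding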

If you aren't working over an algebraically closed field, you can just tensor with the algebraic closure of whatever field you're working with and take the determinant there. There is also a way of adapting the definition so you don't have to do this, but it makes axiom (1) a bit more complicated.
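A concrete instance of why you need that: a plane rotation has no real eigenvalues, so over R there is no invariant subspace for axiom (2) to bite on, but after passing to C the eigenvalues e^{±iθ} appear and their product is the (real) determinant. A quick check, assuming numpy:

    import numpy as np

    th = 0.7
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    lams = np.linalg.eigvals(R)  # e^{±iθ}: these only exist over C
    print(np.prod(lams))         # (1+0j), matching det(R) = 1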

Another bonus is that this definition makes the Cayley-Hamilton theorem completely trivial: pick a full invariant flag; each factor (T - lambda_i) of the characteristic polynomial sends one step of the flag into the previous one, so the full product kills all of V.

Also, you can give an analogous definition of the trace if you replace multiplication with addition, and of the characteristic polynomial if you replace lambda in axiom 1 with x - lambda.
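To see the analogy numerically (again assuming numpy; the determinant case is checked above): add the flag scalars to get the trace, and take prod(x - lambda_i) for the characteristic polynomial:

    import numpy as np

    A = np.random.randn(3, 3)
    lams = np.linalg.eigvals(A)       # the scalars lambda_i from a full flag
    print(np.sum(lams), np.trace(A))  # additive analogue: the trace
    print(np.poly(A))                 # coefficients of prod(x - lambda_i)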


I think you get a basis of an n-dimensional vector space by the definition of dimension (if e.g. the dimension is the maximum size of a set of linearly independent vectors).

It's a bit trickier when the dimension is infinite, but again, most definitions of dimension require there to be a basis of a particular size; the difficult part is proving that this makes sense (i.e., that the dimension is unique and well-defined).


> It's a bit trickier when the dimension is infinite

For those who are wondering, 'every vector space has a basis' is equivalent to the Axiom of Choice.


I mean, in order to define dimension, we have to know that bases exist (and all have the same cardinality). Which means that before I can use the notion of "dimension", I have to know about bases.

I mean I suppose you could just restrict the definition of dimension to vector spaces that have bases, and then you wouldn't have to. I guess that's what you're implicitly suggesting, that dimension would just not apply to vector spaces that don't have bases. That would make sense. But I'm used to thinking of dimension as, well, a function of vector spaces, not a partial function, so as I was thinking of it, you have to prove they all have bases before you can use it!


Yes, it is easy to prove that "a basis exists" (almost by definition, depending on how exactly you define dimension), and of course it's legal to use it in proofs, but I think proofs that avoid using this fact directly are more elegant.


> almost by definition depending on how exactly you define dimension

What definition of _dimension_ is there that does not rely upon the existence of a basis?


You could define the dimension to be the supremum of the cardinalities of all linearly independent sets.

In the infinite case this does not trivially give you a basis, since 1) the supremum could be strictly larger than the cardinality of every linearly independent set, and 2) adding an extra vector to an infinite linearly independent set doesn't increase its cardinality, so there is no reason for a maximal one to span the entire space. (For example, in the space of polynomials, {x, x^2, x^3, ...} is linearly independent and of maximal cardinality, but doesn't span.)

You could also take the infimum of the cardinalities of all sets that span the entire space, but you run into similar problems.


Does "maximum size of a linearly independent set" count? You would have to prove something about spanning sets to get to the existence of a basis


Ah cool, I had not seen this before. I stand corrected.

Are there any books that use this definition? I like this style of linear algebra.


Axler's book.


And how do you define exterior powers without using a basis?


The definition of the exterior power doesn't rely on a basis. The k'th exterior power is the quotient of the k'th tensor power by the subspace spanned by elements of the form v_1 ⊗ v_2 ⊗ ... ⊗ v_k in which two of the v_i are equal. I'm assuming you know how to define a tensor product without resorting to bases. If not, see: https://en.wikipedia.org/wiki/Tensor_product#Definition (it's not the clearest exposition, but it'll do)

(Or, if you like, you could define the tensor algebra, take a quotient of that to get the exterior algebra, and then restrict to the image of the k'th tensor power to get the k'th exterior power.)

Again, you obviously need bases to prove what the dimension of an exterior power is. But you don't need them just to define it.
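If a concrete picture helps (a Python sketch with made-up helper names; coordinates are used only to write tensors down, not in the definition itself): antisymmetrizing a pure tensor picks a canonical representative of its class in the quotient, and the defining relations become visible:

    from itertools import permutations, product
    from math import prod

    def sign(p):
        # Sign of a permutation, via inversion count.
        inv = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
        return -1 if inv % 2 else 1

    def wedge(*vs):
        # Antisymmetrization of v_1 ⊗ ... ⊗ v_k: a representative of its
        # class in the quotient, stored as nonzero components only.
        k, n = len(vs), len(vs[0])
        comp = {}
        for idx in product(range(n), repeat=k):
            c = sum(sign(p) * prod(vs[p[j]][idx[j]] for j in range(k))
                    for p in permutations(range(k)))
            if c:
                comp[idx] = c
        return comp

    v, w = (1, 0, 2), (0, 1, 1)
    print(wedge(v, v))  # {}: anything with a repeated factor dies in the quotient
    print(wedge(v, w) == {i: -c for i, c in wedge(w, v).items()})  # True: v∧w = -w∧v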


I thought a bit more about this. Strictly speaking, the construction of the tensor product that you link to does use a basis: it builds the free vector space whose basis is the set-theoretic product of the original two vector spaces. Also, the relations you quotient by span the subspace you kill. We didn't need a basis for the original spaces, but ended up using bases elsewhere.


You can define tensor products and quotient spaces without bases! (It's a bit of work though)


That's certainly true, but that's also clearly not what was being talked about. The problem was (easier) avoiding choosing bases to perform a construction, and (harder) avoiding assuming all vector spaces have bases. Using bases that you directly construct isn't something anyone ever really has a reason to avoid. (And really, neither is the second in the finite-dimensional case, but hey, may as well if you can, right?)



Right. It's kind of interesting how in all of these definitions the "easy" way involves using a basis. (Although we could argue about what the "easy" way is.)



