Determinants – Part 1

Hello everyone! Welcome to, and thank you for reading, my first non-test post, i.e. my first post about some actual mathematics! Perhaps because my teaching assignment this semester is a homework help session for introductory linear algebra courses (and perhaps because this is a fairly easy first topic for me to write on), I have decided to write about a very useful tool in matrix theory: the determinant. In my experience, introductory courses in linear algebra tend to skip over many of the details involved in defining the determinant and present it as a black box for checking whether a matrix A is invertible. Hopefully I can shed some light on this box for those who have never seen it before, and offer a nice refresher on basic linear algebra for those who have.

As is usually the case in mathematics, one needs to make many preliminary definitions and prove some initial results before developing more complex ideas, and this is certainly the case for the determinant. Since I do not intend to build the theory of vector spaces from the ground up, I’ll assume the reader is familiar with the definition of a vector space, as well as the ideas of bases, dimension, and linear independence.

I want to write down one definition that is most likely familiar to the reader, but just in case it is not, here is the formal idea of a product of vector spaces. Let U and V be two vector spaces (over the same field F). Then define the (external) direct sum of U and V to be the vector space

W=\{(u,v):u\in U,v\in V\},

together with the operations

(u_1,v_1)+_W(u_2,v_2)=(u_1+_Uu_2,v_1+_Vv_2) and \alpha\cdot_W(u,v)=(\alpha\cdot_U u,\alpha\cdot_V v),

where \alpha\in F. It is easy to see that W is a vector space over F. Notationally, we usually drop the subscripts on the operations (they are clear from context), and often write W=U\oplus V. There is much more that could be said about the direct sum, but for our purposes, we will really only need to know that this space is in fact a vector space with the above operations.
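For concreteness, here is a small sanity-check example (my own illustration): take U=\mathbb{R}^2 and V=\mathbb{R}, both over F=\mathbb{R}. Then U\oplus V consists of pairs ((u_1,u_2),v), and the operations act componentwise, for instance

\displaystyle ((1,2),3)+((4,5),6)=((5,7),9) \quad\text{and}\quad 2\cdot((1,2),3)=((2,4),6).

In other words, \mathbb{R}^2\oplus\mathbb{R} is, for all practical purposes, just \mathbb{R}^3 (formally, the two spaces are isomorphic).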

Okay, now that those preliminaries are out of the way, let’s get to a definition that may not be as familiar. Let V_1,V_2,\dots, V_k be vector spaces over the same field F. We say a function (\cdot,\cdot,\dots,\cdot):\bigoplus_{i=1}^k V_i\to F is a k-linear form if for all 1\leq i\leq k the identity

(v_1,\dots, v_i+\alpha x,\dots,v_k)=(v_1,\dots,v_i,\dots,v_k)+\alpha(v_1,\dots, x,\dots, v_k)

holds for all v_1,\dots,v_k, all x\in V_i, and all \alpha\in F (so v_i and x are arbitrary vectors). In other words, if we hold any k-1 of the variables constant, the function is linear in the remaining slot. For example, the dot product \langle x,y\rangle=\sum_{i=1}^n x_iy_i on \mathbb{R}^n is a 2-linear (bilinear) form on \mathbb{R}^n=\mathbb{R}\oplus\mathbb{R}\oplus\dots\oplus\mathbb{R}. As with products of vector spaces, there is much that could be said about the general theory of k-linear forms, but we will concern ourselves with a special type of k-linear form. A k-linear form \omega is an alternating k-linear form if \omega(v_1,v_2,\dots, v_k)=0 whenever v_i=v_j for some i\neq j in [k].
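To make the bilinearity of the dot product concrete, here is the computation in the first slot (the second slot is handled identically): for x,y,z\in\mathbb{R}^n and \alpha\in\mathbb{R},

\displaystyle \langle x+\alpha z,y\rangle=\sum_{i=1}^n(x_i+\alpha z_i)y_i=\sum_{i=1}^n x_iy_i+\alpha\sum_{i=1}^n z_iy_i=\langle x,y\rangle+\alpha\langle z,y\rangle.

Note, however, that the dot product is not alternating: \langle x,x\rangle=\sum_{i=1}^n x_i^2, which is zero only when x=0.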

At first glance, this may seem to be a strange definition to make. Why should we concern ourselves with such functions when we are trying to compute determinants of matrices? Well, remember that the determinant of a matrix is zero if any two of the rows are the same. While I will actually approach determinants from the perspective of linear transformations (and then apply them to matrices), this gives a little insight into where this discussion is going. To close this post, I’ll prove one result about alternating k-linear forms to give you the flavor of what is to come in the next few posts.

Proposition: If \omega is an alternating n-linear form on \bigoplus_{i=1}^n V, where V is an n-dimensional vector space, and \{x_i\}_{i=1}^n\subseteq V are linearly dependent vectors, then \omega(x_1,\dots,x_n)=0.

Remark: In the case above, where \omega is defined on the product of n copies of V, an n-dimensional vector space, we often simply say that \omega is an alternating n-linear form on V. This property of an alternating linear form certainly holds for determinants of matrices, since we know \det(A)=0 if and only if the rows of A are linearly dependent.

Proof: If any of the x_i are zero, the result follows immediately from linearity, so suppose they are all non-zero. Because the set is linearly dependent, there is some 1\leq i\leq n such that x_i is in the span of the remaining n-1 vectors. Without loss of generality, we may assume i=1 (by simply re-indexing the finite list of vectors), so say x_1=\sum_{i=2}^n c_ix_i. But then by holding the second through nth slots constant, by the linearity of \omega we have

\displaystyle \omega(x_1,x_2,\dots,x_n)=\omega\left(\sum_{i=2}^n c_ix_i,x_2,\dots,x_n\right)=\sum_{i=2}^n\big(c_i\cdot\omega(x_i,x_2,\dots,x_n)\big)

But because \omega is alternating, each term in the sum on the right is zero: the vector x_i appears twice in \omega(x_i,x_2,\dots,x_n), once in the first slot and once in the i-th slot. So the whole sum is zero, which completes the proof.
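For a quick sanity check of the proposition in a small case, consider \omega((a,b),(c,d))=ad-bc on \mathbb{R}^2 (we will eventually recognize this as the 2\times 2 determinant; it is straightforward to check that it is bilinear and alternating). If x_2=cx_1 for some scalar c, then

\displaystyle \omega(x_1,x_2)=\omega(x_1,cx_1)=c\cdot\omega(x_1,x_1)=0,

exactly as the proposition predicts. Concretely, \omega((1,2),(3,6))=1\cdot 6-2\cdot 3=0.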

This proof, in light of the definition of an alternating n-linear form, was quite simple. Indeed, the definition of an alternating n-linear form is so strong that it is not even clear such a function exists. We will see that, in fact, there is such a function for every n, and it is unique up to multiplication by a constant factor. It will take some work, but we will also show that the usual matrix determinant is such an alternating n-linear form, and hence is unique up to re-normalization. Admittedly, this post was somewhat dry, but the definitions we made here are necessary for a proper treatment of determinants. Bear with me and I assure you the next few posts will be more interesting!
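As a closing aside for readers who like to experiment, here is a minimal Python sketch (my own illustration using numpy, not how we will develop the theory) that numerically spot-checks two of the properties discussed above for the usual matrix determinant: it vanishes when two rows coincide, and it is linear in each row when the other rows are held fixed.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Alternating: the determinant (numerically) vanishes when two rows coincide.
A = rng.standard_normal((n, n))
A[1] = A[0]                       # force rows 0 and 1 to be equal
print(np.linalg.det(A))           # ~0, up to floating-point error

# n-linearity: linear in the first row, holding the other rows fixed.
B = rng.standard_normal((n, n))
x = rng.standard_normal(n)
alpha = 2.5

C = B.copy()
C[0] = B[0] + alpha * x           # first row replaced by B[0] + alpha*x
D = B.copy()
D[0] = x                          # first row replaced by x

lhs = np.linalg.det(C)
rhs = np.linalg.det(B) + alpha * np.linalg.det(D)
print(np.isclose(lhs, rhs))       # True
```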


3 Responses to Determinants – Part 1

  1. Adam Azzam says:

Sorry for commenting almost a month after this post. I thought the definition of an alternating multilinear form was a multilinear form whose sign changes when you transpose two of its arguments. The definition you gave follows from this definition, given that if two arguments are equal, then transposing those arguments changes the sign but not the value, and so the form must evaluate to zero. I don’t imagine that the definition you provided implies the other definition, which might be preferable since it reflects the property exhibited by the determinant when you switch two rows, two columns, etc.

  2. Ryan says:

So, since you ask, perhaps I should prove this; the fact follows from what I wrote down in post two about permutations acting on n-linear forms. “Switching two rows” is essentially applying the n-linear form (j,k)\omega, which, as I note in the next post, satisfies (j,k)\omega=\mathrm{sgn}(j,k)\omega=-\omega. I avoided the proof of this fact in the post because it’s mostly just a computation that doesn’t give much insight. Perhaps a sketch is in order: if you have an alternating n-linear form \omega and vectors x_1,\dots,x_n, say you want to switch x_1 and x_2 (purely for notational convenience; the proof easily generalizes). Then 0=\omega(x_1+x_2,x_1+x_2,\dots,x_n)=\omega(x_1,x_1,\dots,x_n)+\omega(x_1,x_2,\dots,x_n)+\omega(x_2,x_1,\dots,x_n)+\omega(x_2,x_2,\dots,x_n), where I used linearity in the first two slots and the alternating property of \omega. But again, since \omega is alternating, the terms on the right with repeated arguments (x_1,x_1) and (x_2,x_2) are zero, so after rearranging we have \omega(x_1,x_2,\dots,x_n)=-\omega(x_2,x_1,\dots,x_n). You can then generalize this to any product of transpositions (i.e. any permutation) by induction.

  3. Pingback: Basic AC, Part II: Choice in Algebra | whateversuitsyourboat
