
Lecture 24: Introduction to Diagonalization (Nicholson Section 3.3/Section 5.5)

Alternate Video Access via MyMedia | Video Duration: 50:11
Description: Note: in this lecture I use the language of “linear independence”. You may not have seen this yet. For two vectors, it means that they’re not parallel. Hold onto that concept and you’ll learn more about linear independence later on. Started by reviewing the definitions of eigenvalue and eigenvector.
3:50 --- Reviewed procedure for finding eigenvalues and eigenvectors of A.
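To make the procedure concrete, here is a small sketch in Python for a 2x2 matrix of my own choosing (not one from the lecture): find the roots of the characteristic polynomial det(A - lambda I) = 0, then check eigenvectors found by hand from (A - lambda I)v = 0.

```python
import math

# My own example matrix, not one from the lecture.
A = [[2.0, 1.0],
     [1.0, 2.0]]

# For a 2x2 matrix the characteristic polynomial is
# lambda^2 - tr(A)*lambda + det(A).
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]

# Roots via the quadratic formula.
disc = math.sqrt(tr * tr - 4.0 * det)
lam1 = (tr + disc) / 2.0   # eigenvalue 3
lam2 = (tr - disc) / 2.0   # eigenvalue 1

# Eigenvectors, found by hand from (A - lambda*I)v = 0.
v1 = [1.0, 1.0]    # for lambda = 3
v2 = [1.0, -1.0]   # for lambda = 1

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

# Check the defining property A v = lambda v.
assert matvec(A, v1) == [lam1 * x for x in v1]
assert matvec(A, v2) == [lam2 * x for x in v2]
```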
8:00 --- Given a 3x3 matrix w/ eigenvalues 1,2,2 what are the benefits? (Only need to solve two linear systems when hunting eigenvectors.) What are the risks? (You might not be able to find two fundamentally different eigenvectors when you’re looking for eigenvectors w/ eigenvalue 2.)
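The “risk” can be seen in a toy example of my own (not the matrix from the lecture): a matrix with eigenvalues 1, 2, 2 where only one fundamentally different eigenvector exists for lambda = 2.

```python
# My own toy example: eigenvalues are 1, 2, 2 (it's triangular), but the
# repeated eigenvalue 2 has only ONE independent eigenvector.
A = [[1, 0, 0],
     [0, 2, 1],
     [0, 0, 2]]

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

# B = A - 2I. An eigenvector for lambda = 2 is a nonzero solution of B v = 0.
B = [[A[i][j] - (2 if i == j else 0) for j in range(3)] for i in range(3)]

e2 = [0, 1, 0]
e3 = [0, 0, 1]
assert matvec(B, e2) == [0, 0, 0]   # e2 IS an eigenvector for lambda = 2
assert matvec(B, e3) != [0, 0, 0]   # e3 is NOT, even though 2 is a double root
```

Since B v = (-x, z, 0) for v = (x, y, z), every lambda = 2 eigenvector here is a multiple of e2; the hunt for two fundamentally different eigenvectors fails.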
11:00 --- Considered a 3x3 matrix A and built a matrix P out of three eigenvectors of A. Computed AP using block multiplication. The result is that each column of AP is a multiple of the corresponding column of P. What is the multiple? The eigenvalue. In other words, AP = P diag(lambda1, lambda2, lambda3), where lambda1, lambda2, and lambda3 are eigenvalues of A. Make sure you understand this portion of the lecture very well --- all of diagonalization is built on AP = P diag( …).
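Here is a check of the identity AP = P diag(…) with a matrix of my own choosing (not necessarily the one from the video), whose eigenvalues are 1, 2, 2:

```python
# My own example: A is triangular with eigenvalues 1, 2, 2, and the columns
# of P are eigenvectors of A (for eigenvalues 1, 2, 2 in that order).
A = [[1, 1, -1],
     [0, 2,  0],
     [0, 0,  2]]

P = [[1, 1, 0],
     [0, 1, 1],
     [0, 0, 1]]

D = [[1, 0, 0],
     [0, 2, 0],
     [0, 0, 2]]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Block multiplication, column by column: A * (j-th column of P) equals
# lambda_j * (j-th column of P), so AP = P D.
assert matmul(A, P) == matmul(P, D)
```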
20:00 --- Introduced the concept of diagonalization. It's critical that the matrix P be invertible.
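When P is invertible, AP = PD can be solved for A to give A = P D inv(P). A small check with matrices of my own choosing (inv(P) was computed by hand):

```python
# My own example matrices; Pinv was computed by hand.
A = [[1, 1, -1],
     [0, 2,  0],
     [0, 0,  2]]
P = [[1, 1, 0],
     [0, 1, 1],
     [0, 0, 1]]
D = [[1, 0, 0],
     [0, 2, 0],
     [0, 0, 2]]
Pinv = [[1, -1,  1],
        [0,  1, -1],
        [0,  0,  1]]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
assert matmul(P, Pinv) == I                # P really is invertible
assert matmul(matmul(P, D), Pinv) == A     # A = P D inv(P)
```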
23:10 --- What happens if I change the order of the columns in P? The new matrix will still be invertible. What happens to AP? What’s the effect on the diagonal matrix that I ultimately get?
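A quick check, using matrices of my own choosing: reordering the columns of P reorders the diagonal entries of D in exactly the same way, and AP = PD still holds.

```python
# My own example: A has eigenvalues 1, 2, 2. Here the columns of P2 list the
# eigenvectors with the lambda = 2 eigenvector FIRST, so D2's diagonal is
# reordered the same way: 2, 1, 2.
A = [[1, 1, -1],
     [0, 2,  0],
     [0, 0,  2]]

P2 = [[1, 1, 0],
      [1, 0, 1],
      [0, 0, 1]]
D2 = [[2, 0, 0],
      [0, 1, 0],
      [0, 0, 2]]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

assert matmul(A, P2) == matmul(P2, D2)
```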
27:00 --- What happens if I replace one of the columns of P with a nonzero multiple of that column?
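A small check with matrices of my own choosing: a nonzero multiple of an eigenvector is still an eigenvector for the same eigenvalue, so scaling a column of P leaves the diagonal matrix unchanged.

```python
# My own example: A has eigenvalues 1, 2, 2. In P3 the first column (an
# eigenvector for lambda = 1) has been scaled by 5; D is the SAME as before.
A = [[1, 1, -1],
     [0, 2,  0],
     [0, 0,  2]]

P3 = [[5, 1, 0],
      [0, 1, 1],
      [0, 0, 1]]
D = [[1, 0, 0],
     [0, 2, 0],
     [0, 0, 2]]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

assert matmul(A, P3) == matmul(P3, D)   # D is unchanged
```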
32:20 --- What happens if I replace one of the columns of P with a nonzero multiple of one of the other columns?
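This is where things break: with a matrix of my own choosing, replacing column 1 of P with a multiple of column 2 makes the columns linearly dependent, so det(P) = 0 and P is no longer invertible.

```python
# My own example: column 1 of P4 is 2 times column 2, so the columns are
# linearly dependent and P4 is singular.
P4 = [[2, 1, 0],
      [2, 1, 1],
      [0, 0, 1]]

def det3(M):
    # Determinant of a 3x3 matrix by cofactor expansion along the first row.
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

assert det3(P4) == 0   # singular, so P4 cannot be used to diagonalize
```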
36:30 --- Defined what it means for a square matrix to be diagonalizable. Note: my definition (that A = P diag(…) inv(P) ) is different from the one in Nicholson (that inv(P) A P is a diagonal matrix) but the two definitions are equivalent if you left- and right-multiply by the appropriate matrices.
37:40 --- Stated a theorem: if A is diagonalizable, then the columns of P are eigenvectors of A and the diagonal entries of the diagonal matrix are eigenvalues of A.
42:45 --- Given a specific 3x3 matrix, is it diagonalizable? Worked through the long division of the characteristic polynomial, in case you want to see that. If you don’t want to spend so much ink/lead, learn synthetic division; it’s shorthand for long division.
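Synthetic division is short enough to sketch in code. Here it is applied to a polynomial of my own choosing (not necessarily the one from the video), p(x) = x^3 - 5x^2 + 8x - 4, divided by (x - 1):

```python
def synthetic_division(coeffs, r):
    """Divide the polynomial with these coefficients (highest power first)
    by (x - r). Returns (quotient_coefficients, remainder)."""
    out = [coeffs[0]]          # bring down the leading coefficient
    for c in coeffs[1:]:
        out.append(c + r * out[-1])   # multiply by r, add the next coefficient
    return out[:-1], out[-1]

# My own example: p(x) = x^3 - 5x^2 + 8x - 4 = (x - 1)(x - 2)^2.
quotient, remainder = synthetic_division([1, -5, 8, -4], 1)
assert quotient == [1, -4, 4]   # x^2 - 4x + 4 = (x - 2)^2
assert remainder == 0           # remainder 0 means x = 1 is a root
```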