Friday, September 26, 2008

Diagonalizable Matrices

Euclidean (real) space              Unitary (complex) space
Real symmetric:   A^T = A           Hermitian:        A* = A
Orthogonal:       A^T = A^-1        Unitary:          A* = A^-1
Skew-symmetric:   A^T = -A          Skew-Hermitian:   A* = -A
Real normal:      AA^T = A^T A      Normal:           AA* = A*A


Real Symmetric Matrices

Let A be any real symmetric matrix. Then there exists a real nonsingular matrix P such that P^-1 AP is a diagonal matrix. Since P and A are real, the diagonal matrix is real as well, which implies that all of the eigenvalues of A are real. This result has a complex analogue, called the spectral theorem for complex matrices.


A matrix P is said to be orthogonal if and only if P is nonsingular and P^-1 = P^T; that is, if and only if P^T P = I. An n×n matrix A is said to be orthogonally diagonalizable if it is possible to find an orthogonal matrix P such that P^T AP = P^-1 AP = D, a diagonal matrix.


A matrix whose column vectors are mutually orthogonal is not necessarily an orthogonal matrix. However, it can be transformed into an orthogonal matrix by normalizing each column vector, that is, by dividing each column vector by its norm (length). Since the columns were bases for the eigenspaces, this merely replaces each basis vector with a unit basis vector.
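As a quick check (a NumPy sketch; the matrix V here is a hypothetical example, not one from the notes), normalizing each orthogonal column produces an orthogonal matrix:

```python
import numpy as np

# Columns are mutually orthogonal but not unit length,
# so V is not an orthogonal matrix.
V = np.array([[1.0, 1.0],
              [1.0, -1.0]])
print(V.T @ V)                           # diagonal, but not the identity

# Normalize each column by its norm to obtain an orthogonal matrix P.
P = V / np.linalg.norm(V, axis=0)
print(np.allclose(P.T @ P, np.eye(2)))   # True
```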


Any n×n real symmetric matrix A will be diagonalizable; in fact, such a matrix A is orthogonally diagonalizable. For a real symmetric matrix, the algebraic multiplicity of an eigenvalue is equal to its geometric multiplicity. Note: the number of linearly independent eigenvectors of A associated with the eigenvalue λ_i is the dimension of the eigenspace associated with λ_i and is often called the geometric multiplicity.
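To see orthogonal diagonalization numerically (a sketch using a hypothetical symmetric matrix), NumPy's eigh routine for symmetric/Hermitian matrices returns real eigenvalues and an orthogonal eigenvector matrix:

```python
import numpy as np

# Hypothetical 3x3 real symmetric matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is specialized for symmetric/Hermitian matrices: it returns
# real eigenvalues and orthonormal eigenvectors as the columns of P.
eigvals, P = np.linalg.eigh(A)

print(np.allclose(P.T @ P, np.eye(3)))        # P is orthogonal
D = P.T @ A @ P
print(np.allclose(D, np.diag(eigvals)))       # P^T A P is diagonal
```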


Complex Matrices

A unitary space is a complex vector space with a complex inner product: for complex vectors u and v, the inner product is the complex number <u, v> = u*v, where u* denotes the conjugate transpose of u.

It has the following properties:

  1. <u, v> = <v, u>* (conjugate symmetry)

  2. <u, av + bw> = a<u, v> + b<u, w>, for all complex scalars a and b

  3. <u, u> >= 0, with equality if and only if u = 0
Two vectors u and v in a complex inner-product space V are orthogonal if and only if <u, v> = 0. Other concepts from real inner-product spaces also carry over to complex inner-product spaces, e.g. the norm ||u|| and the Cauchy-Schwarz inequality (|<u, v>|^2 <= <u, u><v, v>).
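The properties above can be verified directly (a sketch with hypothetical vectors; np.vdot conjugates its first argument, matching <u, v> = u*v):

```python
import numpy as np

def inner(u, v):
    # <u, v> = u* v  (conjugate transpose of u times v)
    return np.vdot(u, v)

u = np.array([1 + 1j, 2 - 1j])
v = np.array([3j, 1 + 2j])

# Conjugate symmetry: <u, v> = conj(<v, u>)
print(np.isclose(inner(u, v), np.conj(inner(v, u))))   # True

# Cauchy-Schwarz: |<u, v>|^2 <= <u, u><v, v>
lhs = abs(inner(u, v)) ** 2
rhs = inner(u, u).real * inner(v, v).real
print(lhs <= rhs)                                      # True
```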
 

A unitary matrix and a unitary space are different things. A unitary matrix is a special type of matrix in which the conjugate transpose of the matrix is its inverse. Since every matrix defines a linear transformation, and every linear operator T: V --> V on a finite-dimensional unitary space V has a matrix representation, linear transformations are called by the same names as the corresponding matrices.


Every Hermitian matrix A is normal and every unitary matrix U is also normal. 


The eigenvalues of a Hermitian matrix are all real numbers; furthermore, eigenvectors corresponding to distinct eigenvalues are mutually orthogonal.

If A is an n×n Hermitian matrix with positive eigenvalues, then the function <u, v> = u*Av is an inner product on C^n.
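A small numerical sketch of this fact (the matrix A below is a hypothetical Hermitian positive-definite example):

```python
import numpy as np

# Hypothetical Hermitian matrix with positive eigenvalues (1 and 3).
A = np.array([[2.0, 1j],
              [-1j, 2.0]])
print(np.allclose(A, A.conj().T))        # A is Hermitian
print(np.linalg.eigvalsh(A))             # all eigenvalues positive

def inner_A(u, v):
    # <u, v> = u* A v
    return u.conj() @ A @ v

u = np.array([1 + 1j, 2.0])
# Positivity: <u, u> is real and > 0 for u != 0
print(inner_A(u, u))
```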


Schur's Theorem

For every n×n complex matrix A, there exists a unitary matrix U such that M = U*AU is upper triangular. Furthermore, the diagonal entries of M are the eigenvalues of A. Unitary matrices are the complex analogues of orthogonal matrices in the real case. If the given complex matrix A is normal, then the upper triangular matrix M is in fact diagonal, i.e., there exists a unitary matrix U such that U*AU is a diagonal matrix.
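SciPy exposes the complex Schur decomposition directly; a sketch with a hypothetical matrix:

```python
import numpy as np
from scipy.linalg import schur

# Hypothetical non-symmetric matrix, treated as complex.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]], dtype=complex)

# Complex Schur decomposition: A = U M U*, with M upper triangular.
M, U = schur(A, output='complex')

print(np.allclose(U @ M @ U.conj().T, A))   # A = U M U*
print(np.allclose(np.tril(M, -1), 0))       # M is upper triangular
# The diagonal of M carries the eigenvalues of A.
print(np.allclose(np.sort(np.diag(M)), np.sort(np.linalg.eigvals(A))))
```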


Spectral Theorem

An n×n matrix A is normal if and only if it has a complete orthonormal set of eigenvectors.
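A sketch of the spectral theorem on a hypothetical normal (but not Hermitian) matrix, the cyclic shift, whose eigenvalues are the distinct cube roots of unity:

```python
import numpy as np

# Cyclic shift matrix: normal (it is a permutation, hence unitary)
# but neither symmetric nor Hermitian.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=complex)
print(np.allclose(A @ A.conj().T, A.conj().T @ A))   # A is normal

# With distinct eigenvalues, eig returns a complete set of
# eigenvectors that are (numerically) orthonormal.
eigvals, V = np.linalg.eig(A)
print(np.allclose(V.conj().T @ V, np.eye(3)))        # orthonormal columns
print(np.allclose(V @ np.diag(eigvals) @ V.conj().T, A))
```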

Matrix Similarity

Similarity to a Diagonal Matrix

Let U be an n-dimensional vector space with a given basis G = {u_1, u_2, ..., u_n}. Also, let V be an m-dimensional vector space with a given basis H = {v_1, v_2, ..., v_m}, and let T: U --> V be a given linear transformation from U into V. Every vector x in U has a unique representation as a linear combination of the basis vectors in G, i.e.,

  x = x_1u_1 + x_2u_2 + ... + x_nu_n


Since T is a linear transformation, we have

  T(x) = T(x_1u_1 + x_2u_2 + ... + x_nu_n) = x_1T(u_1) + x_2T(u_2) + ... + x_nT(u_n)


Because H is a basis for V, each of the vectors T(u_i) has a unique representation as a linear combination of the basis vectors in H:

  T(u_i) = a_{1i}v_1 + a_{2i}v_2 + ... + a_{mi}v_m


Define the matrix representation of the linear transformation T: U --> V with respect to the bases G for U and H for V to be the matrix

  A = [ [T(u_1)]_H : [T(u_2)]_H : ... : [T(u_n)]_H ]

  A = [T]_{G,H}

Then for every x in U, [T(x)]_H = [T]_{G,H} [x]_G = A[x]_G.
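The construction above can be sketched numerically (the map T and the bases G and H below are hypothetical examples): compute each column [T(u_i)]_H by solving against the basis matrix, then check the coordinate identity.

```python
import numpy as np

# Hypothetical linear map T: R^2 -> R^2, T(x, y) = (x + y, 2y),
# with basis G for the domain and basis H for the codomain.
def T(x):
    return np.array([x[0] + x[1], 2 * x[1]])

G = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]
H = [np.array([1.0, 0.0]), np.array([1.0, 1.0])]
G_mat = np.column_stack(G)
H_mat = np.column_stack(H)

def coords(w, B_mat):
    # Coordinate vector of w relative to the basis whose columns form B_mat.
    return np.linalg.solve(B_mat, w)

# Columns of A are [T(u_i)]_H.
A = np.column_stack([coords(T(u), H_mat) for u in G])

# Check: [T(x)]_H = A [x]_G for a sample vector x.
x = np.array([3.0, 5.0])
x_G = coords(x, G_mat)
print(np.allclose(coords(T(x), H_mat), A @ x_G))   # True
```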

Let T: U --> U be a linear operator on the finite-dimensional vector space U, and let U have bases G and H. Let A = [T]_G be the matrix representation of T relative to the G-basis and let B = [T]_H be the matrix representation of T relative to the H-basis. Then there exists a nonsingular matrix S such that A = SBS^-1.


The matrix S is the change-of-basis matrix from the basis H to the basis G. It is the matrix representation of the identity operator I(u) = u on U, i.e. S = [I]_{H,G}. Since S represents the identity transformation, it is clearly invertible, so S^-1 = [I]_{G,H} represents the change of basis from the G-basis to the H-basis.


The relationship A = SBS^-1 between two different representations A and B of the same linear transformation T motivates the definition of matrix similarity. An n×n matrix A is said to be similar to the n×n matrix B if there exists an n×n nonsingular matrix S such that A = SBS^-1.
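A sketch of the definition with hypothetical matrices: build A = SBS^-1 from an invertible S, and confirm that the two similar matrices share their eigenvalues.

```python
import numpy as np

# Hypothetical matrix B and a nonsingular change-of-basis matrix S.
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])
S = np.array([[1.0, 2.0],
              [1.0, 3.0]])          # det = 1, so S is invertible

A = S @ B @ np.linalg.inv(S)        # A is similar to B

# Similar matrices represent the same operator in different bases,
# so they have the same eigenvalues (here 2 and 3).
print(np.allclose(np.sort(np.linalg.eigvals(A)),
                  np.sort(np.linalg.eigvals(B))))   # True
```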


Similar matrices have the same eigenvalues. An n×n matrix A is similar to a diagonal matrix D if and only if A has n linearly independent eigenvectors. Furthermore, in this case, the diagonal elements of D are the eigenvalues of A. Eigenvectors corresponding to distinct eigenvalues are always linearly independent of each other.


An n×n matrix A is said to be diagonalizable if there exists an invertible matrix P such that P^-1 AP = D is a diagonal matrix. If A is diagonalizable, then it must have n linearly independent eigenvectors. Not every n×n matrix is similar to a diagonal matrix, i.e. not every n×n matrix can be diagonalized.
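Diagonalization via eigenvectors can be sketched as follows (the matrix A is a hypothetical example with distinct eigenvalues, hence diagonalizable):

```python
import numpy as np

# Hypothetical diagonalizable (but not symmetric) matrix;
# its eigenvalues are 5 and 2, which are distinct.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)    # columns of P are eigenvectors

# A has 2 linearly independent eigenvectors, so P is invertible
# and P^-1 A P is diagonal with the eigenvalues on the diagonal.
D = np.linalg.inv(P) @ A @ P
print(np.allclose(D, np.diag(eigvals)))   # True
```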


Similarity Classes

An equivalence relation on a set of objects (in this case matrices) is any relation that is reflexive, symmetric, and transitive. 

Similarity of matrices has the following important features:
  1. Reflexivity : Every n×n matrix A is similar to itself.

  2. Symmetry : If A is similar to B, then B is similar to A.

  3. Transitivity : If A is similar to B and B is similar to C, then A is similar to C.
All matrices similar to a given n×n matrix A belong to a class called a similarity class. Every n×n matrix belongs to exactly one similarity class, and no two distinct similarity classes have any elements (i.e., matrices) in common.

If K = λI is a scalar matrix, then the only matrix similar to K is K itself, since SKS^-1 = λSS^-1 = K for every nonsingular S. Other matrices have many matrices in their similarity class. Each linear transformation T gives rise to a similarity class of matrix representations of T. Real symmetric matrices, and their complex counterparts, Hermitian matrices, are always diagonalizable. Similar matrices always have the same eigenvalues and corresponding eigenspaces (but not conversely). The method for finding the "simplest" member of the similarity class containing a given matrix yields the "canonical representative" of the associated transformation (the connection between eigenvalues and a diagonal "canonical similarity form" for a given matrix).