Extracting eigenvectors from a matrix

Thursday, 2 January 2014

Hello,



1. The problem statement, all variables and given/known data

I want to show that a real symmetric matrix will have real eigenvalues and orthogonal eigenvectors.



$$
\begin{pmatrix}
A & H\\
H & B
\end{pmatrix}
$$



3. The attempt at a solution

For the matrix shown above it's clear that the characteristic equation will be

##\lambda^2-\lambda(A+B)+AB-H^2=0##
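For reference, this follows from expanding ##\det(M-\lambda I)=0##, writing ##M## for the matrix above:

$$
\det\begin{pmatrix}
A-\lambda & H\\
H & B-\lambda
\end{pmatrix}
=(A-\lambda)(B-\lambda)-H^2
=\lambda^2-\lambda(A+B)+AB-H^2.
$$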



I can show that the discriminant of the quadratic equation is non-negative, implying that the eigenvalues must be real.

##b^2-4ac=(A+B)^2-4(AB-H^2)=A^2+2AB+B^2-4AB+4H^2##

##=(A-B)^2+4H^2##

Since ##A, B, H \in \mathbb{R}##, ##(A-B)^2+4H^2 \geq 0##
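Written out with the quadratic formula, the eigenvalues are

$$
\lambda_{1,2}=\frac{(A+B)\pm\sqrt{(A-B)^2+4H^2}}{2},
$$

and the square root of a non-negative real number is real.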



Hence ##\lambda## must be real for this matrix.



My only problem now is to show that the eigenvectors are orthogonal.



The matrix has eigenvalues ##\lambda_1, \lambda_2##, with corresponding eigenvectors ##v_1=(x_1,y_1)## and ##v_2=(x_2,y_2)##, so that ##Mv_1=\lambda_1v_1## and ##Mv_2=\lambda_2v_2##.



How can I show,



##(\lambda_1v_1)\cdot(\lambda_2v_2)=\lambda_1\lambda_2x_1x_2+\lambda_1\lambda_2y_1y_2=0##?



I know ##\lambda_1\lambda_2=\det(M)##



It could become,



##\det(M)(x_1x_2+y_1y_2)=0##



Then it's clear the vectors are orthogonal because ##\det(M)## cannot be 0. But the problem is that this is not a proof, since I explicitly assumed the dot product was 0 in the first place.



I tried substituting the full quadratic-formula expression for ##\lambda## back into the matrix, as if I already knew the eigenvalues, but the matrix could not be simplified in any straightforward way and it turned into a mess very quickly.
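As a purely numerical sanity check (the values ##A=B=2##, ##H=1## are my own illustration, not part of the problem), the claim does hold in a concrete case:

$$
M=\begin{pmatrix} 2 & 1\\ 1 & 2 \end{pmatrix},\quad
\lambda^2-4\lambda+3=0\ \Rightarrow\ \lambda_1=1,\ \lambda_2=3,\quad
v_1=\begin{pmatrix} 1\\ -1 \end{pmatrix},\ v_2=\begin{pmatrix} 1\\ 1 \end{pmatrix},\quad
v_1\cdot v_2=0.
$$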




