Let n≥1. Assume that A is a real n×n matrix which satisfies the equality
A^7 + A^5 + A^3 + A - I = 0
Show that det(A)>0
Note: all matrices are assumed to be square matrices in this exercise.
I'm going to prove the following: For a polynomial P(x) and matrix A, if P(A)=0 then the eigenvalues of A must be among the roots of P(x).
First thing is a lemma: If A and B are similar matrices and P(A)=0 then P(B)=0.
By the definition of similar matrices there must be some invertible matrix M such that M*B*M^-1 = A.
Let P(x) = c_n*x^n + c_(n-1)*x^(n-1) + ... + c_1*x + c_0. We are given P(A)=0, that is, c_n*A^n + c_(n-1)*A^(n-1) + ... + c_1*A + c_0*I = 0.
Substitute to get c_n*(M*B*M^-1)^n + c_(n-1)*(M*B*M^-1)^(n-1) + ... + c_1*(M*B*M^-1) + c_0*I
= c_n*M*B^n*M^-1 + c_(n-1)*M*B^(n-1)*M^-1 + ... + c_1*M*B*M^-1 + c_0*M*I*M^-1
= M*[c_n*B^n + c_(n-1)*B^(n-1) + ... + c_1*B + c_0*I]*M^-1
= M*P(B)*M^-1.
This must equal 0. Since M is invertible, multiplying on the left by M^-1 and on the right by M gives P(B) = M^-1*0*M = 0. Lemma proved.
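As a quick numeric sanity check (not part of the proof, and with random matrices I made up purely for illustration), here is a small Python/numpy sketch of the identity used above, P(M*B*M^-1) = M*P(B)*M^-1, evaluated with the exercise's polynomial:

    import numpy as np

    def P(X):
        # p(x) = x^7 + x^5 + x^3 + x - 1 evaluated at a square matrix X
        I = np.eye(X.shape[0])
        return (np.linalg.matrix_power(X, 7) + np.linalg.matrix_power(X, 5)
                + np.linalg.matrix_power(X, 3) + X - I)

    rng = np.random.default_rng(0)
    n = 4
    B = rng.standard_normal((n, n))
    M = rng.standard_normal((n, n))      # almost surely invertible
    A = M @ B @ np.linalg.inv(M)         # A and B are similar by construction

    lhs = P(A)                           # P(M*B*M^-1)
    rhs = M @ P(B) @ np.linalg.inv(M)    # M*P(B)*M^-1
    print(np.allclose(lhs, rhs))         # True, up to floating-point error

Note the random B here does not satisfy P(B)=0; the check only illustrates the conjugation identity that the lemma rests on.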
Similar matrices have the same eigenvalues. And, working over the complex numbers (a real matrix may have complex eigenvalues), every matrix is similar to either a diagonal matrix (if it is diagonalizable) or its Jordan matrix (if not). I will use the upper triangular version of the Jordan form.
Both of these matrices have all their eigenvalues on the main diagonal. To cover both cases at once: diagonal and Jordan matrices are upper triangular matrices with the eigenvalues on the main diagonal.
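For readers who want to see the Jordan form claim concretely, here is a small sympy sketch (the matrices J and M below are my own arbitrary example, not anything from the problem): build A = M*J*M^-1 from a known Jordan matrix, then ask sympy for A's Jordan form and observe that it is upper triangular with the eigenvalues on the diagonal.

    import sympy as sp

    J = sp.Matrix([[2, 1, 0],
                   [0, 2, 0],
                   [0, 0, 5]])       # eigenvalues 2, 2, 5; one 2x2 Jordan block
    M = sp.Matrix([[1, 2, 0],
                   [0, 1, 3],
                   [1, 0, 1]])       # any invertible matrix will do (det = 7 here)
    A = M * J * M.inv()

    Pm, Jfound = A.jordan_form()     # sympy returns (P, J) with A == P*J*P**-1
    print(Jfound)                    # upper triangular, eigenvalues on the diagonal
    print(sorted(A.eigenvals()), sorted(set(Jfound.diagonal())))   # both [2, 5]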
So then let's call our diagonal/Jordan matrix D. Since A is similar to D and P(A)=0, the lemma gives P(D)=0. Now I want to focus on the entries on the main diagonal. Evaluating P(D) consists of sums, scalar multiples and matrix products of upper triangular matrices.
Let f(j,j), g(j,j) and h(j,j) be corresponding j-th entries on the main diagonals of matrices F, G and H. Let z be a scalar.
If F+G=H then f(j,j) + g(j,j) = h(j,j) follows from basic matrix addition.
If z*F=H then z*f(j,j) = h(j,j) follows from basic matrix scalar multiplication.
If F*G=H then f(j,j) * g(j,j) = h(j,j). This is not obvious, so let's look at the j-th row vector of F and the j-th column vector of G. Entries 1 to j-1 of the row vector are zero, and entries j+1 to n of the column vector are zero. This means their inner product has only one possibly non-zero term: f(j,j)*g(j,j). But h(j,j) is exactly that inner product, therefore f(j,j) * g(j,j) = h(j,j).
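A tiny numpy check of that product fact (random upper triangular matrices of a made-up size): the diagonal of F*G is the entrywise product of the diagonals of F and G.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 5
    F = np.triu(rng.standard_normal((n, n)))   # random upper triangular
    G = np.triu(rng.standard_normal((n, n)))
    H = F @ G

    print(np.allclose(np.diag(H), np.diag(F) * np.diag(G)))   # True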
With that established, we can conclude that the entries on the main diagonal of P(D) depend only on the entries on the main diagonal of D and on the coefficients of P(x). By composing addition, scalar multiplication and matrix multiplication, the entry P(D)(j,j) is a function of d(j,j) alone; specifically, P(D)(j,j) = P(d(j,j)).
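The same kind of numeric sketch suggests the claim P(D)(j,j) = P(d(j,j)), here with an arbitrary upper triangular matrix standing in for D (not the Jordan form of any particular A) and the exercise's polynomial:

    import numpy as np

    def P(X):
        # p(x) = x^7 + x^5 + x^3 + x - 1 at a square matrix X
        I = np.eye(X.shape[0])
        return (np.linalg.matrix_power(X, 7) + np.linalg.matrix_power(X, 5)
                + np.linalg.matrix_power(X, 3) + X - I)

    rng = np.random.default_rng(2)
    D = np.triu(rng.standard_normal((5, 5)))   # stand-in upper triangular D
    d = np.diag(D)

    print(np.allclose(np.diag(P(D)), d**7 + d**5 + d**3 + d - 1))   # True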
But P(D)=0, so all its diagonal entries are zero. Then P(d(j,j)) = 0, which means d(j,j) is a root of P(x). Also d(j,j) is an eigenvalue of D, and this holds for every j = 1 to n. So all the eigenvalues of D are among the roots of P(x).
But D is similar to A, and similar matrices have the same eigenvalues, so all the eigenvalues of A are among the roots of P(x). QED.
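To see the whole statement on one concrete example: the companion matrix C of p(x) = x^7 + x^5 + x^3 + x - 1 has p as its characteristic polynomial, so p(C) = 0 by Cayley-Hamilton. A quick numpy check (an illustration, not a proof) confirms that every eigenvalue of C is a root of p, and that det(C) is positive, as the exercise asserts for any such matrix.

    import numpy as np

    # coefficients of p ordered from low to high degree: -1 + x + x^3 + x^5 + x^7
    coeffs = [-1, 1, 0, 1, 0, 1, 0, 1]
    C = np.polynomial.polynomial.polycompanion(coeffs)

    eigs = np.linalg.eigvals(C)
    residuals = eigs**7 + eigs**5 + eigs**3 + eigs - 1
    print(np.max(np.abs(residuals)))   # ~0: every eigenvalue of C is a root of p
    print(np.linalg.det(C))            # 1.0 > 0, consistent with det(A) > 0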