Show that the characteristic polynomial is the same as the minimal polynomial
Let $$A =\begin{pmatrix}0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a\end{pmatrix}.$$
I wish to show that the characteristic and minimal polynomials of $A$ are the same.
I have already found by computation that the characteristic polynomial is $p_A(x)=x^3-ax^2-bx-c$, and I know that if I could show that the eigenspaces of $A$ all have dimension $1$, I'd be done.
The problem is that solving for the eigenvalues of this (very general) cubic is difficult (though possible), so it would be difficult to find bases for the eigenspaces.
A hint would be appreciated.
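(Not part of the original question, but the characteristic polynomial claimed above is easy to confirm symbolically. A minimal sketch, using sympy as my choice of tool:)

```python
import sympy as sp

a, b, c, x = sp.symbols('a b c x')
A = sp.Matrix([[0, 0, c],
               [1, 0, b],
               [0, 1, a]])

# det(xI - A) expands to x^3 - a*x^2 - b*x - c, matching the question
p = (x * sp.eye(3) - A).det().expand()
print(p)
```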
linear-algebra eigenvalues-eigenvectors jordan-normal-form
$begingroup$
Let $$A =beginpmatrix0 & 0 & c \1 & 0 & b \ 0& 1 & aendpmatrix$$
I wish to show that the characteristic and minimal polynomials are the same.
I have already found via computation that the characteristic polynomial is $p_A(x)=x^3-ax^2-bx-c$, and I know from here that if I could show that the eigenspaces of $A$ all have dimension $1$, I'd be done.
The problem is that, solving for the eigenvalues of this (very general) cubic equation is difficult (though, possible), meaning it would be difficult to find bases for the eigenspaces.
A hint would be appreciated.
linear-algebra eigenvalues-eigenvectors jordan-normal-form
$endgroup$
– Rodrigo Dias (2 hours ago): How about calculating $\det(xI-A)$?

– Theo Bendit (2 hours ago): @zz20s You wrote that you've found the minimal polynomial via computation. Did you mean the characteristic polynomial?

– J. W. Tanner (2 hours ago): You said the minimal polynomial has degree $3$.

– zz20s (2 hours ago): Oh, yes, sorry, that should say characteristic.
asked 2 hours ago by zz20s (edited 2 hours ago)
3 Answers
Compute:
$$A^2 = \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2\end{pmatrix}.$$
So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution in $p$ and $q$ to the equation
$$A^2 = pA + qI \iff \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2\end{pmatrix} = p\begin{pmatrix} 0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a\end{pmatrix} + q\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{pmatrix}.$$
In particular, if you examine the entry in the bottom row, left column, you get
$$1 = 0p + 0q,$$
which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic in $A$ equals the $0$ matrix. Thus the minimal polynomial must be (at least) a cubic, and therefore equal to the characteristic polynomial.

answered 2 hours ago by Theo Bendit
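(A quick sanity check of this independence argument, my own sketch rather than part of the answer: stack $I$, $A$, $A^2$ as rows of nine entries each and confirm the rank is $3$, i.e. no quadratic polynomial annihilates $A$.)

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
A = sp.Matrix([[0, 0, c], [1, 0, b], [0, 1, a]])

# Flatten I, A, A^2 into the rows of a 3x9 matrix; rank 3 means they are
# linearly independent, so no quadratic in A can be the zero matrix.
# (The pivots land on constant entries, so the rank is 3 for every a, b, c.)
M = sp.Matrix([list(sp.eye(3)), list(A), list(A**2)])
print(M.rank())
```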
– zz20s (1 hour ago): Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?

– N. S. (1 hour ago): Nice solution. Your argument can be rewritten as: if $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns, we get $$\begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix}=\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}.$$

– Theo Bendit (1 hour ago): @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there is no polynomial $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.

– zz20s (1 hour ago): Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?

– Theo Bendit (1 hour ago): It's a method that should work every time, provided you can solve the equations. If you're given an $n \times n$ matrix $A$ whose minimal polynomial you wish to show equals its characteristic polynomial, this is equivalent to showing $I, A, A^2, \ldots, A^{n-1}$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence can essentially be read off three entries (cf. N. S.'s comment).
The form of $A$ has a special name: it is the companion matrix of the polynomial $p(x)=x^3-ax^2-bx-c$.
For the standard basis $e_1,e_2,e_3$, one finds that $Ae_1=e_2$ and $Ae_2=e_3$, so $e_1,Ae_1,A^2e_1$ forms a basis.
The general context is the companion $n\times n$ matrix of the polynomial $$p(x)=x^n-c_{n-1}x^{n-1}-\cdots-c_1x-c_0.$$ A vector $v$ is said to be a cyclic vector for $A$ if the iterates of $v$ under $A$ form a basis for $\mathbb{R}^n$. As others point out, this suffices to show that the minimal polynomial is the same as the characteristic polynomial.

answered 41 mins ago by user52817
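(The cyclic-vector observation can be checked directly; this is my own sketch, not part of the answer. The matrix with columns $e_1$, $Ae_1$, $A^2e_1$ has determinant $1$ for every $a, b, c$, so $e_1$ is always a cyclic vector:)

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
A = sp.Matrix([[0, 0, c], [1, 0, b], [0, 1, a]])
e1 = sp.Matrix([1, 0, 0])

# Columns e1, A e1, A^2 e1; a nonzero determinant means they form a basis
B = sp.Matrix.hstack(e1, A * e1, A * A * e1)
print(B.det())
```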
Assuming you already know from Cayley-Hamilton that $p_A(A) = O_{3\times 3}$, you can also proceed as follows:

Let $e_1, e_2, e_3$ denote the canonical basis $\Rightarrow Ae_1=e_2,\ Ae_2 = e_3 \Rightarrow A^2e_1 = e_3$.
Now, assume there is a polynomial $m(x)=x^2+ux+v$ such that $m(A) = O_{3\times 3}$.
Applying $m(A)$ to $e_1$ gives
$$m(A)e_1 = A^2e_1 + uAe_1 + ve_1 = e_3 + ue_2 + ve_1 = \begin{pmatrix}0 \\ 0 \\ 0\end{pmatrix}. \quad \text{Contradiction!}$$
The linear combination cannot yield the zero vector, as the coefficient of the basis vector $e_3$ is $1$.

answered 49 mins ago by trancelocation
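(The contradiction above can also be seen symbolically; my own sketch, not from the answer. For arbitrary coefficients $u, v$, the vector $m(A)e_1$ comes out as $(v, u, 1)^T$, whose last coordinate is $1$, so it is never zero:)

```python
import sympy as sp

a, b, c, u, v = sp.symbols('a b c u v')
A = sp.Matrix([[0, 0, c], [1, 0, b], [0, 1, a]])
e1 = sp.Matrix([1, 0, 0])

# m(A) e1 = A^2 e1 + u A e1 + v e1 = (v, u, 1)^T: never the zero vector
w = A * A * e1 + u * (A * e1) + v * e1
print(w.T)
```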