Show that the characteristic polynomial is the same as the minimal polynomial


Let $$A = \begin{pmatrix} 0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a \end{pmatrix}.$$
I wish to show that the characteristic and minimal polynomials are the same.

I have already found via computation that the characteristic polynomial is $p_A(x)=x^3-ax^2-bx-c$, and I know that if I could show that the eigenspaces of $A$ all have dimension $1$, I'd be done.

The problem is that solving for the eigenvalues of this (very general) cubic is difficult (though possible), meaning it would be difficult to find bases for the eigenspaces.

A hint would be appreciated.










  • How about calculating $\det(xI-A)$? – Rodrigo Dias
  • @zz20s You wrote that you've found the minimal polynomial via computation. Did you mean the characteristic polynomial? – Theo Bendit
  • You said the minimal polynomial has degree $3$. – J. W. Tanner
  • Oh, yes, sorry, that should say characteristic. – zz20s















Tags: linear-algebra, eigenvalues-eigenvectors, jordan-normal-form






Asked by zz20s.
3 Answers



















Compute:
$$A^2 = \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2 \end{pmatrix}.$$
So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
$$A^2 = pA + qI \iff \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2 \end{pmatrix} = p\begin{pmatrix} 0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a \end{pmatrix} + q\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
$$1 = 0p + 0q,$$
which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic in $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and hence equal to the characteristic polynomial.

– Theo Bendit
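The independence argument can be sanity-checked symbolically. A minimal sketch using SymPy (my own check, not part of the original answer):

```python
import sympy as sp

a, b, c, x = sp.symbols('a b c x')
A = sp.Matrix([[0, 0, c],
               [1, 0, b],
               [0, 1, a]])

# Characteristic polynomial det(xI - A); should equal x^3 - a*x^2 - b*x - c.
char_poly = A.charpoly(x).as_expr()

# Flatten I, A, A^2 into columns of a 9x3 matrix; the answer's argument
# says these columns are linearly independent for every choice of a, b, c.
M = sp.Matrix.hstack(sp.eye(3).reshape(9, 1),
                     A.reshape(9, 1),
                     (A**2).reshape(9, 1))
rank = M.rank()  # 3, so no quadratic in A can vanish
```

The rank is $3$ no matter what $a,b,c$ are, because the first columns of $I$, $A$, $A^2$ are $e_1$, $e_2$, $e_3$ respectively.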






  • Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"? – zz20s
  • Nice solution. Your argument can be rewritten as: if $c_1A^2+c_2A+c_3I_3=0_3$, then looking at the first columns we get $$\begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix}=\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}.$$ – N. S.
  • @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$. – Theo Bendit
  • Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix? – zz20s
  • It's a method that should work every time, provided you can solve the equations. If you're given an $n \times n$ matrix $A$ whose minimal polynomial you wish to show equals its characteristic polynomial, then this is equivalent to showing $I, A, A^2, \ldots, A^{n-1}$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence can essentially be read off three entries (cf. N. S.'s comment). – Theo Bendit



















The form of $A$ has a special name: it is the companion matrix of the polynomial $p(x)=x^3-ax^2-bx-c$.

For the standard basis $e_1,e_2,e_3$, one finds that $Ae_1=e_2$ and $Ae_2=e_3$, so $e_1,Ae_1,A^2e_1$ forms a basis.

The general context is the companion $n\times n$ matrix of the polynomial $$p(x)=x^n-c_{n-1}x^{n-1}-\cdots-c_1x-c_0.$$ A vector $v$ is said to be a cyclic vector for $A$ if the iterates of $v$ under $A$ form a basis for $\mathbb{R}^n$. As others point out, this suffices to show that the minimal polynomial is the same as the characteristic polynomial.
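To make the companion-matrix construction concrete, here is a small numerical sketch (my own illustration using NumPy; the helper names `companion` and `is_cyclic` are hypothetical, and the sign convention matches this thread's $p(x)=x^n-c_{n-1}x^{n-1}-\cdots-c_0$):

```python
import numpy as np

def companion(coeffs):
    """Companion matrix, in this thread's convention, of
    p(x) = x^n - c_{n-1} x^{n-1} - ... - c_1 x - c_0,
    given coeffs = [c_0, c_1, ..., c_{n-1}]."""
    n = len(coeffs)
    A = np.zeros((n, n))
    A[1:, :-1] = np.eye(n - 1)   # subdiagonal of ones: A e_k = e_{k+1}
    A[:, -1] = coeffs            # last column holds c_0, ..., c_{n-1}
    return A

def is_cyclic(A, v):
    """Check whether v, Av, ..., A^{n-1} v form a basis."""
    n = A.shape[0]
    K = np.column_stack([np.linalg.matrix_power(A, k) @ v for k in range(n)])
    return np.linalg.matrix_rank(K) == n
```

For the $3\times 3$ case, `companion([c, b, a])` reproduces the matrix $A$ of the question, and $e_1$ is a cyclic vector.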







Assuming you already know from Cayley–Hamilton that $p_A(A) = O_{3\times 3}$, you can also proceed as follows:

    • Let $e_1, e_2, e_3$ denote the canonical basis $\Rightarrow Ae_1=e_2,\ Ae_2 = e_3 \Rightarrow A^2e_1 = e_3$.

Now, assume there is a polynomial $m(x)=x^2+ux+v$ such that $m(A) = O_{3\times 3}$.

Applying $m(A)$ to $e_1$ gives
$$m(A)e_1 = A^2e_1 + uAe_1 + ve_1 = e_3 + ue_2 + ve_1 = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \quad \mbox{Contradiction!}$$
The linear combination cannot result in the zero vector, as the coefficient of the basis vector $e_3$ is $1$.
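The computation $m(A)e_1 = e_3 + ue_2 + ve_1$ can be verified symbolically; a quick sketch with SymPy (my own check, not from the answer):

```python
import sympy as sp

a, b, c, u, v = sp.symbols('a b c u v')
A = sp.Matrix([[0, 0, c],
               [1, 0, b],
               [0, 1, a]])
e1 = sp.Matrix([1, 0, 0])

# m(A) e1 for m(x) = x^2 + u*x + v: the result is (v, u, 1)^T,
# whose third entry is 1 regardless of u, v -- never the zero vector.
w = A**2 * e1 + u * (A * e1) + v * e1
```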






    share|cite|improve this answer









    $endgroup$













      Your Answer








      StackExchange.ready(function()
      var channelOptions =
      tags: "".split(" "),
      id: "69"
      ;
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function()
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled)
      StackExchange.using("snippets", function()
      createEditor();
      );

      else
      createEditor();

      );

      function createEditor()
      StackExchange.prepareEditor(
      heartbeatType: 'answer',
      autoActivateHeartbeat: false,
      convertImagesToLinks: true,
      noModals: true,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: 10,
      bindNavPrevention: true,
      postfix: "",
      imageUploader:
      brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
      contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
      allowUrls: true
      ,
      noCode: true, onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      );



      );













      draft saved

      draft discarded


















      StackExchange.ready(
      function ()
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3229044%2fshow-that-the-characteristic-polynomial-is-the-same-as-the-minimal-polynomial%23new-answer', 'question_page');

      );

      Post as a guest















      Required, but never shown

























      3 Answers
      3






      active

      oldest

      votes








      3 Answers
      3






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      4












      $begingroup$

      Compute:
      $$A^2 = beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix.$$
      So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
      $$A^2 = pA + qI iff beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix = pbeginpmatrix 0 & 0 & c \ 1 & 0 & b \ 0 & 1 & cendpmatrix + qbeginpmatrix 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1endpmatrix$$
      for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
      $$1 = 0p + 0q,$$
      which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and equal to the characteristic polynomial






      share|cite|improve this answer









      $endgroup$












      • $begingroup$
        Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
        $endgroup$
        – zz20s
        1 hour ago






      • 1




        $begingroup$
        Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
        $endgroup$
        – N. S.
        1 hour ago










      • $begingroup$
        @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
        $endgroup$
        – Theo Bendit
        1 hour ago











      • $begingroup$
        Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
        $endgroup$
        – zz20s
        1 hour ago










      • $begingroup$
        It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
        $endgroup$
        – Theo Bendit
        1 hour ago















      4












      $begingroup$

      Compute:
      $$A^2 = beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix.$$
      So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
      $$A^2 = pA + qI iff beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix = pbeginpmatrix 0 & 0 & c \ 1 & 0 & b \ 0 & 1 & cendpmatrix + qbeginpmatrix 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1endpmatrix$$
      for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
      $$1 = 0p + 0q,$$
      which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and equal to the characteristic polynomial






      share|cite|improve this answer









      $endgroup$












      • $begingroup$
        Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
        $endgroup$
        – zz20s
        1 hour ago






      • 1




        $begingroup$
        Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
        $endgroup$
        – N. S.
        1 hour ago










      • $begingroup$
        @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
        $endgroup$
        – Theo Bendit
        1 hour ago











      • $begingroup$
        Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
        $endgroup$
        – zz20s
        1 hour ago










      • $begingroup$
        It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
        $endgroup$
        – Theo Bendit
        1 hour ago













      4












      4








      4





      $begingroup$

      Compute:
      $$A^2 = beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix.$$
      So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
      $$A^2 = pA + qI iff beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix = pbeginpmatrix 0 & 0 & c \ 1 & 0 & b \ 0 & 1 & cendpmatrix + qbeginpmatrix 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1endpmatrix$$
      for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
      $$1 = 0p + 0q,$$
      which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and equal to the characteristic polynomial






      share|cite|improve this answer









      $endgroup$



      Compute:
      $$A^2 = beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix.$$
      So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
      $$A^2 = pA + qI iff beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix = pbeginpmatrix 0 & 0 & c \ 1 & 0 & b \ 0 & 1 & cendpmatrix + qbeginpmatrix 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1endpmatrix$$
      for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
      $$1 = 0p + 0q,$$
      which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and equal to the characteristic polynomial







      share|cite|improve this answer












      share|cite|improve this answer



      share|cite|improve this answer










      answered 2 hours ago









      Theo BenditTheo Bendit

      22.4k12358




      22.4k12358











      • $begingroup$
        Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
        $endgroup$
        – zz20s
        1 hour ago






      • 1




        $begingroup$
        Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
        $endgroup$
        – N. S.
        1 hour ago










      • $begingroup$
        @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
        $endgroup$
        – Theo Bendit
        1 hour ago











      • $begingroup$
        Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
        $endgroup$
        – zz20s
        1 hour ago










      • $begingroup$
        It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
        $endgroup$
        – Theo Bendit
        1 hour ago
















      • $begingroup$
        Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
        $endgroup$
        – zz20s
        1 hour ago






      • 1




        $begingroup$
        Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
        $endgroup$
        – N. S.
        1 hour ago










      • $begingroup$
        @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
        $endgroup$
        – Theo Bendit
        1 hour ago











      • $begingroup$
        Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
        $endgroup$
        – zz20s
        1 hour ago










      • $begingroup$
        It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
        $endgroup$
        – Theo Bendit
        1 hour ago















      $begingroup$
      Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
      $endgroup$
      – zz20s
      1 hour ago




      $begingroup$
      Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
      $endgroup$
      – zz20s
      1 hour ago




      1




      1




      $begingroup$
      Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
      $endgroup$
      – N. S.
      1 hour ago




      $begingroup$
      Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
      $endgroup$
      – N. S.
      1 hour ago












      $begingroup$
      @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
      $endgroup$
      – Theo Bendit
      1 hour ago





      $begingroup$
      @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
      $endgroup$
      – Theo Bendit
      1 hour ago













      $begingroup$
      Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
      $endgroup$
      – zz20s
      1 hour ago




      $begingroup$
      Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
      $endgroup$
      – zz20s
      1 hour ago












      $begingroup$
      It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
      $endgroup$
      – Theo Bendit
      1 hour ago




      $begingroup$
      It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
      $endgroup$
      – Theo Bendit
      1 hour ago











      2












      $begingroup$

      The form of $A$ has a special name: the companion matrix of the polynomial $p(x)=x^3-ax^2-bx-c$.



      For the standard basis $e_1,e_2,e_3$, one finds that $Ae_1=e_2$, $Ae_2=e_3$, so $e_1,Ae_1,A^2e_1$ forms a basis.



      The general context is the companion $ntimes n$ matrix of the polynomial $$p(x)=x^n-c_n-1x^n-1-cdots-c_1x-c_0.$$ A vector $v$ is said to be a cyclic vector for $A$ if the iterates by $A$ of $v$ for a basis for $R^n$. As others point out, this suffices to show that the minimal polynomial is the same as the characteristic polynomial.






      share|cite|improve this answer









      $endgroup$










































          answered 41 mins ago









          user52817

          1292

























              1












              $\begingroup$

              Assuming you already know from Cayley-Hamilton that $p_A(A) = O_{3\times 3}$, you can also proceed as follows:

              • Let $e_1, e_2, e_3$ denote the canonical basis $\Rightarrow Ae_1=e_2,\ Ae_2 = e_3 \Rightarrow A^2e_1 = e_3$.

              Now, assume there is a polynomial $m(x)=x^2+ux+v$ such that $m(A) = O_{3\times 3}$.

              Applying $m(A)$ to $e_1$ gives
              $$m(A)e_1 = A^2e_1 + uAe_1 + ve_1 = e_3 + ue_2 + ve_1 = \begin{pmatrix}0 \\ 0 \\ 0\end{pmatrix} \quad \mbox{Contradiction!}$$
              The linear combination cannot be the zero vector, as the coefficient of the basis vector $e_3$ is $1$.
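The contradiction can also be observed numerically: for any $u, v$, the vector $m(A)e_1 = A^2e_1 + uAe_1 + ve_1$ has third coordinate $1$, so it can never vanish. A minimal sketch using the same illustrative companion matrix (coefficients $3, 1, 2$ are assumed sample values):

```python
import numpy as np

# Illustrative companion matrix with A e1 = e2 and A e2 = e3.
A = np.array([[0.0, 0.0, 2.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 3.0]])
e1 = np.array([1.0, 0.0, 0.0])

# m(A) e1 = A^2 e1 + u*A e1 + v*e1 = (v, u, 1): its e3-coordinate is
# always 1, so no quadratic m can annihilate A.
for u, v in [(0.0, 0.0), (-3.0, 5.0), (7.0, -2.0)]:
    w = A @ A @ e1 + u * (A @ e1) + v * e1
    assert w[2] == 1.0  # coefficient of e3 is always 1
print("no quadratic polynomial annihilates A")
```

The same loop with a degree-1 polynomial $x + v$ fails for the analogous reason: $Ae_1 + ve_1 = (v, 1, 0)$ has second coordinate $1$.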















              $endgroup$










































                  answered 49 mins ago









                  trancelocation

                  15.1k1929






























