
What is the log of the PDF for a Normal Distribution?



I am learning Maximum Likelihood Estimation.

Per this post, the log of the PDF for a Normal Distribution looks like this:

[image from the linked post: the log-density formula, "equation 1"]

Let's call this equation 1.

According to any probability theory textbook, the PDF of a Normal Distribution is

$$
\frac{1}{\sigma \sqrt{2\pi}}
e^{-\frac{(x - \mu)^2}{2\sigma^2}},
\qquad -\infty < x < \infty.
$$

Taking the log produces:

\begin{align}
\ln\left(\frac{1}{\sigma \sqrt{2\pi}}
e^{-\frac{(x - \mu)^2}{2\sigma^2}}\right) &=
\ln\left(\frac{1}{\sigma \sqrt{2\pi}}\right) + \ln\left(e^{-\frac{(x - \mu)^2}{2\sigma^2}}\right)\\
&= -\ln(\sigma) - \frac{1}{2}\ln(2\pi) - \frac{(x - \mu)^2}{2\sigma^2},
\end{align}

which is very different from equation 1.

Is equation 1 right? What am I missing?
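As a quick numerical sanity check of the derivation above (a minimal sketch, assuming NumPy and SciPy are available; the values of x, mu, and sigma below are arbitrary), the hand-derived expression agrees with scipy.stats.norm.logpdf:

    import numpy as np
    from scipy.stats import norm

    x, mu, sigma = 1.3, 0.5, 2.0

    # hand-derived log-density: -ln(sigma) - (1/2) ln(2*pi) - (x - mu)^2 / (2 sigma^2)
    by_hand = -np.log(sigma) - 0.5 * np.log(2 * np.pi) - (x - mu) ** 2 / (2 * sigma ** 2)

    # library value of the single-observation log-density
    reference = norm.logpdf(x, loc=mu, scale=sigma)

    print(by_hand, reference)             # both are approximately -1.6921
    assert np.isclose(by_hand, reference)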










probability log






asked 1 hour ago by shi95
  • Your first equation is the joint log-pdf of a sample of n iid normal random variables (AKA the log-likelihood of that sample). The second equation is the log-pdf of a single normal random variable.
    – Artem Mavrin, 1 hour ago

  • @ArtemMavrin, I think your comment would be a perfectly good answer if you expanded on it just a bit to make it slightly more clear.
    – StatsStudent, 1 hour ago












1 Answer
For a single observed value $x$ you have the log-likelihood:

$$\ell_x(\mu,\sigma^2) = -\ln \sigma - \frac{1}{2} \ln(2\pi) - \frac{1}{2} \Big( \frac{x-\mu}{\sigma} \Big)^2.$$

For a sample of observed values $\mathbf{x} = (x_1, \ldots, x_n)$ you then have:

$$\ell_\mathbf{x}(\mu,\sigma^2) = \sum_{i=1}^n \ell_{x_i}(\mu,\sigma^2) = -n \ln \sigma - \frac{n}{2} \ln(2\pi) - \frac{1}{2\sigma^2} \sum_{i=1}^n (x_i-\mu)^2.$$
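A quick numerical check of the two expressions above (a minimal sketch, assuming NumPy and SciPy are available; the sample and the candidate values of mu and sigma are arbitrary): summing scipy.stats.norm.logpdf over the observations reproduces the closed form.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    x = rng.normal(loc=1.0, scale=2.0, size=50)   # observed sample x_1, ..., x_n
    mu, sigma = 0.8, 1.9                          # candidate parameter values
    n = x.size

    # sum of the single-observation log-densities ell_{x_i}(mu, sigma^2)
    sum_of_logpdfs = norm.logpdf(x, loc=mu, scale=sigma).sum()

    # closed form: -n ln(sigma) - (n/2) ln(2*pi) - (1/(2 sigma^2)) * sum (x_i - mu)^2
    closed_form = (-n * np.log(sigma)
                   - 0.5 * n * np.log(2 * np.pi)
                   - np.sum((x - mu) ** 2) / (2 * sigma ** 2))

    assert np.isclose(sum_of_logpdfs, closed_form)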






answered 58 mins ago by Ben


























