Absolutely wonderful numerical phenomenon. Who can explain?
(Mathematics Stack Exchange, question 3355435)




I was doing some software engineering and wanted a thread to do something in the background, basically just wasting CPU time for a certain test. While I could have done something really boring like `for (i = 0; i < 10000000; i++) j = 2 * i`, I ended up having the program start with $1$ and then, for a million steps, choose a random real number $r$ uniformly distributed in the interval $[0, R]$ and multiply the running result by $r$ at each step. When $R = 2$, it converged to $0$. When $R = 3$, it exploded to infinity. So of course, the question anyone with a modicum of curiosity would ask: for what $R$ does the transition happen? I then tried the first number between $2$ and $3$ that we would all think of, Euler's number $e$, and sure enough, the conjecture was right. I would love to see a proof of this.



Now, when I should be working, I'm instead wondering about the behavior of this script. Ironically, rather than wasting my CPU's time, I'm wasting my own. But it's a beautiful phenomenon, and I don't regret it. :)
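For reference, a minimal Python sketch of the experiment described above (the original code isn't shown, so the loop shape, step count, and function name here are assumptions). Working on a log scale avoids the floating-point underflow and overflow a direct product would hit:

```python
import math
import random

def log_random_product(R, steps=100_000, seed=0):
    """Start at 1 and multiply by `steps` i.i.d. Uniform(0, R) draws.

    Returns the natural log of the final product, which sidesteps the
    underflow (R = 2) / overflow (R = 3) of the product itself.
    """
    rng = random.Random(seed)
    return sum(math.log(rng.uniform(0.0, R)) for _ in range(steps))

# Mirrors the observation in the text: the product collapses toward 0
# for R = 2 (log strongly negative) and explodes for R = 3.
```

With `R = 2` the returned log is on the order of $-3 \times 10^4$, and with `R = 3` it is on the order of $+10^4$, matching the converge/explode behavior observed.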










  • littleO (8 hours ago): If the threshold really is $e$, I'm ready for my mind to be blown.

















Tags: probability, stochastic-processes






asked 8 hours ago by Jake Mirra; edited 8 hours ago by Sil










2 Answers
Answer (score 6), answered 8 hours ago by Aaron Montgomery (edited 7 hours ago):

EDIT: I saw that you solved it yourself. Congrats! I'm posting this anyway because I was most of the way through typing it when your answer hit. :)



Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.



Let $X_i \sim \operatorname{Uniform}(0, r)$, and let $Y_n = \prod_{i=1}^n X_i$. Note that $\log(Y_n) = \sum_{i=1}^n \log(X_i)$. The eventual emergence of $e$ as important is already somewhat clear, even though we haven't really done anything yet.



The more useful formulation here is that $\frac{\log(Y_n)}{n} = \frac{1}{n} \sum \log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $\mathbb{E}[\log(X_i)]$. We have
$$\mathbb{E}[\log(X_i)] = \int_0^r \log(x) \cdot \frac{1}{r} \, \mathrm{d}x = \frac{1}{r} \big[ x \log(x) - x \big] \Big|_0^r = \log(r) - 1.$$



If $r < e$, then $\log(Y_n)/n \to c < 0$, which implies that $\log(Y_n) \to -\infty$, hence $Y_n \to 0$. Similarly, if $r > e$, then $\log(Y_n)/n \to c > 0$, whence $Y_n \to \infty$. The fun case is: what happens when $r = e$?
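The SLLN limit can be checked with a quick Monte Carlo sketch (the helper name is my own): the running average of $\log(X_i)$ should settle near $\log(r) - 1$ for any $r$.

```python
import math
import random

def mean_log_uniform(r, n=200_000, seed=1):
    """Monte Carlo estimate of E[log X] for X ~ Uniform(0, r)."""
    rng = random.Random(seed)
    return sum(math.log(rng.uniform(0.0, r)) for _ in range(n)) / n

# Compare against the closed form log(r) - 1 derived above; the
# estimate sits within a few standard errors (about 1/sqrt(n), since
# Var(log X) = 1 for every r).
for r in (2.0, math.e, 3.0):
    print(r, mean_log_uniform(r), math.log(r) - 1)
```

In particular, at $r = e$ the estimated mean hovers near $0$, which is exactly the knife-edge between collapse and explosion.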






  • Jake Mirra (8 hours ago): I accepted your answer, as it is an excellent explanation! Thank you for taking the time!

  • Jake Mirra (7 hours ago): Was thinking a bit about your question of "what happens when $r = e$", and all I can say is that, once you look at it on a logarithmic scale, it's a weird, sort of lopsided random walk through the reals where you sometimes take giant steps backwards and then lots of small steps forward.

  • Aaron Montgomery (7 hours ago): Yep! And you can convince yourself that even though those increments are unbounded (on the negative side), they still have a finite variance...

  • antkam (5 hours ago): @AaronMontgomery - what happens when $r = e$? I am not good with the details of probability theory. Does $Y_n$ converge (to $1$) or does it not converge? And what has the finite variance (of $\log X_i$) got to do with it? Intuitively I would guess the sequence does not converge, but your mention of finite variance seems to hint that it would...

  • Aaron Montgomery (5 hours ago): When $r = e$, the fact that we are taking an average of the $\log(X_i)$ variables (which have finite variance) means that we can use the Central Limit Theorem to proceed. This implies that $\sqrt{n}\,\overline{X}$ converges (in distribution only, NOT almost surely) to a normal variable with mean $0$ and variance $\sigma^2$ (i.e. the variance of $\log(X_i)$), so $\log(Y_n)/\sqrt{n}$ does the same. Consequently, $Y_n$ just becomes diffuse, and on individual realizations it will wander, much like an ordinary random walk will do.
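One way to see why the CLT applies with the same scaling for every $r$: writing $X = rU$ with $U \sim \operatorname{Uniform}(0,1)$ gives $\log X = \log r + \log U$, so $\operatorname{Var}(\log X) = \operatorname{Var}(\log U) = 1$ regardless of $r$. A sketch checking this numerically (the function name is hypothetical):

```python
import math
import random

def var_log_uniform(r, n=200_000, seed=2):
    """Monte Carlo estimate of Var(log X) for X ~ Uniform(0, r)."""
    rng = random.Random(seed)
    logs = [math.log(rng.uniform(0.0, r)) for _ in range(n)]
    mean = sum(logs) / n
    return sum((v - mean) ** 2 for v in logs) / n

# The estimate is close to 1 for r = 2, e, or 3 alike: shifting by
# log(r) moves the mean of log X but leaves its variance untouched.
```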


















Answer (score 5), answered 8 hours ago by Jake Mirra:

I found the answer! One starts with the uniform distribution on $[0, R]$. The natural logarithm pushes this distribution forward to a distribution on $(-\infty, \ln(R)]$ with density function $p(y) = e^y / R$, $y \in (-\infty, \ln(R)]$. The expected value of this distribution is $\int_{-\infty}^{\ln(R)} \frac{y\, e^y}{R}\, dy = \ln(R) - 1$. Setting this to zero gives the answer to the riddle! Love it!
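The closed form can be sanity-checked by numerically integrating the pushforward density; the truncation point and step count below are arbitrary choices (the $e^y$ factor makes the lower tail negligible), and the helper name is hypothetical:

```python
import math

def pushforward_mean(R, lower=-50.0, steps=200_000):
    """Trapezoidal estimate of the mean  integral of y * e^y / R
    over (-inf, ln R], truncating the exponentially small tail
    below `lower`."""
    upper = math.log(R)
    h = (upper - lower) / steps
    total = 0.0
    for i in range(steps + 1):
        y = lower + i * h
        weight = 0.5 if i in (0, steps) else 1.0  # trapezoid endpoints
        total += weight * y * math.exp(y) / R
    return total * h

# Agrees with ln(R) - 1, which crosses zero exactly at R = e.
```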






share|cite|improve this answer









$endgroup$

















    Your Answer








    StackExchange.ready(function()
    var channelOptions =
    tags: "".split(" "),
    id: "69"
    ;
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function()
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled)
    StackExchange.using("snippets", function()
    createEditor();
    );

    else
    createEditor();

    );

    function createEditor()
    StackExchange.prepareEditor(
    heartbeatType: 'answer',
    autoActivateHeartbeat: false,
    convertImagesToLinks: true,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: 10,
    bindNavPrevention: true,
    postfix: "",
    imageUploader:
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/4.0/"u003ecc by-sa 4.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    ,
    noCode: true, onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    );



    );














    draft saved

    draft discarded
















    StackExchange.ready(
    function ()
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3355435%2fabsolutely-wonderful-numerical-phenomenon-who-can-explain%23new-answer', 'question_page');

    );

    Post as a guest















    Required, but never shown

























    2 Answers
    2






    active

    oldest

    votes








    2 Answers
    2






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    6














    $begingroup$

    EDIT: I saw that you solved it yourself. Congrats! I'm posting this anyway because I was most of the way through typing it when your answer hit. :)



    Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.



    Let $X_i sim operatornameUniform(0, r)$, and let $Y_n = prod_i=1^n X_i$. Note that $log(Y_n) = sum_i=1^n log(X_i)$. The eventual emergence of $e$ as important is already somewhat clear, even though we haven't really done anything yet.



    The more useful formulation here is that $fraclog(Y_n)n = frac 1 n sum log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $mathbb E[log(X_i)]$. We have
    $$mathbb E log(X_i) = int_0^r log(x) cdot frac 1 r , textrm d x = frac 1 r [x log(x) - x] bigg|_0^r = log(r) - 1.$$



    If $r < e$, then $log(Y_n) / n to c < 0$, which implies that $log(Y_n) to -infty$, hence $Y_n to 0$. Similarly, if $r > e$, then $log(Y_n) / n to c > 0$, whence $Y_n to infty$. The fun case is: what happens when $r = e$?






    share|cite|improve this answer











    $endgroup$










    • 1




      $begingroup$
      I accepted your answer, as it is an excellent explanation! Thank you for taking the time!
      $endgroup$
      – Jake Mirra
      8 hours ago










    • $begingroup$
      Was thinking a bit about your question of "what happens when r = e", and all I can say is that, once you look at it on a logarithmic scale, it's a weird, sort of lopsided random walk through the reals where you sometimes take giant steps backwards and then lots of small steps forward.
      $endgroup$
      – Jake Mirra
      7 hours ago










    • $begingroup$
      Yep! And you can convince yourself that even though those increments are unbounded (on the negative side), they still have a finite variance...
      $endgroup$
      – Aaron Montgomery
      7 hours ago










    • $begingroup$
      @AaronMontgomery - what happens when $r=e$? I am not good with the details of probability theory. Does $Y_n$ converge (to $1$) or does it not converge? And what has the finite variance (of $log X_i$) got to do with it? Intuitively I would guess the sequence does not converge, but your mention of finite variance seems to hint that it would...
      $endgroup$
      – antkam
      5 hours ago











    • $begingroup$
      When $r = e$, the fact that we are taking an average of the $log(X_i)$ variables (which have finite variance) means that we can use the Central Limit Theorem to proceed. This implies that $sqrt n overline X$ converges (in distribution only, NOT almost surely) to a normal variable with mean $0$ and variance $sigma^2$ (i.e. the variance of $log(X_i)$), so $log(Y_n)/sqrt n$ does the same. Consequently, $Y_n$ just becomes diffuse, and on individual realizations it will wander, much like an ordinary random walk will do.
      $endgroup$
      – Aaron Montgomery
      5 hours ago















    6














    $begingroup$

    EDIT: I saw that you solved it yourself. Congrats! I'm posting this anyway because I was most of the way through typing it when your answer hit. :)



    Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.



    Let $X_i sim operatornameUniform(0, r)$, and let $Y_n = prod_i=1^n X_i$. Note that $log(Y_n) = sum_i=1^n log(X_i)$. The eventual emergence of $e$ as important is already somewhat clear, even though we haven't really done anything yet.



    The more useful formulation here is that $fraclog(Y_n)n = frac 1 n sum log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $mathbb E[log(X_i)]$. We have
    $$mathbb E log(X_i) = int_0^r log(x) cdot frac 1 r , textrm d x = frac 1 r [x log(x) - x] bigg|_0^r = log(r) - 1.$$



    If $r < e$, then $log(Y_n) / n to c < 0$, which implies that $log(Y_n) to -infty$, hence $Y_n to 0$. Similarly, if $r > e$, then $log(Y_n) / n to c > 0$, whence $Y_n to infty$. The fun case is: what happens when $r = e$?






    share|cite|improve this answer











    $endgroup$










    • 1




      $begingroup$
      I accepted your answer, as it is an excellent explanation! Thank you for taking the time!
      $endgroup$
      – Jake Mirra
      8 hours ago










    • $begingroup$
      Was thinking a bit about your question of "what happens when r = e", and all I can say is that, once you look at it on a logarithmic scale, it's a weird, sort of lopsided random walk through the reals where you sometimes take giant steps backwards and then lots of small steps forward.
      $endgroup$
      – Jake Mirra
      7 hours ago










    • $begingroup$
      Yep! And you can convince yourself that even though those increments are unbounded (on the negative side), they still have a finite variance...
      $endgroup$
      – Aaron Montgomery
      7 hours ago










    • $begingroup$
      @AaronMontgomery - what happens when $r=e$? I am not good with the details of probability theory. Does $Y_n$ converge (to $1$) or does it not converge? And what has the finite variance (of $log X_i$) got to do with it? Intuitively I would guess the sequence does not converge, but your mention of finite variance seems to hint that it would...
      $endgroup$
      – antkam
      5 hours ago











    • $begingroup$
      When $r = e$, the fact that we are taking an average of the $log(X_i)$ variables (which have finite variance) means that we can use the Central Limit Theorem to proceed. This implies that $sqrt n overline X$ converges (in distribution only, NOT almost surely) to a normal variable with mean $0$ and variance $sigma^2$ (i.e. the variance of $log(X_i)$), so $log(Y_n)/sqrt n$ does the same. Consequently, $Y_n$ just becomes diffuse, and on individual realizations it will wander, much like an ordinary random walk will do.
      $endgroup$
      – Aaron Montgomery
      5 hours ago













    6














    6










    6







    $begingroup$

    EDIT: I saw that you solved it yourself. Congrats! I'm posting this anyway because I was most of the way through typing it when your answer hit. :)



    Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.



    Let $X_i sim operatornameUniform(0, r)$, and let $Y_n = prod_i=1^n X_i$. Note that $log(Y_n) = sum_i=1^n log(X_i)$. The eventual emergence of $e$ as important is already somewhat clear, even though we haven't really done anything yet.



    The more useful formulation here is that $fraclog(Y_n)n = frac 1 n sum log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $mathbb E[log(X_i)]$. We have
    $$mathbb E log(X_i) = int_0^r log(x) cdot frac 1 r , textrm d x = frac 1 r [x log(x) - x] bigg|_0^r = log(r) - 1.$$



    If $r < e$, then $log(Y_n) / n to c < 0$, which implies that $log(Y_n) to -infty$, hence $Y_n to 0$. Similarly, if $r > e$, then $log(Y_n) / n to c > 0$, whence $Y_n to infty$. The fun case is: what happens when $r = e$?






    share|cite|improve this answer











    $endgroup$



    EDIT: I saw that you solved it yourself. Congrats! I'm posting this anyway because I was most of the way through typing it when your answer hit. :)



    Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.



    Let $X_i sim operatornameUniform(0, r)$, and let $Y_n = prod_i=1^n X_i$. Note that $log(Y_n) = sum_i=1^n log(X_i)$. The eventual emergence of $e$ as important is already somewhat clear, even though we haven't really done anything yet.



    The more useful formulation here is that $fraclog(Y_n)n = frac 1 n sum log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $mathbb E[log(X_i)]$. We have
    $$mathbb E log(X_i) = int_0^r log(x) cdot frac 1 r , textrm d x = frac 1 r [x log(x) - x] bigg|_0^r = log(r) - 1.$$



    If $r < e$, then $log(Y_n) / n to c < 0$, which implies that $log(Y_n) to -infty$, hence $Y_n to 0$. Similarly, if $r > e$, then $log(Y_n) / n to c > 0$, whence $Y_n to infty$. The fun case is: what happens when $r = e$?







    share|cite|improve this answer














    share|cite|improve this answer



    share|cite|improve this answer








    edited 7 hours ago

























    answered 8 hours ago









    Aaron MontgomeryAaron Montgomery

    5,0825 silver badges23 bronze badges




    5,0825 silver badges23 bronze badges










    • 1




      $begingroup$
      I accepted your answer, as it is an excellent explanation! Thank you for taking the time!
      $endgroup$
      – Jake Mirra
      8 hours ago










    • $begingroup$
      Was thinking a bit about your question of "what happens when r = e", and all I can say is that, once you look at it on a logarithmic scale, it's a weird, sort of lopsided random walk through the reals where you sometimes take giant steps backwards and then lots of small steps forward.
      $endgroup$
      – Jake Mirra
      7 hours ago










    • $begingroup$
      Yep! And you can convince yourself that even though those increments are unbounded (on the negative side), they still have a finite variance...
      $endgroup$
      – Aaron Montgomery
      7 hours ago










    • $begingroup$
      @AaronMontgomery - what happens when $r=e$? I am not good with the details of probability theory. Does $Y_n$ converge (to $1$) or does it not converge? And what has the finite variance (of $log X_i$) got to do with it? Intuitively I would guess the sequence does not converge, but your mention of finite variance seems to hint that it would...
      $endgroup$
      – antkam
      5 hours ago











    • $begingroup$
      When $r = e$, the fact that we are taking an average of the $log(X_i)$ variables (which have finite variance) means that we can use the Central Limit Theorem to proceed. This implies that $sqrt n overline X$ converges (in distribution only, NOT almost surely) to a normal variable with mean $0$ and variance $sigma^2$ (i.e. the variance of $log(X_i)$), so $log(Y_n)/sqrt n$ does the same. Consequently, $Y_n$ just becomes diffuse, and on individual realizations it will wander, much like an ordinary random walk will do.
      $endgroup$
      – Aaron Montgomery
      5 hours ago












    • 1




      $begingroup$
      I accepted your answer, as it is an excellent explanation! Thank you for taking the time!
      $endgroup$
      – Jake Mirra
      8 hours ago










    • $begingroup$
      Was thinking a bit about your question of "what happens when r = e", and all I can say is that, once you look at it on a logarithmic scale, it's a weird, sort of lopsided random walk through the reals where you sometimes take giant steps backwards and then lots of small steps forward.
      $endgroup$
      – Jake Mirra
      7 hours ago










    • $begingroup$
      Yep! And you can convince yourself that even though those increments are unbounded (on the negative side), they still have a finite variance...
      $endgroup$
      – Aaron Montgomery
      7 hours ago










    • $begingroup$
      @AaronMontgomery - what happens when $r=e$? I am not good with the details of probability theory. Does $Y_n$ converge (to $1$) or does it not converge? And what has the finite variance (of $log X_i$) got to do with it? Intuitively I would guess the sequence does not converge, but your mention of finite variance seems to hint that it would...
      $endgroup$
      – antkam
      5 hours ago











    • $begingroup$
      When $r = e$, the fact that we are taking an average of the $log(X_i)$ variables (which have finite variance) means that we can use the Central Limit Theorem to proceed. This implies that $sqrt n overline X$ converges (in distribution only, NOT almost surely) to a normal variable with mean $0$ and variance $sigma^2$ (i.e. the variance of $log(X_i)$), so $log(Y_n)/sqrt n$ does the same. Consequently, $Y_n$ just becomes diffuse, and on individual realizations it will wander, much like an ordinary random walk will do.
      $endgroup$
      – Aaron Montgomery
      5 hours ago







    1




    1




    $begingroup$
    I accepted your answer, as it is an excellent explanation! Thank you for taking the time!
    $endgroup$
    – Jake Mirra
    8 hours ago




    $begingroup$
    I accepted your answer, as it is an excellent explanation! Thank you for taking the time!
    $endgroup$
    – Jake Mirra
    8 hours ago












    $begingroup$
    Was thinking a bit about your question of "what happens when r = e", and all I can say is that, once you look at it on a logarithmic scale, it's a weird, sort of lopsided random walk through the reals where you sometimes take giant steps backwards and then lots of small steps forward.
    $endgroup$
    – Jake Mirra
    7 hours ago




    $begingroup$
    Was thinking a bit about your question of "what happens when r = e", and all I can say is that, once you look at it on a logarithmic scale, it's a weird, sort of lopsided random walk through the reals where you sometimes take giant steps backwards and then lots of small steps forward.
    $endgroup$
    – Jake Mirra
    7 hours ago












    $begingroup$
    Yep! And you can convince yourself that even though those increments are unbounded (on the negative side), they still have a finite variance...
    $endgroup$
    – Aaron Montgomery
    7 hours ago




    $begingroup$
    Yep! And you can convince yourself that even though those increments are unbounded (on the negative side), they still have a finite variance...
    $endgroup$
    – Aaron Montgomery
    7 hours ago












    $begingroup$
    @AaronMontgomery - what happens when $r=e$? I am not good with the details of probability theory. Does $Y_n$ converge (to $1$) or does it not converge? And what has the finite variance (of $log X_i$) got to do with it? Intuitively I would guess the sequence does not converge, but your mention of finite variance seems to hint that it would...
    $endgroup$
    – antkam
    5 hours ago





    $begingroup$
    @AaronMontgomery - what happens when $r=e$? I am not good with the details of probability theory. Does $Y_n$ converge (to $1$) or does it not converge? And what has the finite variance (of $log X_i$) got to do with it? Intuitively I would guess the sequence does not converge, but your mention of finite variance seems to hint that it would...
    $endgroup$
    – antkam
    5 hours ago













    $begingroup$
    When $r = e$, the fact that we are taking an average of the $log(X_i)$ variables (which have finite variance) means that we can use the Central Limit Theorem to proceed. This implies that $sqrt n overline X$ converges (in distribution only, NOT almost surely) to a normal variable with mean $0$ and variance $sigma^2$ (i.e. the variance of $log(X_i)$), so $log(Y_n)/sqrt n$ does the same. Consequently, $Y_n$ just becomes diffuse, and on individual realizations it will wander, much like an ordinary random walk will do.
    $endgroup$
    – Aaron Montgomery
    5 hours ago




    $begingroup$
    When $r = e$, the fact that we are taking an average of the $log(X_i)$ variables (which have finite variance) means that we can use the Central Limit Theorem to proceed. This implies that $sqrt n overline X$ converges (in distribution only, NOT almost surely) to a normal variable with mean $0$ and variance $sigma^2$ (i.e. the variance of $log(X_i)$), so $log(Y_n)/sqrt n$ does the same. Consequently, $Y_n$ just becomes diffuse, and on individual realizations it will wander, much like an ordinary random walk will do.
    $endgroup$
    – Aaron Montgomery
    5 hours ago













    5














    $begingroup$

    I found the answer! One starts with the uniform distribution on $ [0,R] $. The natural logarithm pushes this distribution forward to a distribution on $ (-infty, ln(R) ] $ with density function given by $ p(y) = e^y / R, y in (-infty, ln(R)] $. The expected value of this distribution is $ int_-infty^ln(R)cfracy e^yR dy = ln(R) - 1 $. Solving for zero gives the answer to the riddle! Love it!






    share|cite|improve this answer









    $endgroup$
        answered 8 hours ago









        Jake Mirra

        1789 bronze badges