Binomial to Poisson approximation: why does $p$ have to be small?
I understand that as $n$ tends to infinity, a Binomial distribution becomes a Poisson distribution, and I have completed the proof of this.
However, I am not sure why, when approximating, $p$ has to be a relatively small value. Again, I understand why $n$ must be large, but what is the purpose of, or proof that, the smaller $p$ is, the better the approximation?
Essentially, what I'm asking is: why does $p$ have to be small when approximating a Poisson from a Binomial?
Could someone please help explain this?
Thanks
probability-distributions approximation poisson-distribution binomial-distribution
asked 8 hours ago by Ibrahim (new contributor), edited 7 hours ago
2 Answers
In the proof, you need $np\to\lambda$. But if $n$ grows large and $np\to\lambda$, then we must have $p\to 0$, or else $np\to\infty$.
answered 7 hours ago by FakeAnalyst56
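Spelled out (this is the standard limit the question refers to), with $\lambda = np$ held fixed so that $p = \lambda/n$, the binomial probabilities satisfy
$$
\binom{n}{k}p^k(1-p)^{n-k}
=\frac{n(n-1)\dots(n-k+1)}{n^k}\cdot\frac{\lambda^k}{k!}\left(1-\frac{\lambda}{n}\right)^{n-k}
\xrightarrow[n\to\infty]{}\frac{\lambda^k}{k!}e^{-\lambda},
$$
since the first factor tends to $1$ and $\left(1-\lambda/n\right)^{n-k}\to e^{-\lambda}$. The only way to keep $np=\lambda$ finite while $n\to\infty$ is to let $p=\lambda/n\to 0$, which is exactly the point of this answer.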
Intuitively, for the approximation $\operatorname{Binomial}(n,p)\approx\operatorname{Poisson}(\lambda=np)$, the smaller $p$ is, the closer the variance $npq=\lambda(1-p)$ is to $\lambda$, so you expect a better approximation.
In the proof, you use
$$
\frac{n(n-1)\dots(n-k+1)}{n^k}\left(1-\frac{\lambda}{n}\right)^{n-k}\approx e^{-\lambda}
$$
to show
$$
\binom{n}{k}p^k(1-p)^{n-k}\approx\frac{e^{-np}(np)^k}{k!}.
$$
If you analyse the error terms more carefully, you get explicit bounds such as
$$
\sum_{k=0}^{\infty}\left\lvert\,\binom{n}{k}p^k(1-p)^{n-k}-\frac{e^{-np}(np)^k}{k!}\,\right\rvert\leq Cp,
$$
where $C\leq 4$. So this justifies the motto "smaller $p$ gives better approximation".
answered 6 hours ago by user10354138
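As a quick numerical sanity check of this bound, one can compare the two pmfs directly while holding $np=\lambda$ fixed. Below is a minimal Python sketch; the choice $\lambda=5$, the grid of $n$ values, and the truncation of the sum at $k=60$ are illustrative, not part of the bound itself.

```python
from math import comb, exp, factorial

def binom_pmf(k: int, n: int, p: float) -> float:
    """Binomial(n, p) probability of exactly k successes."""
    if k > n:
        return 0.0
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k: int, lam: float) -> float:
    """Poisson(lam) probability of the value k."""
    return exp(-lam) * lam**k / factorial(k)

lam = 5.0  # keep the mean np = lambda fixed while p shrinks
for n in (10, 100, 1000, 10000):
    p = lam / n
    # L1 distance between the two pmfs; with lambda = 5 both distributions
    # carry essentially all of their mass below k = 60, so truncating the
    # sum there changes it only negligibly.
    err = sum(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam)) for k in range(61))
    print(f"n = {n:6d}   p = {p:.4f}   L1 error = {err:.5f}   4p = {4 * p:.4f}")
```

The printed $L^1$ error should shrink roughly in proportion to $p$ and stay below $4p$, in line with the motto above.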