Is the sample correlation always positively correlated with the sample variance?
The sample correlation $r$ and the sample standard deviation of $X$ (call it $s_X$) seem to be positively correlated if I simulate bivariate normal $X$, $Y$ with a positive true correlation (and seem to be negatively correlated if the true correlation between $X$ and $Y$ is negative). I found this somewhat counterintuitive. Very heuristically, I suppose it reflects the fact that $r$ represents the expected increase in $Y$ (in units of $\mathrm{SD}(Y)$) for a one-SD increase in $X$, and if we estimate a larger $s_X$, then $r$ reflects the change in $Y$ associated with a larger change in $X$.
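For concreteness, a minimal version of the simulation I mean (the values of the correlation, sample size, and number of replications here are arbitrary choices):

```python
import numpy as np

# Minimal sketch of the simulation described above; rho, n, and the number
# of replications are arbitrary choices.
rng = np.random.default_rng(0)
rho, n, reps = 0.5, 50, 10_000
cov = np.array([[1.0, rho],
                [rho, 1.0]])

r_vals = np.empty(reps)
sx_vals = np.empty(reps)
for i in range(reps):
    xy = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
    x, y = xy[:, 0], xy[:, 1]
    r_vals[i] = np.corrcoef(x, y)[0, 1]   # sample correlation r
    sx_vals[i] = x.std(ddof=1)            # sample standard deviation s_X

# Correlation between r and s_X across replications: comes out positive for
# rho > 0 and flips sign if rho is made negative.
print(np.corrcoef(r_vals, sx_vals)[0, 1])
```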
However, I would like to know whether $\operatorname{Cov}(r, s_X) > 0$ for $\rho > 0$ holds in general (at least for the case in which $X$ and $Y$ are bivariate normal and $n$ is large). Letting $\sigma$ denote a true standard deviation, we have:
$$\operatorname{Cov}(r, s_X) = E[r\, s_X] - \rho\,\sigma_X$$
$$\approx E\left[ \frac{\widehat{\operatorname{Cov}}(X,Y)}{s_Y} \right] - \frac{\operatorname{Cov}(X,Y)}{\sigma_Y}$$
I tried using a Taylor expansion on the first term, but it depends on $\operatorname{Cov}\big(\widehat{\operatorname{Cov}}(X,Y),\, s_Y\big)$, so that’s a dead end. Any ideas?
EDIT
Maybe a better direction would be to try to show that $\operatorname{Cov}(\widehat\beta, s_X) = 0$, where $\widehat\beta$ is the OLS coefficient of $Y$ on $X$. Then we could argue that since $\widehat\beta = r\,\frac{s_Y}{s_X}$, this implies the desired result. Since $\widehat\beta$ is almost like a difference of sample means, maybe we could get the former result using something like the known independence of the sample mean and variance for a normal RV?
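One way to make that direction concrete in the bivariate-normal case, written out only as a sketch (it uses the law of total covariance rather than the independence of the sample mean and variance):
$$\operatorname{Cov}(\widehat\beta, s_X) = E\big[\operatorname{Cov}(\widehat\beta, s_X \mid X_1,\dots,X_n)\big] + \operatorname{Cov}\big(E[\widehat\beta \mid X_1,\dots,X_n],\; E[s_X \mid X_1,\dots,X_n]\big) = 0 + \operatorname{Cov}(\beta, s_X) = 0,$$
since $s_X$ is a function of the $X_i$ alone (so the first term vanishes) and $E[\widehat\beta \mid X_1,\dots,X_n] = \beta$ is constant when the conditional mean of $Y$ given $X$ is linear.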
correlation covariance independence
It would be unchanged. Hmm. I'm afraid I don't yet see the relevance, though. – half-pass, 4 hours ago

I should probably also note that while I wish this were a homework question, it's not... :) – half-pass, 4 hours ago

Ah, I didn't read the question carefully enough. My apologies. – jbowman, 4 hours ago

The first equality in your calculation is not correct. $s_X = \sqrt{s^2_X}$ is consistent for the standard deviation, but is not unbiased: en.wikipedia.org/wiki/Unbiased_estimation_of_standard_deviation – Andrew M, 4 hours ago

It's extremely close to unbiased for large $n$, though -- the rule-of-thumb correction factor for a normal RV is $(n - 1.5)$ vs. $(n-1)$. – half-pass, 4 hours ago
2 Answers
It will depend on the joint distribution. For the example you mention, the bivariate (zero-mean) Normal distribution is characterized by the parameters $\rho, \sigma_X, \sigma_Y$. It follows that one can have all possible combinations of values of these three parameters, implying that no relation between $\rho$ and the standard deviations can be established.

For other bivariate distributions, the correlation coefficient may be fundamentally a function of the standard deviations (essentially both will be functions of more primitive parameters), in which case one can examine whether a monotonic relation exists.

– Alecos Papadopoulos, answered 3 hours ago
I understand that the three parameters can have arbitrary relationships for the BVN distribution, but I don't think it follows that the sample estimates of these are asymptotically independent. – half-pass, 3 hours ago
Yes, it does hold asymptotically regardless of the distribution of $X$ and $Y$. I was on the right track with the Taylor expansion; I just needed to make a symmetry argument.

– half-pass, answered 2 hours ago
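As a cross-check for the bivariate-normal case specifically, a delta-method computation (an illustration only, not the distribution-free symmetry argument referred to above) gives the sign directly. Write the sample central second moments as $m_{XX}, m_{YY}, m_{XY}$, so that $r = m_{XY}/\sqrt{m_{XX}\,m_{YY}}$ and $s_X = \sqrt{m_{XX}}$. Under bivariate normality, $n\operatorname{Var}(m_{XX}) \to 2\sigma_X^4$, $n\operatorname{Cov}(m_{XX}, m_{XY}) \to 2\rho\,\sigma_X^3\sigma_Y$, and $n\operatorname{Cov}(m_{XX}, m_{YY}) \to 2\rho^2\sigma_X^2\sigma_Y^2$. Linearizing $r$ and $s_X$ around the true moments, with $\partial r/\partial m_{XY} = 1/(\sigma_X\sigma_Y)$, $\partial r/\partial m_{XX} = -\rho/(2\sigma_X^2)$, $\partial r/\partial m_{YY} = -\rho/(2\sigma_Y^2)$, and $\partial s_X/\partial m_{XX} = 1/(2\sigma_X)$, and combining these terms gives
$$\operatorname{Cov}(r, s_X) \approx \frac{1}{2\sigma_X}\left(\frac{2\rho\,\sigma_X^2}{n} - \frac{\rho\,\sigma_X^2}{n} - \frac{\rho^3\sigma_X^2}{n}\right) = \frac{\rho\,(1-\rho^2)\,\sigma_X}{2n},$$
which is positive for $\rho > 0$ and negative for $\rho < 0$, matching the simulations in the question.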