Are there any differences in causality between linear and logistic regression?
I'm guessing this is a pretty basic question, but I'm having a hard time wrapping my head around it.
My understanding of linear regression is that it shows how much a change in X causes a change in Y, and the same for multiple linear regression.
But can the same be said about logistic regression? What if both of the variables are nominal? Can you do logistic regression that way?
I am currently regressing an ordinal variable against a nominal one, and I get similar results when I swap which variable is independent and which is dependent.
So, my question is: should logistic regression be viewed as explaining causal relationships, as we do with linear regression? Or is it possible that causality is obscured by multicollinearity, leaving us with strictly correlational inferences?
Thanks
– Henry: Linear regression is not causal in the usual sense of "causal".
asked by Jake Stidham
3 Answers
Causality has nothing to do with regression: you can regress variables that are not causally linked at all. A better way of thinking about regression is as "the response of Y to X", or "the relationship between Y and X". In this regard it does not matter whether the link function is the identity (as in ordinary linear regression) or the logit (as in logistic regression); in logistic regression the response of Y to X will simply have a different shape, because of the logit link, than it would in linear regression.
So the answer is no: there are no differences in causality.
– Curious
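To make the link-function point concrete, here is a minimal sketch (standard library only; the coefficients `b0` and `b1` are made up for illustration) of how the same linear predictor yields different response shapes under the identity link and the logit link:

```python
import math

def linear_mean(x, b0=0.5, b1=2.0):
    # Identity link: E[Y | X = x] is the linear predictor itself
    return b0 + b1 * x

def logistic_mean(x, b0=0.5, b1=2.0):
    # Logit link: the same linear predictor, squashed to (0, 1),
    # interpreted as P(Y = 1 | X = x)
    eta = b0 + b1 * x
    return 1.0 / (1.0 + math.exp(-eta))

# Same coefficients, different shapes of the response:
for x in (-2, 0, 2):
    print(x, round(linear_mean(x), 3), round(logistic_mean(x), 3))
```

Neither function says anything about whether X causes Y; the link only changes the shape of the fitted relationship.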
Neither of them establishes causality, unless we are talking about specific experimental set-ups (e.g. randomized studies); they show correlation. For example, a linear model for maximum daytime temperature with ice cream sales as a predictor cannot show that selling ice cream causes higher temperatures, even if higher ice cream sales correlate with higher temperatures. Of course, I could come up with a whole convoluted story of how this might work (e.g. all those freezers raise the outdoor temperature in order to produce the ice cream), but that would be post-hoc reasoning. In my example we can immediately tell that I probably got the causality the wrong way around, but in many practical examples that is a lot less clear.
Exactly the same thing happens if I instead do a logistic regression for "Was it a very hot day?" (yes/no).
– Björn
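A small simulation (hypothetical numbers, standard library only) makes the symmetry concrete: even when temperature is, by construction, the cause of sales, the "reverse" regression fits just as cleanly, so the fit alone cannot identify the causal direction:

```python
import random

random.seed(0)
# Simulate the true causal direction: temperature drives ice cream sales
temp = [random.gauss(25, 5) for _ in range(2000)]
sales = [10 + 3 * t + random.gauss(0, 5) for t in temp]

def slope(x, y):
    # Least-squares slope of y regressed on x
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

print(slope(temp, sales))  # sales on temp: recovers roughly 3, the true effect
print(slope(sales, temp))  # temp on sales: also a clean, strong "effect"
```

Both regressions produce tight, highly significant slopes; only outside knowledge (or the study design) tells us which one is causal.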
A causal relationship is defined by a structural model that relates an outcome to its causes. Regression is one way of estimating the parameters of the structural causal model (there are other ways). If the structural model takes the form of a logistic regression model, then a logistic regression model is one way of recovering the true causal parameter. If the structural model takes the form of a linear regression model, then a linear regression model is one way of recovering the true causal parameter. With a binary outcome, a logistic regression model makes more sense because the output of a logistic regression model can be interpreted as a probability, and part of the outcome-generating process may involve drawing a 0 or 1 with a given probability. With a continuous outcome, a linear regression may make more sense. The choice of model should depend on the form of the structural causal model you are trying to approximate. (Note that both linear and logistic regression can be used for both binary and continuous outcomes.)
What many of the posters here are arguing is that regression (logistic or linear) is not an inherently causal method; it can be used to extract a causal parameter from data, but it can be used for other purposes, too. My view is that as long as the predictor precedes the outcome temporally, regression is inherently causal, regardless of your study design. The difference is that the parameter you estimate from your model may be a biased estimate of the causal parameter. The degree of bias depends on qualities of your design (e.g., whether you have randomized the treatment, whether you are implicitly conditioning on a consequence of the treatment, whether you have collected enough variables to remove confounding). Including covariates in a linear or logistic regression is one way to attempt to remove that bias from a causal effect estimate.
When the structural causal model is perfectly reproduced by the model you specify, the causal parameter of interest will be estimated without bias. This tends not to be the case, and so estimating causal parameters using regression can leave you with a biased estimate (if bias is induced by your design). It is for this reason that other answers have been strong in claiming regression is not an inherently causal method; in a bias-inducing design, it's almost impossible to specify a regression model that correctly reproduces the structural causal model that underlies the data.
A final note: people often say correlation does not imply causation, but correlation without confounding does imply causation. Regression is one way to remove confounding. The effectiveness of logistic regression or linear regression at doing so depends on generally unknown qualities of the data-generating structural causal model.
– Noah
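The claim that including covariates can remove confounding bias can be sketched in a small simulation (hypothetical coefficients, standard library only). The unadjusted slope of Y on X is biased away from the true effect of 2, while adjusting for the confounder Z (here via Frisch–Waugh–Lovell residualization, which gives the same coefficient as including Z as a covariate) recovers it:

```python
import random
import statistics

random.seed(1)
n = 5000
z = [random.gauss(0, 1) for _ in range(n)]            # confounder
x = [zi + random.gauss(0, 1) for zi in z]             # "treatment", partly caused by z
y = [2 * xi + 3 * zi + random.gauss(0, 1)             # true causal effect of x is 2
     for xi, zi in zip(x, z)]

def slope(u, v):
    # Least-squares slope of v regressed on u
    mu, mv = statistics.fmean(u), statistics.fmean(v)
    return (sum((a - mu) * (b - mv) for a, b in zip(u, v))
            / sum((a - mu) ** 2 for a in u))

naive = slope(x, y)  # biased upward (around 3.5 here) because z is omitted

# Adjust for z: residualize both x and y on z, then regress residuals
bx, by = slope(z, x), slope(z, y)
rx = [xi - bx * zi for xi, zi in zip(x, z)]
ry = [yi - by * zi for yi, zi in zip(y, z)]
adjusted = slope(rx, ry)  # close to the true causal coefficient, 2

print(round(naive, 2), round(adjusted, 2))
```

This only works because the simulation's structural model is known and fully measured; with unmeasured confounders, no choice of link function rescues the causal interpretation.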