
Are there any differences in causality between linear and logistic regression?


I'm guessing this is a pretty basic question, but I am having a hard time wrapping my head around it.

My understanding of linear regression is that it shows how much a change in X will cause a change in Y, and the same goes for multiple linear regression.

But can the same be said about logistic regression? What if both variables are nominal? Can you run a logistic regression that way?

I am currently regressing an ordinal variable against a nominal one, and I get similar results no matter which variable I treat as dependent and which as independent. So my question is: should logistic regression be viewed as explaining causal relationships, the way we do with linear regression? Or is it possible that causality is obscured by multicollinearity, leaving us with strictly correlational inferences?

Thanks

Tags: logistic, causality

asked 8 hours ago by Jake Stidham
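
As a minimal sketch of the symmetry described in the question (entirely made-up data, and assuming numpy and statsmodels are available): with two binary variables, regressing either one on the other recovers essentially the same log odds ratio, so the fitted model alone says nothing about which direction, if any, is causal.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
x = rng.binomial(1, 0.5, n)                  # a nominal (binary) variable
p = 1 / (1 + np.exp(-(-0.5 + 1.2 * x)))      # x shifts the log-odds of y by 1.2
y = rng.binomial(1, p)                       # a second nominal (binary) variable

# Logistic regression of y on x, and of x on y: both slopes estimate the same
# log odds ratio of the 2x2 table, so the association is symmetric and carries
# no information about causal direction.
m_yx = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
m_xy = sm.Logit(x, sm.add_constant(y)).fit(disp=0)
print("slope of y ~ x:", round(m_yx.params[1], 3))
print("slope of x ~ y:", round(m_xy.params[1], 3))
```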










• 5 – Linear regression is not causal in the usual sense of causal. – Henry, 8 hours ago










3 Answers


















Answer by Curious (4 votes; answered 6 hours ago, edited 6 hours ago):
Causality has nothing to do with regression. You can regress variables that are not causally linked at all. A better way of thinking about regression is as the "response of Y to X", or the "relationship between Y and X". In this regard it does not matter whether the link function is the identity (as in ordinary linear regression) or the logit (as in logistic regression). In logistic regression the response of Y to X simply has a different shape, due to the logit link, than it would in linear regression.



So the answer is no, there are no differences in causality.
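
A minimal sketch of this point, with made-up data and assuming statsmodels is installed: the same GLM machinery fits both models, and the only difference is the family/link, which changes the shape of the fitted response, not anything about causality.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
X = sm.add_constant(x)

y_cont = 2.0 + 1.5 * x + rng.normal(size=1000)               # continuous response
y_bin = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 1.5 * x))))  # binary response

# Identical estimation machinery; only the link function differs.
linear = sm.GLM(y_cont, X, family=sm.families.Gaussian()).fit()    # identity link
logistic = sm.GLM(y_bin, X, family=sm.families.Binomial()).fit()   # logit link

print(linear.params)    # slope on the identity scale (change in the mean of y)
print(logistic.params)  # slope on the logit scale (change in the log-odds of y)
```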









Answer by Björn (2 votes; answered 6 hours ago):
Neither of them establishes causality, unless we are talking about specific experimental set-ups (e.g. randomized studies). They show correlation. For example, a linear model for maximum daytime temperature with ice cream sales as a predictor cannot show that selling ice cream causes higher temperatures, even if higher ice cream sales correlate with higher temperatures. Of course, I could come up with a whole convoluted story of how this might work (e.g. all those freezers heat up the outdoors in the course of producing ice cream), but that would all be post-hoc reasoning. In this example we can immediately tell that I probably got the causality the wrong way around, but in many practical examples that is a lot less clear.



Exactly the same thing happens if I instead fit a logistic regression for "Was it a very hot day?" (yes/no).
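
A small simulation in the spirit of this example (the numbers and the use of statsmodels are assumptions for illustration): temperature drives ice cream sales, yet models fitted the "wrong" way around still show a strong relationship, for both the linear and the logistic case.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
temp = rng.normal(25, 5, n)                     # max daytime temperature (deg C)
sales = 100 + 20 * temp + rng.normal(0, 60, n)  # sales are *caused* by temperature

# Linear model fitted the wrong way around: temperature "explained" by sales.
backwards_lm = sm.OLS(temp, sm.add_constant(sales)).fit()
print("R^2 of temp ~ sales:", round(backwards_lm.rsquared, 2))  # high, yet not causal

# Same story with a logistic regression for "was it a very hot day?" (yes/no).
hot_day = (temp > 30).astype(int)
backwards_logit = sm.Logit(hot_day, sm.add_constant(sales)).fit(disp=0)
print("slope of hot_day ~ sales:", round(backwards_logit.params[1], 4))
```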









Answer by Noah (0 votes; answered 3 hours ago):
A causal relationship is defined by a structural model that relates an outcome to its causes. Regression is one way of estimating the parameters of the structural causal model (there are other ways). If the structural model takes the form of a logistic regression model, then a logistic regression model is one way of recovering the true causal parameter. If the structural model takes the form of a linear regression model, then a linear regression model is one way of recovering the true causal parameter. With a binary outcome, a logistic regression model makes more sense because the output of a logistic regression model can be interpreted as a probability, and part of the outcome-generating process may involve drawing a 0 or 1 with a given probability. With a continuous outcome, a linear regression may make more sense. The choice of model you use should depend on the form of the structural causal model you are trying to approximate. (Note that both linear and logistic regression can be used for both binary and continuous outcomes.)



What many of the posters here are arguing is that regression models (logistic or linear) are not inherently causal methods; they can be used to extract a causal parameter from data, but they can be used for other purposes, too. My view is that as long as the predictor precedes the outcome temporally, regression is inherently causal, regardless of your study design. The difference is that the parameter you estimate from your model may be a biased estimate of the causal parameter. The degree of bias depends on qualities of your design (e.g., whether you have randomized your treatment, whether you are implicitly conditioning on a consequence of treatment, whether you have collected enough variables to remove confounding). Including covariates in a linear or logistic regression is one way to attempt to remove the bias of a causal effect estimate.



      When the structural causal model is perfectly reproduced by the model you specify, the causal parameter of interest will be estimated without bias. This tends not to be the case, and so estimating causal parameters using regression can leave you with a biased estimate (if bias is induced by your design). It is for this reason that other answers have been strong in claiming regression is not an inherently causal method; in a bias-inducing design, it's almost impossible to specify a regression model that correctly reproduces the structural causal model that underlies the data.



      A final note: people often say correlation does not imply causation, but correlation without confounding does imply causation. Regression is one way to remove confounding. The effectiveness of logistic regression or linear regression at doing so depends on generally unknown qualities of the data-generating structural causal model.
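
A minimal sketch of the confounding point, using an assumed (made-up) structural model and statsmodels: z causes both the treatment x and the outcome y, so the regression that omits z returns a biased estimate of the causal effect of x, while the regression that conditions on z recovers it.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 20000
z = rng.normal(size=n)                        # confounder
x = 0.8 * z + rng.normal(size=n)              # treatment, partly caused by z
y = 1.0 * x + 1.5 * z + rng.normal(size=n)    # outcome; true causal effect of x is 1.0

naive = sm.OLS(y, sm.add_constant(x)).fit()                           # omits the confounder
adjusted = sm.OLS(y, sm.add_constant(np.column_stack([x, z]))).fit()  # conditions on it

print("naive estimate of the x effect:   ", round(naive.params[1], 2))     # ~1.7, biased
print("adjusted estimate of the x effect:", round(adjusted.params[1], 2))  # ~1.0
```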





