Using PCA vs Linear Regression
I'm looking to analyze data from a study; similar previous studies have used either PCA or hierarchical linear regression to analyze their data. I've used both PCA and linear regression before. My understanding is that PCA breaks the data down into principal components and is useful for learning which factors may be strong indicators of our dependent variable, while linear regression can be used to compare correlations.
How should I approach this? If I simply want to find out what correlates most strongly with my study's dependent variable, what would be the best option? Can I use both, i.e. PCA followed by hierarchical linear regression?
Tags: regression, pca
asked 8 hours ago by 4ntibody (new contributor), edited 4 hours ago by Ben
3 Answers
PCA does not involve a dependent variable: all the variables are treated the same. It is primarily a dimension-reduction method.
Factor analysis also doesn't involve a dependent variable, but its goal is somewhat different: it is to uncover latent factors.
Some people use either the components or the factors (or a subset of them) as independent variables in a later regression. This can be useful if you have a lot of IVs: if you want to reduce their number while losing as little variance as possible, that's PCA; if you think the IVs represent some underlying factors, that's FA.
If you think there are factors, then it may be best to use FA; but if you are just trying to reduce the number of variables, there is no guarantee that the components will relate well to the DV. Another method is partial least squares, which does include the DV.
– Peter Flom (answered 8 hours ago)
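A minimal sketch of the contrast drawn in this answer, between regressing on PCA components (which ignore the DV) and partial least squares (which uses it), assuming Python with scikit-learn and a purely synthetic dataset; the number of components and the toy X/y are illustrative only:

# Sketch: principal component regression (PCR) vs. partial least squares (PLS).
# The data below are synthetic; in practice X and y come from the study.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                      # 10 toy predictors
y = X[:, 0] - 2 * X[:, 3] + rng.normal(size=200)    # toy outcome

# PCR: PCA builds components without looking at y; regression is fit afterwards.
pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())

# PLS: components are chosen to covary with y, so the DV is "included".
pls = make_pipeline(StandardScaler(), PLSRegression(n_components=3))

for name, model in [("PCR", pcr), ("PLS", pls)]:
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {score:.3f}")

The cross-validated R^2 comparison is only there to make the point concrete; whether PCR, PLS, or plain regression is appropriate depends on the actual data and research question.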
These techniques are not mutually exclusive, and they can be complementary.
PCA is a dimension-reduction technique. The number of dimensions in your dataset corresponds to the number of variables measured per case. For example, imagine your data are survey data and you administered a 100-item questionnaire. Each individual who completed the questionnaire is represented by a single point in 100-dimensional space. The goal of PCA is to simplify this space in such a way that the distribution of points is preserved in fewer dimensions. This simplification can help you describe the data more elegantly, but it can also reveal the dominant trends in your data. A great explanation of PCA can be found here: Making sense of principal component analysis, eigenvectors & eigenvalues.
Hierarchical linear regression is used to determine whether a predictor (or set of predictors) explains variance in an outcome variable over and above some other predictor (or set of predictors). For example, you may want to know whether exercising (IV1) or eating well (IV2) is a better predictor of cardiovascular health (DV). Hierarchical linear regression can help answer this question.
If your data are complex (i.e. you have many variables), you can apply PCA to reduce the number of variables / find the "latent variables". These latent variables can then be used in the hierarchical linear regression.
Best of luck!
– unicoder (answered 8 hours ago, edited 7 hours ago)
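As a rough illustration of the "PCA, then hierarchical regression" workflow described here, a sketch in Python (scikit-learn and statsmodels); the variable names (age, questionnaire items, outcome) and the synthetic data are made up for the example:

# Sketch: compress many items into component scores, then test whether the
# scores explain variance over and above a baseline predictor.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 300
items = rng.normal(size=(n, 20))                     # e.g. 20 questionnaire items
age = rng.normal(40, 10, size=n)                     # hypothetical baseline predictor
outcome = 0.05 * age + items[:, :5].mean(axis=1) + rng.normal(size=n)

# Step 0: reduce the 20 items to 2 principal-component scores.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(items))
df = pd.DataFrame({"outcome": outcome, "age": age,
                   "pc1": scores[:, 0], "pc2": scores[:, 1]})

# Step 1: baseline model.  Step 2: add the component scores.
m1 = smf.ols("outcome ~ age", data=df).fit()
m2 = smf.ols("outcome ~ age + pc1 + pc2", data=df).fit()

print(f"R^2 change: {m2.rsquared - m1.rsquared:.3f}")
print(anova_lm(m1, m2))   # F-test for the increment in explained variance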
Comment – 4ntibody (7 hours ago): Thank you for everybody's quick comments and insight! I now know what I need to do.
As other answers have said, PCA and linear regression are, in general, different tools.
PCA is an unsupervised method (it only takes in data, with no dependent variable), while linear regression is a supervised learning method. If you have a dependent variable, a supervised method would be better suited to your goals.
If you're trying to find out which variables capture most of the variation in your data, PCA is a useful tool.
– Alexander (answered 5 hours ago)
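For completeness, a small sketch of what that unsupervised summary looks like in Python with scikit-learn; the matrix X is invented, and in practice it would be the study's predictors:

# Sketch: inspect what PCA finds, with no dependent variable involved.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 6))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=150)   # make two columns correlated

pca = PCA().fit(StandardScaler().fit_transform(X))

# Share of total variance captured by each component.
print(np.round(pca.explained_variance_ratio_, 3))

# Loadings: how strongly each original variable contributes to the first two components.
print(np.round(pca.components_[:2], 2))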