Multiple Regression in SPSS - R Square; P-Value; ANOVA F; Beta (Part 3 of 3)




This video illustrates how to perform and interpret a multiple regression statistical analysis in SPSS.

Video Transcript: Here are the results: SAT is significant, social support once again is significant, and gender is not significant, as its p-value is greater than .05.

Now, an important point about this table is that if a test is significant, the amount of unique variance that predictor accounts for is statistically significant. In other words, since SAT score was significant, it accounts for a significant amount of unique variance in college GPA. By unique we mean that the variance in college GPA that SAT score accounts for, predicts, or explains all on its own is significant. Put another way, SAT score explains something in college GPA that social support and gender did not explain or get at. And since social support is also significant, it too explains a significant amount of unique variance in college GPA. That is what these tests in the Coefficients table tell us.

To understand this a little better, suppose, just hypothetically, that SAT score and social support were perfectly correlated, with a correlation of 1.0. If I ran this regression analysis and it ran successfully (perfectly correlated predictors can cause problems, which I'll talk about in another video, but let's assume everything came out fine), then neither of these two tests would be significant. If SAT score and social support were perfectly correlated, SAT score would offer nothing unique in terms of predicting college GPA, and social support would offer nothing unique either, because whatever social support offers, SAT score offers completely as well. They would offer nothing uniquely if they were correlated perfectly. So if a test is significant here, we know that predictor is making a unique contribution to our dependent variable, college GPA in this example. That's important to note, and it's an area that is often confused in regression.

So, in summary, the Model Summary and ANOVA tables describe the model overall, with all the predictors included: the R-squared tells us how much variance the model accounted for, and the ANOVA p-value tells us whether that variance is statistically significant. The Coefficients table then tells us, at the individual level, which predictors, if any, are statistically significant; and if a predictor is significant, recall that it accounts for a unique amount of variance in the dependent, or criterion, variable.

OK, one last thing I want to talk about before closing. In multiple regression, if you have a categorical variable that is dichotomous, that is, it has two categories, such as gender, it is completely fine to enter it into the analysis as we did, simply moving it over into the Independent(s) box. With two categories, that is completely fine.
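For readers who want to reproduce this kind of output outside SPSS, here is a minimal sketch in Python's statsmodels, assuming a data file with the hypothetical columns college_gpa, sat, social_support, and a two-category gender variable (the names and file are illustrative, not from the video). It surfaces the same three pieces the video walks through: R Square from the Model Summary, the overall F-test from the ANOVA table, and the per-predictor tests from the Coefficients table, with the dichotomous gender predictor entered directly.

```python
# Minimal sketch (not from the video): an analogous multiple regression in Python.
# Column and file names are hypothetical stand-ins for the variables discussed.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical data file

# gender is dichotomous, so it can be entered directly as a predictor
model = smf.ols("college_gpa ~ sat + social_support + gender", data=df).fit()

print(model.rsquared)   # analogue of R Square in the Model Summary table
print(model.f_pvalue)   # analogue of the overall F-test p-value in the ANOVA table
print(model.summary())  # per-predictor coefficients, t-tests, and p-values
```

Each predictor's t-test in the summary corresponds to a row of the Coefficients table in SPSS: a significant p-value means that predictor accounts for a significant amount of unique variance in college GPA.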
But if you have a categorical variable that has more than two categories, say ethnicity with four categories, then you can't just enter that variable into the regression analysis directly; instead, you need to re-express that variable first, prior to entering it. So a categorical variable with more than two categories cannot be entered directly into the analysis. Quantitative variables, which you can think of as continuous variables, can always be moved in directly without a problem; the issue only arises with categorical variables that have more than two levels. OK, that's it for an overview of multiple regression. Thanks for watching.
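The video does not show the re-expression step itself; one common way to do it is dummy (indicator) coding, where a k-category variable becomes k - 1 indicator variables that are entered in its place. A minimal sketch, again with hypothetical column and file names, in Python:

```python
# Minimal sketch (not from the video): dummy coding a four-category variable
# before regression. Column and file names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("students.csv")  # hypothetical data file; ethnicity has four categories

# Re-express ethnicity as 4 - 1 = 3 indicator columns; drop_first=True leaves
# one category out to serve as the reference group.
dummies = pd.get_dummies(df["ethnicity"], prefix="eth", drop_first=True).astype(int)

X = pd.concat([df[["sat", "social_support"]], dummies], axis=1)
X = sm.add_constant(X)                      # add the intercept term
model = sm.OLS(df["college_gpa"], X).fit()  # the dummies enter in place of ethnicity
print(model.summary())                      # each dummy gets its own coefficient and t-test
```

With the statsmodels formula interface, writing the predictor as C(ethnicity) performs the same k - 1 coding automatically.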

Tags: R-Squared, ANOVA table, Regression Weight, Beta Weight, Predicted Value
YouTube Channel (Quantitative Specialists): https://www.youtube.com/user/statisticsinstructor Subscribe today!
Published by: Quantitative Specialists Published at: 9 years ago Category: Educational