SPSS Regression Output of Interest

Model Summary Table:
• R: The multiple correlation coefficient between all the predictors in the model and the dependent variable.
• R2: The proportion of variance in the dependent variable predictable from the predictor variables.
• Adjusted R2: As more predictors are added to the model equation, they will explain more variance just by chance. This “shrunken R-squared” adjusts (or penalizes) R2 depending on the number of variables used in the equation.
• Std. Error of the Estimate: Also called the root mean square error, this is the standard deviation of the error term, and is the square root of the Mean Square Residual (or Error).
• R2 change: The proportion of variance a variable explains above and beyond the variables already entered (hierarchical regression only). SPSS also gives an F-ratio associated with the variance change, which tests whether adding this variable to the equation introduces a significant increase in variance explained (a sketch of this test appears at the end of this section).

ANOVA Table:
• This reads just like our other ANOVA source tables, and tells you the F-ratio (and its likelihood) associated with the amount of variance your predictors explain in the dependent variable. “Regression” is your effect and “Residual” is your error.

Coefficients (Parameter Estimates):
• Model: In hierarchical regression, you will have more than one model you are comparing. This just shows the predictors used in each equation.
• B: These are the values for the regression equation for predicting the dependent variable from the independent variables. They are called unstandardized coefficients because they are measured in their natural units. As such, the coefficients cannot be compared with one another to determine which one is more influential in the model, because they can be measured on different scales.
• Part/Partial Correlations: A partial correlation is the correlation between the dependent variable and a predictor after removing the correlation that is due to their mutual association with the other variables. A part correlation (aka semipartial) is the correlation between the predictor and the DV when the effects of the other predictors have been removed from the predictor of interest only. Squared partial correlation = unique variance relative to what the other predictors leave unexplained; squared semipartial = the change in R2 when the variable is added to the equation (see the last sketch at the end of this section).

Remember, our regression equation is simply

    Ŷ = b0 + b1X1 + b2X2 + ... + bkXk

or: PREDICTED = Constant + B1(VAR1) + B2(VAR2) + ... + Bk(VARk)

• Std. Error: These are the standard errors associated with the coefficients. The standard error is used to test whether a parameter is significantly different from 0: dividing the parameter estimate by its standard error gives a t-value (see the column with t-values and p-values).
• Beta: These are the standardized coefficients: the coefficients you would obtain if you standardized all of the variables in the regression, including the dependent and all of the independent variables, and ran the regression. By standardizing the variables before running the regression, you put all of the variables on the same scale, so you can compare the magnitudes of the coefficients to see which one has more of an effect. You will also notice that the larger Betas are associated with the larger t-values. You get a t-value and associated significance level for each predictor; this tells you whether the variable explains significant unique variance (i.e., whether it is “useful” or not).
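All of the Model Summary, ANOVA, and Coefficients numbers described above can be reproduced outside SPSS. Below is a minimal sketch in Python with statsmodels, assuming a made-up dataset with one outcome y and two predictors x1 and x2 (the variable names and data are hypothetical, not from any SPSS example):

    # Minimal sketch: where the Model Summary and Coefficients numbers come from.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 100
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    y = 2.0 + 1.5 * x1 + 0.5 * x2 + rng.normal(size=n)

    X = sm.add_constant(np.column_stack([x1, x2]))  # adds the intercept (Constant)
    fit = sm.OLS(y, X).fit()

    print(fit.rsquared)              # R2: proportion of variance explained
    print(fit.rsquared_adj)          # Adjusted R2: penalized for number of predictors
    print(np.sqrt(fit.mse_resid))    # Std. Error of the Estimate = sqrt(Mean Square Residual)
    print(fit.fvalue, fit.f_pvalue)  # ANOVA table: overall F-ratio and its p-value
    print(fit.params)                # B: unstandardized coefficients (b0, b1, b2)
    print(fit.bse)                   # Std. Error of each coefficient
    print(fit.tvalues, fit.pvalues)  # t = B / Std. Error, with p-values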
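For the R2 change test in hierarchical regression, here is a sketch of the same logic on the hypothetical data above: fit the model with and without the added predictor and test whether the gain in R2 is significant. The compare_f_test call is statsmodels' built-in nested-model F-test and should match the hand computation.

    # Sketch of the hierarchical "R2 change" F-test (continues the data above).
    X_step1 = sm.add_constant(x1)                         # Model 1: x1 only
    X_step2 = sm.add_constant(np.column_stack([x1, x2]))  # Model 2: x1 + x2
    fit1 = sm.OLS(y, X_step1).fit()
    fit2 = sm.OLS(y, X_step2).fit()

    r2_change = fit2.rsquared - fit1.rsquared
    m = 1                        # number of predictors added at this step
    df_resid = fit2.df_resid     # n - k - 1 for the full model
    f_change = (r2_change / m) / ((1 - fit2.rsquared) / df_resid)
    print(r2_change, f_change)

    # Equivalent nested-model F-test via statsmodels:
    print(fit2.compare_f_test(fit1))   # (F, p-value, df difference)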
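Finally, a sketch of Beta and the part/partial correlations for x2, again on the hypothetical data and the fit1/fit2 models above. Beta comes from refitting with every variable z-scored; the squared semipartial equals the R2 change when x2 is added, and the squared partial rescales that change by the variance the other predictors leave unexplained.

    # Sketch of Beta and part/partial correlations (continues the sketches above).
    from scipy import stats

    # Beta: refit with all variables standardized (same as B * sd_x / sd_y)
    Z = sm.add_constant(np.column_stack([stats.zscore(x1), stats.zscore(x2)]))
    fit_std = sm.OLS(stats.zscore(y), Z).fit()
    print(fit_std.params[1:])     # standardized coefficients (Betas)

    # Part (semipartial) and partial correlations for x2, from the R2s:
    sr2 = fit2.rsquared - fit1.rsquared   # squared semipartial = R2 change for x2
    pr2 = sr2 / (1 - fit1.rsquared)       # squared partial correlation
    print(np.sqrt(sr2), np.sqrt(pr2))     # magnitudes; the sign follows b2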