- The coefficient of determination. It tells you what percentage of the variation in the output (dependent) variable is explained by the input (independent) variable.
- In other words, this tells you what percentage of the dependent variable's variation around its mean is explained by the independent variables. If you got 90 percent here, that means 90 percent of that variation is explained by the independent data (see the sketch below).
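As a rough illustration (not from the source), here is a minimal Python sketch that computes R squared for a simple one-variable regression; the data and variable names are made up.

```python
import numpy as np

# Hypothetical data: hours studied (x) vs. exam score (y).
x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([52, 55, 61, 60, 68, 70, 75, 79], dtype=float)

# Fit a simple linear regression y = b0 + b1 * x by least squares.
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

# R squared = 1 - (residual sum of squares / total sum of squares):
# the share of y's variation around its mean that the model explains.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R squared: {r_squared:.3f}")
```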
Adjusted R Squared
- This number is used when the regression model has more than one independent variable.
- If you run a multiple regression, with multiple independent variables in the model at the same time, then this is the number to use instead of plain R squared.
- It plays the same role for the whole group of variables that R squared plays for a single variable, while adjusting for how many variables are in the model (a formula sketch follows below).
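A commonly used formula is Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1), where n is the number of observations and k the number of independent variables. A small Python sketch; the example numbers are hypothetical.

```python
def adjusted_r_squared(r_squared, n, k):
    """Adjusted R squared for a model with n observations and k independent variables.

    Penalizes plain R squared for each added variable, so it only rises
    when a new variable improves the fit more than chance alone would.
    """
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# Hypothetical example: R squared of 0.90 from 30 observations and 3 variables.
print(adjusted_r_squared(0.90, n=30, k=3))  # about 0.888
```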
Standard Error of the Regression
- This measurement tells you how much error there is in the overall regression: on average, how far the observed values fall from the fitted line, in the units of the dependent variable.
- It measures the precision of the regression.
- What you're testing determines what level of error you're willing to accept.
- If you are to use your regression as a first step in building a model, then you will need to pay very close attention to this number.
- If you ran a regression with multiple variables and a few of them had a large error (meaning those variables were estimated with very low precision), then you would disregard those variables when building your model (a sketch of the calculation follows below).
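As an illustration (not from the source), a minimal Python sketch of the standard error of the regression; the observed and fitted values are made up.

```python
import numpy as np

def standard_error_of_regression(y, y_hat, k):
    """Standard error of the regression (residual standard error).

    y     : observed values of the dependent variable
    y_hat : fitted values from the regression
    k     : number of independent variables in the model
    """
    y, y_hat = np.asarray(y, dtype=float), np.asarray(y_hat, dtype=float)
    ss_res = np.sum((y - y_hat) ** 2)           # residual sum of squares
    return np.sqrt(ss_res / (len(y) - k - 1))   # typical size of a residual

# Hypothetical usage: observed vs. fitted values from a one-variable model.
print(standard_error_of_regression([52, 55, 61, 60], [53, 56, 59, 61], k=1))
```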
Sum of Squares (SS)
- Regression Mean Square (MSR). This is the regression sum of squares divided by the regression degrees of freedom, whereas the Mean Square Error (MSE) is the residual sum of squares divided by the residual degrees of freedom.
- Significance F. This gives the significance level (the p-value) for the overall regression; the test statistic F = MSR / MSE is compared against the critical point of the F distribution (see the sketch below).
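A minimal Python sketch of these ANOVA-table quantities for a one-variable regression, assuming the same kind of made-up data as above; it uses scipy's F distribution to get Significance F.

```python
import numpy as np
from scipy import stats

# Hypothetical simple regression: one independent variable (k = 1).
x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([52, 55, 61, 60, 68, 70, 75, 79], dtype=float)
n, k = len(y), 1

b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

ssr = np.sum((y_hat - y.mean()) ** 2)   # regression sum of squares
sse = np.sum((y - y_hat) ** 2)          # residual (error) sum of squares

msr = ssr / k                           # regression mean square: SSR / regression df
mse = sse / (n - k - 1)                 # mean square error: SSE / residual df

f_stat = msr / mse
significance_f = stats.f.sf(f_stat, k, n - k - 1)   # p-value for the overall model
print(f"F = {f_stat:.2f}, Significance F = {significance_f:.5f}")
```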
F distribution
- An F statistic is a value you get when you run an Analysis of Variance (ANOVA) test or a regression analysis to find out whether the means of two populations are significantly different.
- It's similar to the t statistic from a t-test.
- A t-test will tell you if a single variable is statistically significant, and an F test will tell you if a group of variables is jointly significant (a sketch follows below).
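As a quick illustration (not from the source), a Python sketch comparing the means of two made-up groups with a one-way ANOVA, and showing that with exactly two groups the F test agrees with the t-test.

```python
import numpy as np
from scipy import stats

# Hypothetical scores from two groups; are their means significantly different?
group_a = np.array([52, 55, 61, 60, 58], dtype=float)
group_b = np.array([68, 70, 75, 79, 72], dtype=float)

# One-way ANOVA gives an F statistic and its p-value.
f_stat, p_value = stats.f_oneway(group_a, group_b)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# With exactly two groups, the F test matches the two-sample t-test: F = t^2.
t_stat, _ = stats.ttest_ind(group_a, group_b)
print(np.isclose(f_stat, t_stat ** 2))  # True
```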
(Source: "Statistics 101")