R-Squared vs. Adjusted R-Squared: An Overview

One major difference between R-squared and the adjusted R-squared is that R-squared assumes that every independent variable in the model explains the variation in the dependent variable: it reports the percentage of variation explained as if all of the independent variables affect the dependent variable. Adjusted R-squared, by contrast, reports the percentage of variation explained by only those independent variables that actually affect the dependent variable.
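The adjustment itself is a simple formula: adjusted R-squared shrinks R-squared by a factor that grows with the number of predictors. Here is a minimal sketch in Python (the function name and example figures are illustrative):

```python
def adjusted_r_squared(r2, n, k):
    """Adjust R-squared for the number of predictors.

    r2 : ordinary R-squared of the fitted model
    n  : number of observations
    k  : number of independent variables (excluding the intercept)
    """
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# With 20 observations and 3 predictors, an R-squared of 0.80
# adjusts down to 0.7625.
print(adjusted_r_squared(0.80, 20, 3))  # → 0.7625
```

Because the penalty factor (n - 1) / (n - k - 1) exceeds 1 whenever there is at least one predictor, adjusted R-squared is always at or below R-squared.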

R-Squared

R-squared cannot verify whether the coefficient estimates and predictions are biased. Nor does it indicate whether a regression model is adequate: it can show a low R-squared figure for a good model, or a high R-squared figure for a model that does not fit the data.

Adjusted R-Squared

The adjusted R-squared compares the explanatory power of regression models that contain different numbers of predictors. Every predictor added to a model increases R-squared and never decreases it, so a model with more terms may appear to fit better simply because it has more terms.

The adjusted R-squared compensates for the addition of variables: it increases only when a new term improves the model by more than would be expected by chance, and it decreases when a predictor improves the model by less than would be expected by chance. In an overfit model, R-squared is misleadingly high and predictive ability on new data suffers; adjusted R-squared penalizes this.
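A quick simulation illustrates both points: adding even a random, irrelevant predictor never lowers R-squared, while adjusted R-squared applies a penalty for it. This is an illustrative sketch using NumPy with synthetic data (the variable names are made up):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)   # y truly depends only on x
junk = rng.normal(size=n)          # an irrelevant predictor

def r_squared(X, y):
    """Ordinary R-squared of an OLS fit of y on X (X includes the intercept)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / np.sum((y - y.mean()) ** 2)

def adjusted(r2, n, k):
    """Adjusted R-squared, with k predictors excluding the intercept."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

X1 = np.column_stack([np.ones(n), x])        # model 1: intercept + x
X2 = np.column_stack([np.ones(n), x, junk])  # model 2: adds the junk term

r2_1, r2_2 = r_squared(X1, y), r_squared(X2, y)
print(r2_2 >= r2_1)   # R-squared never decreases when a term is added: True
print(adjusted(r2_1, n, 1), adjusted(r2_2, n, 2))
```

With a truly irrelevant predictor, R-squared ticks up slightly while adjusted R-squared typically moves down or rises by less, because the (n - 1) / (n - k - 1) penalty grows with every added term.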

R-squared is often used with linear regressions to help predict stock price movements, but it is just one of many technical indicators that traders should have in their arsenals.

The adjusted R-squared is a modified version of R-squared that accounts for the number of predictors in a model. Adjusted R-squared can be negative, though it usually is not, while R-squared always falls between 0 and 100%; R-squared reflects the linear relationship in the sample of data even when there is no underlying relationship in the population. Adjusted R-squared is the better estimate of the degree of relationship in the underlying population. When comparing models with R-squared alone, you would pick the model with the highest value, but the better and easier approach is to select the model with the higher adjusted R-squared. Adjusted R-squared is not designed for comparing nonlinear models; it is meant for multiple linear regressions.
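A small numerical comparison shows why the adjusted figure is the better model-selection criterion. The two models below are hypothetical; the R-squared values, sample size, and predictor counts are made up for illustration:

```python
def adjusted_r_squared(r2, n, k):
    # k = number of predictors, excluding the intercept
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

n = 25
r2_a, k_a = 0.78, 2   # Model A: lean, two predictors
r2_b, k_b = 0.80, 6   # Model B: six predictors, slightly higher R-squared

adj_a = adjusted_r_squared(r2_a, n, k_a)   # 0.76
adj_b = adjusted_r_squared(r2_b, n, k_b)   # ≈ 0.7333

# Model B "wins" on raw R-squared, but Model A wins on adjusted
# R-squared: its extra explained variation did not justify four
# additional terms.
print(adj_a, adj_b)
```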

Key Takeaways:

  • One major difference between R-squared and the adjusted R-squared is that R-squared assumes that every independent variable in the model explains the variation in the dependent variable.
  • R-squared cannot verify whether the coefficient estimates and predictions are biased.
  • The adjusted R-squared is a modified version of R-squared that accounts for the number of predictors in a model.