  1. regression - Interpreting the residuals vs. fitted values plot for ...

    Therefore, the second and third plots, which seem to indicate dependency between the residuals and the fitted values, suggest a different model. But why does the second plot suggest, as …

  2. How should outliers be dealt with in linear regression analysis?

    Oftentimes a statistical analyst is handed a dataset and asked to fit a model using a technique such as linear regression. Very frequently the dataset is accompanied by a disclaimer similar...

  3. regression - When is R squared negative? - Cross Validated

    For linear regression with no constraints, R² must be positive (or zero) and equals the square of the correlation coefficient, r. A negative R² is only possible with linear …
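
    A minimal numeric sketch of that claim, on made-up data with the fit forced through the origin (the "constrained" case); plain NumPy, not code from the thread itself:

      import numpy as np

      # hypothetical data with a clear downward trend
      x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
      y = np.array([10.0, 9.0, 8.0, 7.0, 6.0])

      # least-squares fit through the origin (no intercept): slope = sum(x*y) / sum(x^2)
      slope = (x @ y) / (x @ x)
      resid = y - slope * x

      # R² computed against the mean of y can drop below zero for a no-intercept fit
      ss_res = resid @ resid
      ss_tot = ((y - y.mean()) ** 2).sum()
      r2 = 1.0 - ss_res / ss_tot
      print(r2)  # about -10: the origin-constrained line fits worse than a flat line at the mean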

  4. Minimal number of points for a linear regression

    Feb 10, 2023 · What would be a "reasonable" minimal number of observations to look for a trend over time with a linear regression? What about fitting a quadratic model? I work with composite …

  5. Why is ANOVA equivalent to linear regression? - Cross Validated

    Oct 4, 2015 · ANOVA and linear regression are equivalent when the two models test against the same hypotheses and use an identical encoding. The models differ in their basic aim: ANOVA …
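
    A sketch of that equivalence on made-up data (a one-way layout with three hypothetical groups): the classic ANOVA F statistic matches the overall F from an OLS fit with an intercept and two dummy codes.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      g1 = rng.normal(0.0, 1.0, 20)
      g2 = rng.normal(0.5, 1.0, 20)
      g3 = rng.normal(1.0, 1.0, 20)
      y = np.concatenate([g1, g2, g3])

      # classic one-way ANOVA
      f_anova, _ = stats.f_oneway(g1, g2, g3)

      # the same test as OLS: intercept + dummies for groups 2 and 3 (group 1 is the reference)
      d2 = np.r_[np.zeros(20), np.ones(20), np.zeros(20)]
      d3 = np.r_[np.zeros(40), np.ones(20)]
      X = np.column_stack([np.ones_like(y), d2, d3])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      resid = y - X @ beta
      ss_res = resid @ resid
      ss_tot = ((y - y.mean()) ** 2).sum()
      f_ols = ((ss_tot - ss_res) / 2) / (ss_res / (len(y) - 3))

      print(f_anova, f_ols)  # identical up to floating-point error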

  6. regression - Why does adding more terms into a linear model …

    Jan 12, 2015 · Many statistics textbooks state that adding more terms into a linear model always reduces the sum of squares and in turn increases the r-squared value. This has led to the use …
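
    One way to see that monotonicity, assuming pure-noise predictors added one at a time to a plain NumPy least-squares fit: in-sample R² never decreases.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 50
      y = rng.normal(size=n)
      noise = rng.normal(size=(n, 10))  # ten predictors unrelated to y

      X = np.ones((n, 1))  # start from an intercept-only model
      for j in range(noise.shape[1]):
          X = np.column_stack([X, noise[:, j]])
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta
          r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
          print(j + 1, round(r2, 4))  # R² is non-decreasing even though every added term is noise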

  7. regression - Does over fitting a model affect R Squared only or ...

    Sep 10, 2019 · In a nice, straightforward linear model (no penalization of parameters, no model building, just a single pre-specified model, etc.), R² is meant to tell you what proportion of the …
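
    A sketch of why that in-sample proportion becomes misleading under overfitting, using made-up pure-noise data with nearly as many predictors as observations: the in-sample R² is large, while the same coefficients score near zero (or negative) on a fresh sample.

      import numpy as np

      rng = np.random.default_rng(3)
      n, p = 30, 25
      X_train = rng.normal(size=(n, p))
      X_test = rng.normal(size=(n, p))
      y_train = rng.normal(size=n)  # outcome unrelated to the predictors
      y_test = rng.normal(size=n)

      def r_squared(X, y, beta):
          resid = y - X @ beta
          return 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

      Xt = np.column_stack([np.ones(n), X_train])
      beta, *_ = np.linalg.lstsq(Xt, y_train, rcond=None)

      print(r_squared(Xt, y_train, beta))  # large: the model has fit the noise
      print(r_squared(np.column_stack([np.ones(n), X_test]), y_test, beta))  # near zero or negative out of sample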

  8. Linear Regression For Binary Independent Variables - Interpretation

    Jan 18, 2019 · For linear regression, you would code the variables as dummy variables (1/0 for presence/absence) and interpret the predictors as "the presence of this variable increases …
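
    A small sketch of that interpretation with one hypothetical 0/1 predictor: in a simple regression its coefficient is exactly the difference between the two group means.

      import numpy as np

      # made-up outcome for absence (0) and presence (1) of a condition
      presence = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0])
      y = np.array([3.0, 4.0, 5.0, 4.0, 7.0, 8.0, 6.0, 7.0])

      X = np.column_stack([np.ones_like(y), presence])
      (intercept, slope), *_ = np.linalg.lstsq(X, y, rcond=None)

      print(intercept)  # 4.0: mean of the absence group
      print(slope)      # 3.0: mean(presence) - mean(absence) = 7.0 - 4.0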

  9. What happens when we introduce more variables to a linear …

    Feb 22, 2020 · What happens when we introduce more variables to a linear regression model?

  10. When is it ok to remove the intercept in a linear regression model ...

    The standard regression model is parametrized as intercept + k - 1 dummy vectors. The intercept codes the expected value for the "reference" group, or the omitted vector, and the remaining …
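
    A sketch of the two parametrizations on three made-up groups: with an intercept plus k - 1 dummies the slopes are differences from the reference group; dropping the intercept and using all k dummies re-expresses the same fit as the group means.

      import numpy as np

      rng = np.random.default_rng(2)
      y = np.concatenate([rng.normal(m, 1.0, 15) for m in (2.0, 4.0, 7.0)])
      g = np.repeat([0, 1, 2], 15)
      dummies = (g[:, None] == np.arange(3)).astype(float)  # one indicator column per group

      # reference-group coding: intercept + dummies for groups 1 and 2
      X_ref = np.column_stack([np.ones(len(y)), dummies[:, 1:]])
      b_ref, *_ = np.linalg.lstsq(X_ref, y, rcond=None)

      # cell-means coding: no intercept, one dummy per group
      b_cell, *_ = np.linalg.lstsq(dummies, y, rcond=None)

      print(b_ref)   # [mean0, mean1 - mean0, mean2 - mean0]
      print(b_cell)  # [mean0, mean1, mean2] -- the same fitted values, reparametrized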