What is it about?

We make the case that polynomial regression with second- and third-order terms should be part of every applied practitioner's standard model-building toolbox, and should be taught to new students of the subject as the default technique for modeling nonlinearity. Polynomial regression is superior to nonparametric alternatives for non-statisticians because of its ease of interpretation, its flexibility, and its lack of reliance on sophisticated mathematics. Low-order polynomial regression can effectively model compact floor and ceiling effects and local linearity, and it can prevent the inference of spurious interaction effects between distinct predictors when none are present. We also argue that the case against polynomial regression is largely specious, resting on misconceptions about the method, strawman arguments, or historical artifacts.
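As a concrete illustration, the sketch below fits a regression with second- and third-order terms of a single predictor and compares it to a straight-line fit. This is a minimal sketch assuming simulated data and the Python statsmodels and pandas libraries; the variable names and the AIC comparison are illustrative choices, not taken from the paper.

```python
# Minimal sketch: a regression with second- and third-order terms of one
# predictor, fit to simulated data (all names here are illustrative).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=200)
# Simulated outcome with genuine curvature plus noise.
y = 1.0 + 0.8 * x - 0.5 * x**2 + 0.1 * x**3 + rng.normal(0, 0.3, size=200)
df = pd.DataFrame({"x": x, "y": y})

# I(x**2) and I(x**3) add the polynomial terms inside the formula,
# so the model stays linear in its coefficients.
cubic = smf.ols("y ~ x + I(x**2) + I(x**3)", data=df).fit()
print(cubic.summary())

# Compare against the straight-line fit, e.g. via AIC.
linear = smf.ols("y ~ x", data=df).fit()
print("AIC linear:", linear.aic, " AIC cubic:", cubic.aic)
```

Because the polynomial terms enter the design matrix as ordinary columns, the fit remains an ordinary linear model and standard coefficient inference applies.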


Why is it important?

Polynomial regression is easily one of the most misunderstood and underutilized techniques in the statistical toolbox. Including polynomial forms of predictors can drastically improve model validity and fit, strengthening a model's explanatory power and yielding better predictions over wider population domains. This is true whether you are fitting an ordinary linear regression model or a complex generalized linear mixed model (GLMM).
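To show how the same idea carries over to multilevel settings, the sketch below adds a quadratic term to a mixed model with a random intercept per cluster. It is a minimal sketch on simulated data; note that statsmodels' mixedlm fits a linear mixed model rather than a full GLMM, and all names here are hypothetical rather than drawn from the paper.

```python
# Minimal sketch: a quadratic term as a fixed effect in a linear mixed
# model with a random intercept per cluster (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_clusters, n_per = 20, 30
cluster = np.repeat(np.arange(n_clusters), n_per)
x = rng.uniform(-2, 2, size=n_clusters * n_per)
u = rng.normal(0, 0.5, size=n_clusters)  # cluster-level random intercepts
y = 0.5 + 1.2 * x - 0.6 * x**2 + u[cluster] + rng.normal(0, 0.4, size=x.size)
df = pd.DataFrame({"x": x, "y": y, "cluster": cluster})

# The quadratic term is just another fixed-effect column in the design
# matrix; the random intercept is grouped by cluster.
model = smf.mixedlm("y ~ x + I(x**2)", data=df, groups=df["cluster"]).fit()
print(model.summary())
```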

Perspectives

If you're not considering polynomial forms of your predictors when building a parametric regression model, you cannot justifiably make claims of "model validity" or "superior model fit."

Edward Kroc
University of British Columbia

Read the Original

This page is a summary of: The case for the curve: Parametric regression with second- and third-order polynomial functions of predictors should be routine. Psychological Methods, December 2023, American Psychological Association (APA). DOI: 10.1037/met0000629
