16

Advanced Regression Models

In this chapter we extend the simple linear regression concepts that were introduced in Chapter 10. The first, quite natural, idea is to build a linear regression model involving more than one regressor. Finding the parameters by ordinary least squares (OLS) is a rather straightforward exercise, as we see in Section 16.1. What is much less straightforward is the statistical side of the coin, since the presence of multiple variables introduces some new issues. In Section 16.2 we consider the problems of testing a multiple regression model, selecting regressor variables, and assessing forecasting uncertainty. We do so for the simpler case of nonstochastic regressors and under restrictive assumptions about the errors, i.e., independence, homoskedasticity, and normality. Even within this limited framework we may appreciate issues like bias from omitted variables and multicollinearity. An understanding of their impact is essential from the user's point of view. In fact, given the computational power of statistical software, it is tempting to build a huge model encompassing a rich set of explanatory variables. In practice, this may be a dangerous route, and a sound parsimony principle should always be kept in mind.
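To make the OLS idea concrete, the following is a minimal sketch of fitting a multiple regression with two regressors on synthetic data; the variable names and true coefficients are illustrative assumptions, not taken from the chapter. The estimate solves the usual least-squares problem for the design matrix with an intercept column.

```python
# Minimal OLS sketch with two regressors (synthetic, illustrative data).
# The closed-form solution is beta_hat = (X'X)^{-1} X'y; we use lstsq
# for numerical stability instead of inverting X'X explicitly.
import numpy as np

rng = np.random.default_rng(42)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# assumed "true" model: y = 3 + 2*x1 - 1.5*x2 + noise
y = 3.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # roughly [3.0, 2.0, -1.5]
```

With nonstochastic regressors and well-behaved errors, the estimates recover the assumed coefficients up to sampling noise; the statistical subtleties the section mentions arise when judging how reliable those estimates are.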

Multiple linear regression models often include categorical regressors, which are typically accounted for by dummy, or binary, variables. When it is the regressed variable that is categorical, a crude application of regression modeling may lead ...
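The dummy-variable encoding mentioned above can be sketched as follows; the category names are hypothetical. One level is dropped as a baseline, since including a dummy for every level alongside an intercept would make the design matrix perfectly collinear (the so-called dummy variable trap).

```python
# Sketch of dummy (binary) variables for a categorical regressor.
# "north" is chosen as the baseline level and gets no column.
import numpy as np

regions = np.array(["north", "south", "west", "south", "north", "west"])
levels = ["north", "south", "west"]
# one 0/1 column per non-baseline level
dummies = np.column_stack(
    [(regions == lvl).astype(float) for lvl in levels[1:]]
)
print(dummies)
```

Each row then has at most a single 1, and the baseline level is represented by a row of zeros; its effect is absorbed into the intercept.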
