### Heteroskedasticity

One of the standard assumptions of the classical linear regression model

$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + \varepsilon_i; \quad i = 1, \ldots, N$

is that the variance of the error term $\varepsilon_i$ is the same for all observations, that is, $\mathrm{Var}\left(\varepsilon_i \mid x_{i1}, x_{i2}, \ldots, x_{ik}\right) = \sigma^2$. The assumption of a constant error variance is known as homoskedasticity, and its failure is referred to as heteroskedasticity, or unequal variance. Heteroskedasticity is expressed as $\mathrm{Var}\left(\varepsilon_i \mid x_{i1}, x_{i2}, \ldots, x_{ik}\right) = \sigma_i^2$, where the $i$ subscript on $\sigma^2$ indicates that the variance of the error is no longer constant but may vary from observation to observation. Note that the alternate spellings homoscedasticity and heteroscedasticity are also commonly used. This entry first reviews when heteroskedasticity typically arises. The consequences of heteroskedastic errors are then discussed, followed by sections describing the detection of and solutions for heteroskedasticity.
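The contrast between a constant and an observation-varying error variance can be illustrated by simulation. The sketch below (a hypothetical example, not from this entry; the choice of $\sigma_i = 0.5\,x_i$ is an assumption made purely for illustration) generates errors whose conditional standard deviation grows with a single regressor, so the error spread differs across observations:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000
x = rng.uniform(1.0, 10.0, size=N)

# Homoskedastic errors: Var(eps_i | x_i) = 1 for every observation.
eps_homo = rng.normal(0.0, 1.0, size=N)

# Heteroskedastic errors: the standard deviation is assumed (for
# illustration) to grow with x, so Var(eps_i | x_i) = (0.5 * x_i)^2
# varies from observation to observation.
eps_het = rng.normal(0.0, 0.5 * x)

beta0, beta1 = 2.0, 3.0
y = beta0 + beta1 * x + eps_het

# The sample variance of the errors is larger where x is larger,
# while the homoskedastic errors show no such pattern.
lo, hi = eps_het[x < 5.5], eps_het[x >= 5.5]
print(np.var(lo) < np.var(hi))
```

Plotting `eps_het` against `x` would show the familiar "fan" shape that often signals heteroskedasticity in residual plots.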

Heteroskedasticity is often encountered when using cross-section data, when ...