
### Posterior Distribution

In Bayesian analysis, the posterior distribution, or posterior, is the distribution of a set of unknown parameters, latent variables, or otherwise missing variables of interest, conditional on the current data. The posterior distribution uses the current data to update previous knowledge, called a prior, about those parameters. A posterior distribution, p(θ|x), is derived using Bayes's theorem

$p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{p(x)} = \frac{p(x \mid \theta)\, p(\theta)}{\int p(x \mid \theta)\, p(\theta)\, d\theta},$

where θ is the unknown parameter(s) and x is the current data. The probability of the data given the parameter, p(x|θ), is the likelihood L(θ|x). The prior distribution, p(θ), is user-specified to represent prior knowledge about the unknown parameter(s). The last piece of Bayes's theorem, the marginal distribution of the data, p(x), is computed by integrating the product of the likelihood and the prior over the parameter space. The distribution of the posterior is determined by the ...
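The pieces of Bayes's theorem above can be illustrated numerically. The following sketch, using hypothetical data (7 successes in 10 Bernoulli trials) and a uniform prior on θ, approximates the posterior on a grid: the likelihood and prior are multiplied pointwise, and the marginal p(x) is approximated by a Riemann sum so the result integrates to 1.

```python
import numpy as np

# Hypothetical data: 7 successes in 10 Bernoulli trials.
x = np.array([1, 1, 1, 0, 1, 1, 0, 1, 1, 0])
k, n = x.sum(), x.size

# Grid of candidate values for the unknown parameter theta.
theta = np.linspace(0.001, 0.999, 999)
dtheta = theta[1] - theta[0]

# Prior p(theta): uniform over (0, 1).
prior = np.ones_like(theta)

# Likelihood L(theta | x) = p(x | theta) for Bernoulli data.
likelihood = theta**k * (1 - theta)**(n - k)

# Marginal p(x): integral of likelihood * prior, via a Riemann sum.
unnormalized = likelihood * prior
marginal = (unnormalized * dtheta).sum()

# Posterior p(theta | x) = p(x | theta) p(theta) / p(x) on the grid.
posterior = unnormalized / marginal

# Posterior mean on the grid; with a uniform prior the exact
# posterior is Beta(k + 1, n - k + 1), whose mean is (k + 1)/(n + 2).
post_mean = (theta * posterior * dtheta).sum()
```

With a uniform (conjugate Beta(1, 1)) prior, the grid result can be checked against the closed-form Beta posterior; for non-conjugate priors, the same grid approximation still applies, which is the practical appeal of this approach.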
