
Gamma distribution

by Marco Taboga, PhD

The Gamma distribution is a generalization of the Chi-square distribution.

It plays a fundamental role in statistics because estimators of variance often have a Gamma distribution.


Caveat

There are several equivalent parametrizations of the Gamma distribution.

We present one that is particularly convenient in Bayesian applications, and we discuss how it maps to alternative parametrizations.

In our presentation, a Gamma random variable X has two parameters:

- the number of degrees of freedom $n$;

- the mean $h$ of the distribution.

How it arises

Let $Z_{1},\ldots ,Z_{n}$ be independent normal random variables with zero mean and unit variance.

The variable $Y=\sum_{i=1}^{n}Z_{i}^{2}$ has a Chi-square distribution with $n$ degrees of freedom.

If $h$ is a strictly positive constant, then the random variable $X$ defined as $X=\frac{h}{n}\sum_{i=1}^{n}Z_{i}^{2}$ has a Gamma distribution with parameters $n$ and $h$.

Therefore, a Gamma variable $X$ with parameters $n$ and $h$ can also be written as the sum of the squares of $n$ independent normals $Y_{i}=\sqrt{h/n}\,Z_{i}$ having zero mean and variance equal to $h/n$: $X=\sum_{i=1}^{n}Y_{i}^{2}$.

In general, the sum of independent squared normal variables that have zero mean and a common (but otherwise arbitrary) variance has a Gamma distribution.

Yet another way to see $X$ is as the sample variance of $n$ normal variables $W_{1},\ldots ,W_{n}$ with zero mean and variance $h$: $X=\frac{1}{n}\sum_{i=1}^{n}W_{i}^{2}$.

Infographic summarizing the relations among the normal, chi-square and gamma distributions.
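If you want to see these relations at work, the following snippet (a sketch; the parameter values $n=6$ and $h=3$ are arbitrary choices for illustration) builds a Gamma variable as a scaled sum of squared standard normals and checks its first two sample moments against the values derived later in this lecture:

```python
import numpy as np

# Illustrative parameters (arbitrary choice): n degrees of freedom, mean h.
n, h = 6, 3.0
rng = np.random.default_rng(0)

# X = (h/n) * (Z_1^2 + ... + Z_n^2), with Z_i independent standard normals.
Z = rng.standard_normal((200_000, n))
X = (h / n) * (Z**2).sum(axis=1)

sample_mean = X.mean()  # for a Gamma(n, h) variable this should be close to h
sample_var = X.var()    # and this should be close to 2*h**2/n (derived below)
```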

Definition

Gamma random variables are characterized as follows.

Definition Let $X$ be a continuous random variable. Let its support be the set of positive real numbers: $R_{X}=(0,\infty )$. Let $n,h>0$. We say that $X$ has a Gamma distribution with parameters $n$ and $h$ if and only if its probability density function is $f_{X}(x)=cx^{n/2-1}\exp \left( -\frac{n}{2h}x\right)$ for $x\in R_{X}$ (and $f_{X}(x)=0$ otherwise), where $c$ is a constant: $c=\frac{\left( n/(2h)\right) ^{n/2}}{\Gamma (n/2)}$ and $\Gamma (\cdot )$ is the Gamma function.

To better understand the Gamma distribution, you can have a look at its density plots.
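The density in the definition can also be checked numerically. The sketch below (assuming SciPy is available; parameter values are arbitrary) writes the pdf directly from the definition and compares it with SciPy's gamma distribution, which uses a shape/scale convention that maps to ours as shape $=n/2$, scale $=2h/n$:

```python
import numpy as np
from scipy.special import gamma as gamma_fn
from scipy.stats import gamma

def gamma_pdf(x, n, h):
    """Density of the (n, h) parametrization, written from the definition."""
    c = (n / (2 * h)) ** (n / 2) / gamma_fn(n / 2)
    return c * x ** (n / 2 - 1) * np.exp(-n * x / (2 * h))

n, h = 6, 3.0
x = np.linspace(0.1, 10.0, 50)
# SciPy's gamma uses shape a and scale; the mapping is a = n/2, scale = 2h/n.
max_abs_diff = np.max(np.abs(gamma_pdf(x, n, h) - gamma.pdf(x, a=n / 2, scale=2 * h / n)))
```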

Alternative parametrizations

Here we discuss two alternative parametrizations reported on Wikipedia. You can safely skip this section on a first reading.

The first alternative parametrization is obtained by setting $k=n/2$ (the shape parameter) and $\theta =2h/n$ (the scale parameter), under which the density becomes $f_{X}(x)=\frac{1}{\Gamma (k)\theta ^{k}}x^{k-1}e^{-x/\theta }$ for $x>0$.

The second alternative parametrization is obtained by setting $\alpha =n/2$ (the shape parameter) and $\beta =n/(2h)$ (the rate parameter), under which the density becomes $f_{X}(x)=\frac{\beta ^{\alpha }}{\Gamma (\alpha )}x^{\alpha -1}e^{-\beta x}$ for $x>0$.

Although these two parametrizations yield more compact expressions for the density, the one we present often generates more readable results when it is used in Bayesian statistics and in variance estimation.
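To make the mappings concrete, the following sketch (parameter values arbitrary) evaluates the density in all three parametrizations at a few points and checks that they coincide:

```python
import math

def pdf_nh(x, n, h):
    # (n, h) parametrization used in this lecture.
    c = (n / (2 * h)) ** (n / 2) / math.gamma(n / 2)
    return c * x ** (n / 2 - 1) * math.exp(-n * x / (2 * h))

def pdf_shape_scale(x, k, theta):
    # First alternative: shape k, scale theta.
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

def pdf_shape_rate(x, alpha, beta):
    # Second alternative: shape alpha, rate beta.
    return beta ** alpha * x ** (alpha - 1) * math.exp(-beta * x) / math.gamma(alpha)

n, h = 6, 3.0
k, theta = n / 2, 2 * h / n       # first mapping:  k = n/2, theta = 2h/n
alpha, beta = n / 2, n / (2 * h)  # second mapping: alpha = n/2, beta = n/(2h)

points = [0.5, 1.0, 2.5, 6.0]
all_agree = all(
    abs(pdf_nh(x, n, h) - pdf_shape_scale(x, k, theta)) < 1e-12
    and abs(pdf_nh(x, n, h) - pdf_shape_rate(x, alpha, beta)) < 1e-12
    for x in points
)
```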

Expected value

The expected value of a Gamma random variable $X$ is $\mathrm{E}[X]=h$.

Proof

The mean can be derived as follows: $\mathrm{E}[X]=\int_{0}^{\infty }x\,cx^{n/2-1}\exp \left( -\frac{n}{2h}x\right) dx=c\int_{0}^{\infty }x^{n/2}\exp \left( -\frac{n}{2h}x\right) dx=c\,\Gamma \left( \frac{n}{2}+1\right) \left( \frac{2h}{n}\right) ^{n/2+1}=\frac{\left( n/(2h)\right) ^{n/2}}{\Gamma (n/2)}\cdot \frac{n}{2}\Gamma \left( \frac{n}{2}\right) \left( \frac{2h}{n}\right) ^{n/2+1}=\frac{n}{2}\cdot \frac{2h}{n}=h$, where we have used the Gamma integral $\int_{0}^{\infty }x^{s-1}e^{-x/\lambda }dx=\Gamma (s)\lambda ^{s}$ and the recursion $\Gamma (s+1)=s\Gamma (s)$.

Variance

The variance of a Gamma random variable $X$ is $\mathrm{Var}[X]=\frac{2h^{2}}{n}$.

Proof

It can be derived thanks to the usual variance formula ($\mathrm{Var}[X]=\mathrm{E}[X^{2}]-\mathrm{E}[X]^{2}$): $\mathrm{E}[X^{2}]=c\int_{0}^{\infty }x^{n/2+1}\exp \left( -\frac{n}{2h}x\right) dx=c\,\Gamma \left( \frac{n}{2}+2\right) \left( \frac{2h}{n}\right) ^{n/2+2}=\left( \frac{n}{2}+1\right) \frac{n}{2}\left( \frac{2h}{n}\right) ^{2}=h^{2}+\frac{2h^{2}}{n}$, so that $\mathrm{Var}[X]=h^{2}+\frac{2h^{2}}{n}-h^{2}=\frac{2h^{2}}{n}$.
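Both moments can also be double-checked through the shape/scale form of the Gamma distribution (shape $k=n/2$, scale $\theta =2h/n$), for which mean $=k\theta$ and variance $=k\theta ^{2}$ are standard results. A small worked check with arbitrary parameter values:

```python
# Shape/scale form: mean = k*theta, variance = k*theta**2.
# With k = n/2 and theta = 2h/n these reduce to h and 2*h**2/n.
n, h = 8, 5.0
k, theta = n / 2, 2 * h / n

mean = k * theta         # (n/2) * (2h/n) = h
variance = k * theta**2  # (n/2) * (2h/n)**2 = 2*h**2/n
```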

Moment generating function

The moment generating function of a Gamma random variable $X$ is defined for any $t<\frac{n}{2h}$: $M_{X}(t)=\left( 1-\frac{2ht}{n}\right) ^{-n/2}$.

Proof

By using the definition of moment generating function, we obtain $M_{X}(t)=\mathrm{E}[e^{tX}]=\int_{0}^{\infty }e^{tx}cx^{n/2-1}\exp \left( -\frac{n}{2h}x\right) dx=c\int_{0}^{\infty }x^{n/2-1}\exp \left( -\left( \frac{n}{2h}-t\right) x\right) dx$. Define $\overline{h}=\frac{nh}{n-2ht}$, so that $\frac{n}{2\overline{h}}=\frac{n}{2h}-t$; up to a constant, the last integral is the integral of the probability density function of a Gamma random variable with parameters $n$ and $\overline{h}$, which equals 1, so the integral itself equals $\frac{\Gamma (n/2)}{\left( n/(2\overline{h})\right) ^{n/2}}$. Thus, $M_{X}(t)=c\cdot \frac{\Gamma (n/2)}{\left( n/(2\overline{h})\right) ^{n/2}}=\left( \frac{\overline{h}}{h}\right) ^{n/2}=\left( 1-\frac{2ht}{n}\right) ^{-n/2}$. Of course, the above integrals converge only if $\frac{n}{2h}-t>0$, i.e. only if $t<\frac{n}{2h}$. Therefore, the moment generating function of a Gamma random variable exists for all $t<\frac{n}{2h}$.
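As a sanity check, the mgf can be compared with a Monte Carlo estimate of $\mathrm{E}[e^{tX}]$ at a point $t$ inside the domain $t<\frac{n}{2h}$ (a sketch; parameter values are arbitrary):

```python
import numpy as np

n, h = 6, 3.0
t = 0.25  # here n/(2h) = 1, so t = 0.25 is well inside the domain of the mgf
rng = np.random.default_rng(1)

# Simulate X = (h/n) * chi-square(n) and estimate E[exp(t*X)].
Z = rng.standard_normal((400_000, n))
X = (h / n) * (Z**2).sum(axis=1)
mc_mgf = np.exp(t * X).mean()

exact_mgf = (1 - 2 * h * t / n) ** (-n / 2)  # (1 - 0.25)**(-3) = 64/27
```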

Characteristic function

The characteristic function of a Gamma random variable $X$ is $\varphi _{X}(t)=\left( 1-\frac{2iht}{n}\right) ^{-n/2}$.

Proof

It can be derived by using the definition of characteristic function and a Taylor series expansion of $e^{itX}$, or by replicating the argument used for the moment generating function with $it$ in place of $t$: $\varphi _{X}(t)=\mathrm{E}[e^{itX}]=\left( 1-\frac{2iht}{n}\right) ^{-n/2}$.

Distribution function

The distribution function of a Gamma random variable is $F_{X}(x)=\frac{\gamma \left( \frac{n}{2},\frac{n}{2h}x\right) }{\Gamma (n/2)}$ for $x>0$ (and $F_{X}(x)=0$ for $x\leq 0$), where the function $\gamma (s,x)=\int_{0}^{x}t^{s-1}e^{-t}dt$ is called lower incomplete Gamma function and is usually evaluated using specialized computer algorithms.

Proof

This is proved as follows: for $x>0$, $F_{X}(x)=\int_{0}^{x}ct^{n/2-1}\exp \left( -\frac{n}{2h}t\right) dt$; the change of variable $s=\frac{n}{2h}t$ yields $F_{X}(x)=\frac{1}{\Gamma (n/2)}\int_{0}^{nx/(2h)}s^{n/2-1}e^{-s}ds=\frac{\gamma \left( \frac{n}{2},\frac{n}{2h}x\right) }{\Gamma (n/2)}$.
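The ratio $\gamma (n/2,nx/(2h))/\Gamma (n/2)$ is exactly what scientific libraries compute; for example, SciPy's gammainc is the regularized (already divided by $\Gamma$) lower incomplete Gamma function. A sketch, assuming SciPy is available, with arbitrary parameter values and a simulated cross-check:

```python
import numpy as np
from scipy.special import gammainc  # regularized: gammainc(s, x) = gamma(s, x)/Gamma(s)

def gamma_cdf(x, n, h):
    """Distribution function of a Gamma(n, h) variable in this parametrization."""
    return gammainc(n / 2, n * x / (2 * h)) if x > 0 else 0.0

n, h = 6, 3.0

# Cross-check against an empirical distribution function from simulation.
rng = np.random.default_rng(2)
Z = rng.standard_normal((200_000, n))
X = (h / n) * (Z**2).sum(axis=1)

empirical = (X <= 3.0).mean()
theoretical = float(gamma_cdf(3.0, n, h))
```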

More details

In the following subsections you can find more details about the Gamma distribution.

The Gamma distribution is a scaled Chi-square distribution

If a variable $X$ has the Gamma distribution with parameters $n$ and $h$, then $X=\frac{h}{n}\chi _{n}^{2}$, where $\chi _{n}^{2}$ has a Chi-square distribution with $n$ degrees of freedom.

Proof

For notational simplicity, denote $\chi _{n}^{2}$ by $Z$ in what follows. Note that $X=\frac{h}{n}Z$ is a strictly increasing function of $Z$, since $\frac{h}{n}$ is strictly positive. Therefore, we can use the formula for the density of an increasing function of a continuous variable: $f_{X}(x)=f_{Z}\left( \frac{n}{h}x\right) \frac{n}{h}$. The density function of a Chi-square random variable with $n$ degrees of freedom is $f_{Z}(z)=c_{1}z^{n/2-1}e^{-z/2}$ for $z>0$, where $c_{1}=\frac{1}{2^{n/2}\Gamma (n/2)}$. Therefore, $f_{X}(x)=c_{1}\left( \frac{n}{h}x\right) ^{n/2-1}e^{-nx/(2h)}\frac{n}{h}=\frac{\left( n/(2h)\right) ^{n/2}}{\Gamma (n/2)}x^{n/2-1}e^{-nx/(2h)}$, which is the density of a Gamma distribution with parameters $n$ and $h$.

Thus, the Chi-square distribution is a special case of the Gamma distribution because, when $h=n$, we have $X=\frac{h}{n}\chi _{n}^{2}=\chi _{n}^{2}$.

In other words, a Gamma distribution with parameters $n$ and $h=n$ is just a Chi-square distribution with $n$ degrees of freedom.

A Gamma random variable times a strictly positive constant is a Gamma random variable

By multiplying a Gamma random variable by a strictly positive constant, one obtains another Gamma random variable.

If $X$ is a Gamma random variable with parameters $n$ and $h$, then the random variable $Y$ defined as $Y=cX$ has a Gamma distribution with parameters $n$ and $ch$.

Proof

This can be easily seen using the result from the previous subsection: $X=\frac{h}{n}\chi _{n}^{2}$, where $\chi _{n}^{2}$ has a Chi-square distribution with $n$ degrees of freedom. Therefore, $Y=cX=\frac{ch}{n}\chi _{n}^{2}$. In other words, $Y$ is equal to a Chi-square random variable with $n$ degrees of freedom, divided by $n$ and multiplied by $ch$. Therefore, it has a Gamma distribution with parameters $n$ and $ch$.
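A quick numerical illustration of this scaling property (a sketch; the constant $c=2.5$ and the parameters are arbitrary) checks that $cX$ has the mean and variance of a Gamma with parameters $n$ and $ch$:

```python
import numpy as np

# X ~ Gamma(n, h); multiplying by c > 0 should give a Gamma(n, c*h) variable,
# whose mean is c*h and whose variance is 2*(c*h)**2/n.
n, h, c = 6, 3.0, 2.5
rng = np.random.default_rng(3)

Z = rng.standard_normal((200_000, n))
X = (h / n) * (Z**2).sum(axis=1)
Y = c * X

mean_Y = Y.mean()  # theory: c*h = 7.5
var_Y = Y.var()    # theory: 2*(c*h)**2/n = 18.75
```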

A Gamma random variable is a sum of squared normal random variables

In the lecture on the Chi-square distribution, we have explained that a Chi-square random variable $\chi _{n}^{2}$ with $n$ degrees of freedom ($n$ integer) can be written as a sum of squares of $n$ independent normal random variables $Z_{1}$, ..., $Z_{n}$ having mean 0 and variance 1: $\chi _{n}^{2}=\sum_{i=1}^{n}Z_{i}^{2}$.

In the previous subsections we have seen that a variable $X$ having a Gamma distribution with parameters $n$ and $h$ can be written as $X=\frac{h}{n}Z$, where $Z$ has a Chi-square distribution with $n$ degrees of freedom.

Putting these two things together, we obtain $X=\frac{h}{n}\sum_{i=1}^{n}Z_{i}^{2}=\sum_{i=1}^{n}Y_{i}^{2}$, where we have defined $Y_{i}=\sqrt{\frac{h}{n}}Z_{i}$.

But the variables $Y_{i}$ are normal random variables with mean 0 and variance $\frac{h}{n}$.

Therefore, a Gamma random variable with parameters n and $h$ can be seen as a sum of squares of n independent normal random variables having zero mean and variance $h/n$.

Density plots

We now present some plots that help us to understand how the shape of the Gamma distribution changes when its parameters are changed.

Plot 1 - Same mean but different degrees of freedom

The following plot contains two lines: the density of a Gamma distribution with parameters $n=6$ and $h=3$, and the density of a Gamma distribution with parameters $n=8$ and $h=3$.

Because $h=3$ in both cases, the two distributions have the same mean.

However, by increasing n from $6$ to $8$, the shape of the distribution changes. The more we increase the degrees of freedom, the more the pdf resembles that of a normal distribution.

The thin vertical lines indicate the means of the two distributions.

Gamma density plot 1

Plot 2 - Different means but same number of degrees of freedom

In this plot, one line is the density of a Gamma distribution with parameters $n=6$ and $h=2$, and the other is the density of a Gamma distribution with parameters $n=6$ and $h=4$.

Increasing the parameter $h$ changes the mean of the distribution from $2$ to $4$.

However, the two distributions have the same number of degrees of freedom ($n=6$). Therefore, they have the same shape. One is the "stretched version of the other". It would look exactly the same on a different scale.

Gamma density plot 2

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $X_{1}$ and $X_{2}$ be two independent Chi-square random variables having $3$ and $5$ degrees of freedom respectively.

Consider the following random variables: $Y_{1}=2X_{1}$, $Y_{2}=\frac{1}{3}X_{2}$, $Y_{3}=3\left( X_{1}+X_{2}\right)$.

What distribution do they have?

Solution

Being multiples of Chi-square random variables, the variables $Y_{1}$, $Y_{2}$ and $Y_{3}$ all have a Gamma distribution.

The random variable $X_{1}$ has $n=3$ degrees of freedom and the random variable $Y_{1}$ can be written as $Y_{1}=2X_{1}=\frac{6}{3}X_{1}=\frac{h}{n}X_{1}$, where $h=6$. Therefore $Y_{1}$ has a Gamma distribution with parameters $n=3$ and $h=6$.

The random variable $X_{2}$ has $n=5$ degrees of freedom and the random variable $Y_{2}$ can be written as $Y_{2}=\frac{1}{3}X_{2}=\frac{5/3}{5}X_{2}=\frac{h}{n}X_{2}$, where $h=5/3$. Therefore $Y_{2}$ has a Gamma distribution with parameters $n=5$ and $h=5/3$.

The random variable $X_{1}+X_{2}$ has a Chi-square distribution with $n=3+5=8$ degrees of freedom, because $X_{1}$ and $X_{2}$ are independent (see the lecture on the Chi-square distribution), and the random variable $Y_{3}$ can be written as $Y_{3}=3\left( X_{1}+X_{2}\right) =\frac{24}{8}\left( X_{1}+X_{2}\right) =\frac{h}{n}\left( X_{1}+X_{2}\right)$, where $h=24$. Therefore $Y_{3}$ has a Gamma distribution with parameters $n=8$ and $h=24$.
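The conclusion for, e.g., $Y_{1}$ can be verified by simulation. The solution says $Y_{1}=\frac{h}{n}X_{1}$ with $n=3$ and $h=6$, i.e. $Y_{1}=2X_{1}$; a Gamma with those parameters should have mean $h=6$ and variance $2h^{2}/n=24$ (a sketch):

```python
import numpy as np

rng = np.random.default_rng(4)

# X1: Chi-square with 3 degrees of freedom, simulated as a sum of squared normals.
X1 = (rng.standard_normal((200_000, 3)) ** 2).sum(axis=1)
Y1 = 2 * X1  # should be Gamma with n = 3, h = 6

mean_Y1 = Y1.mean()  # theory: h = 6
var_Y1 = Y1.var()    # theory: 2*h**2/n = 24
```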

Exercise 2

Let X be a random variable having a Gamma distribution with parameters $n=4 $ and $h=2$.

Define the following random variables:[eq53]

What distribution do these variables have?

Solution

Multiplying a Gamma random variable by a strictly positive constant, one still obtains a Gamma random variable. In particular, the random variable $Y_{1}$ is a Gamma random variable with parameters $n=4$ and [eq54]. The random variable $Y_{2}$ is a Gamma random variable with parameters $n=4$ and [eq55]. The random variable $Y_{3}$ is a Gamma random variable with parameters $n=4$ and $h=4$. The random variable $Y_{3}$ is also a Chi-square random variable with $4$ degrees of freedom (remember that a Gamma random variable with parameters $n$ and $h$ is also a Chi-square random variable when $n=h$).

Exercise 3

Let $X_{1}$, $X_{2}$ and $X_{3}$ be mutually independent normal random variables having mean $\mu =0$ and variance $\sigma ^{2}=3$.

Consider the random variable $X=2\left( X_{1}^{2}+X_{2}^{2}+X_{3}^{2}\right)$.

What distribution does X have?

Solution

The random variable $X$ can be written as $X=2\left( X_{1}^{2}+X_{2}^{2}+X_{3}^{2}\right) =2\cdot 3\left( Z_{1}^{2}+Z_{2}^{2}+Z_{3}^{2}\right) =\frac{18}{3}\left( Z_{1}^{2}+Z_{2}^{2}+Z_{3}^{2}\right)$, where $Z_{i}=X_{i}/\sqrt{3}$ ($i=1,2,3$) are mutually independent standard normal random variables. The sum $Z_{1}^{2}+Z_{2}^{2}+Z_{3}^{2}$ has a Chi-square distribution with $3$ degrees of freedom (see the lecture entitled Chi-square distribution). Therefore $X$ has a Gamma distribution with parameters $n=3$ and $h=18$.
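Since the solution identifies $X$ with a Gamma variable with parameters $n=3$ and $h=18$, a simulation sketch can confirm the implied moments (mean $h=18$, variance $2h^{2}/n=216$). The factor of 2 below is the multiple implied by the solution's $h=18$, given that each $X_{i}^{2}=3Z_{i}^{2}$:

```python
import numpy as np

rng = np.random.default_rng(5)

# Normals with mean 0 and variance 3, as in the exercise.
Xi = rng.normal(loc=0.0, scale=np.sqrt(3.0), size=(200_000, 3))
X = 2 * (Xi**2).sum(axis=1)  # scaled so that X ~ Gamma(n = 3, h = 18)

mean_X = X.mean()  # theory: h = 18
var_X = X.var()    # theory: 2*h**2/n = 216
```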

How to cite

Please cite as:

Taboga, Marco (2021). "Gamma distribution", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/probability-distributions/gamma-distribution.
