Statlect | The Digital Textbook

# F distribution

A random variable has an F distribution if it can be written as a ratio
$$X=\frac{Y_1/n_1}{Y_2/n_2}$$
between a Chi-square random variable $Y_1$ with $n_1$ degrees of freedom and a Chi-square random variable $Y_2$, independent of $Y_1$, with $n_2$ degrees of freedom (where each of the two random variables has been divided by its degrees of freedom). The importance of the F distribution stems from the fact that ratios of this kind are encountered very often in statistics.

## Definition

F random variables are characterized as follows:

Definition Let $X$ be an absolutely continuous random variable. Let its support be the set of positive real numbers:
$$R_X=[0,\infty)$$
Let $n_1,n_2\in\mathbb{N}$. We say that $X$ has an F distribution with $n_1$ and $n_2$ degrees of freedom if its probability density function is
$$f_X(x)=c\,x^{n_1/2-1}\left(1+\frac{n_1}{n_2}x\right)^{-(n_1+n_2)/2}$$
where $c$ is a constant:
$$c=\frac{(n_1/n_2)^{n_1/2}}{B(n_1/2,\,n_2/2)}$$
and $B(\cdot,\cdot)$ is the Beta function.

To better understand the F distribution, you can have a look at its density plots.
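As a quick sanity check on the definition, the density can be coded directly and integrated numerically; the sketch below is a minimal pure-Python illustration (the helper names `beta` and `f_pdf` and the test values $n_1=4$, $n_2=10$ are my own, not part of the lecture):

```python
from math import gamma

def beta(a, b):
    """Beta function via the Gamma function: B(a, b) = Gamma(a)*Gamma(b)/Gamma(a+b)."""
    return gamma(a) * gamma(b) / gamma(a + b)

def f_pdf(x, n1, n2):
    """Density of the F distribution with n1 and n2 degrees of freedom."""
    if x < 0:
        return 0.0
    c = (n1 / n2) ** (n1 / 2) / beta(n1 / 2, n2 / 2)
    return c * x ** (n1 / 2 - 1) * (1 + n1 * x / n2) ** (-(n1 + n2) / 2)

# Midpoint-rule integration over a truncated support: the result should be close to 1.
h = 0.01
total = sum(f_pdf((i + 0.5) * h, 4, 10) * h for i in range(20_000))
```

Since the tail of the density decays polynomially, truncating the integral at $x=200$ leaves only a negligible remainder for these degrees of freedom.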

## Relation to the Gamma distribution

An F random variable with $n_1$ and $n_2$ degrees of freedom can be written as a Gamma random variable with parameters $n_1$ and $h$, where the parameter $h$ is equal to the reciprocal of another Gamma random variable $Z$, independent of the first one, with parameters $n_2$ and $1$:
$$h=\frac{1}{Z}$$

Proposition (Integral representation) The probability density function of $X$ can be written as
$$f_X(x)=\int_0^{\infty}f_{X\mid Z=z}(x)\,f_Z(z)\,dz$$
where:

1. $f_{X\mid Z=z}(x)$ is the probability density function of a Gamma random variable with parameters $n_1$ and $h=1/z$:
$$f_{X\mid Z=z}(x)=\frac{1}{\Gamma(n_1/2)}\left(\frac{n_1z}{2}\right)^{n_1/2}x^{n_1/2-1}\exp\!\left(-\frac{n_1z}{2}x\right)$$

2. $f_Z(z)$ is the probability density function of a Gamma random variable with parameters $n_2$ and $1$:
$$f_Z(z)=\frac{1}{\Gamma(n_2/2)}\left(\frac{n_2}{2}\right)^{n_2/2}z^{n_2/2-1}\exp\!\left(-\frac{n_2}{2}z\right)$$

Proof

We need to prove that
$$f_X(x)=\int_0^{\infty}f_{X\mid Z=z}(x)\,f_Z(z)\,dz$$
where
$$f_{X\mid Z=z}(x)=\frac{1}{\Gamma(n_1/2)}\left(\frac{n_1z}{2}\right)^{n_1/2}x^{n_1/2-1}\exp\!\left(-\frac{n_1z}{2}x\right)$$
and
$$f_Z(z)=\frac{1}{\Gamma(n_2/2)}\left(\frac{n_2}{2}\right)^{n_2/2}z^{n_2/2-1}\exp\!\left(-\frac{n_2}{2}z\right)$$
Let us start from the integrand function:
$$f_{X\mid Z=z}(x)\,f_Z(z)=\frac{(n_1/2)^{n_1/2}(n_2/2)^{n_2/2}}{\Gamma(n_1/2)\,\Gamma(n_2/2)}\,x^{n_1/2-1}\,z^{(n_1+n_2)/2-1}\exp\!\left(-\frac{n_1x+n_2}{2}\,z\right)$$
As a function of $z$, the right-hand side is, up to a multiplicative constant, the probability density function of a random variable having a Gamma distribution; the corresponding Gamma integral is
$$\int_0^{\infty}z^{(n_1+n_2)/2-1}\exp\!\left(-\frac{n_1x+n_2}{2}\,z\right)dz=\Gamma\!\left(\frac{n_1+n_2}{2}\right)\left(\frac{n_1x+n_2}{2}\right)^{-(n_1+n_2)/2}$$
Therefore:
$$\int_0^{\infty}f_{X\mid Z=z}(x)\,f_Z(z)\,dz=\frac{\Gamma\!\left(\frac{n_1+n_2}{2}\right)}{\Gamma(n_1/2)\,\Gamma(n_2/2)}\,n_1^{n_1/2}\,n_2^{n_2/2}\,x^{n_1/2-1}\,(n_1x+n_2)^{-(n_1+n_2)/2}$$
$$=\frac{(n_1/n_2)^{n_1/2}}{B(n_1/2,\,n_2/2)}\,x^{n_1/2-1}\left(1+\frac{n_1}{n_2}x\right)^{-(n_1+n_2)/2}=f_X(x)$$

## Relation to the Chi-square distribution

In the introduction, we have stated (without a proof) that a random variable $X$ has an F distribution with $n_1$ and $n_2$ degrees of freedom if it can be written as a ratio
$$X=\frac{Y_1/n_1}{Y_2/n_2}$$
where:

1. $Y_1$ is a Chi-square random variable with $n_1$ degrees of freedom;

2. $Y_2$ is a Chi-square random variable, independent of $Y_1$, with $n_2$ degrees of freedom.

The statement can be proved as follows:

Proof

This statement is equivalent to the statement proved above (relation to the Gamma distribution): $X$ can be thought of as a Gamma random variable with parameters $n_1$ and $h$, where the parameter $h$ is equal to the reciprocal of another Gamma random variable $Z$, independent of the first one, with parameters $n_2$ and $1$. The equivalence can be proved as follows.

Since a Gamma random variable with parameters $n_1$ and $h$ is just the product between the ratio $h/n_1$ and a Chi-square random variable with $n_1$ degrees of freedom (see the lecture entitled Gamma distribution), we can write
$$X=\frac{h}{n_1}Y_1$$
where $Y_1$ is a Chi-square random variable with $n_1$ degrees of freedom. Now, we know that $h$ is equal to the reciprocal of another Gamma random variable $Z$, independent of $Y_1$, with parameters $n_2$ and $1$. Therefore
$$X=\frac{1}{n_1Z}Y_1=\frac{Y_1/n_1}{Z}$$
But a Gamma random variable with parameters $n_2$ and $1$ is just the product between the ratio $1/n_2$ and a Chi-square random variable with $n_2$ degrees of freedom. Therefore, we can write $Z=Y_2/n_2$, where $Y_2$ is a Chi-square random variable with $n_2$ degrees of freedom, independent of $Y_1$, and
$$X=\frac{Y_1/n_1}{Y_2/n_2}$$
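The ratio characterization can also be checked by simulation. The sketch below (pure Python, seeded for reproducibility; the helper name and the sample values $n_1=5$, $n_2=12$ are my own) builds Chi-square draws as sums of squared standard normals and compares the empirical mean of the ratio with $n_2/(n_2-2)$, the expected value derived later in this lecture:

```python
import random

random.seed(0)
n1, n2 = 5, 12

def chi2_sample(n):
    """One Chi-square draw with n degrees of freedom: sum of n squared N(0,1) draws."""
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n))

# F draws as a ratio of two independent Chi-squares, each divided by its degrees of freedom.
draws = [(chi2_sample(n1) / n1) / (chi2_sample(n2) / n2) for _ in range(50_000)]
empirical_mean = sum(draws) / len(draws)
theoretical_mean = n2 / (n2 - 2)  # = 1.2 for these degrees of freedom
```

With 50,000 draws the Monte Carlo standard error of the mean is well below 0.01 here, so the empirical and theoretical means should agree to roughly two decimal places.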

## Expected value

The expected value of an F random variable $X$ is well-defined only for $n_2>2$ and it is equal to
$$E[X]=\frac{n_2}{n_2-2}$$

Proof

It can be derived thanks to the integral representation of the Beta function:
$$E[X]=\int_0^{\infty}x\,f_X(x)\,dx=\frac{(n_1/n_2)^{n_1/2}}{B(n_1/2,\,n_2/2)}\int_0^{\infty}x^{n_1/2}\left(1+\frac{n_1}{n_2}x\right)^{-(n_1+n_2)/2}dx$$
Performing the change of variable $t=\frac{n_1}{n_2}x$, we get
$$E[X]=\frac{n_2}{n_1}\,\frac{1}{B(n_1/2,\,n_2/2)}\int_0^{\infty}t^{(n_1/2+1)-1}(1+t)^{-\left(\left(\frac{n_1}{2}+1\right)+\left(\frac{n_2}{2}-1\right)\right)}dt=\frac{n_2}{n_1}\,\frac{B(n_1/2+1,\,n_2/2-1)}{B(n_1/2,\,n_2/2)}$$
and, by the properties of the Gamma function,
$$E[X]=\frac{n_2}{n_1}\cdot\frac{\Gamma(n_1/2+1)\,\Gamma(n_2/2-1)}{\Gamma(n_1/2)\,\Gamma(n_2/2)}=\frac{n_2}{n_1}\cdot\frac{n_1/2}{n_2/2-1}=\frac{n_2}{n_2-2}$$

In the above derivation we have used the integral representation of the Beta function, $B(a,b)=\int_0^{\infty}t^{a-1}(1+t)^{-(a+b)}dt$, and the property $\Gamma(a+1)=a\,\Gamma(a)$ of the Gamma function. It is also clear that the expected value is well-defined only when $n_2>2$: when $n_2\le 2$, the above improper integral does not converge (both arguments of the Beta function must be strictly positive).
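The Beta-function step in the derivation can be verified numerically; a small sketch in pure Python (the illustrative values $n_1=6$, $n_2=10$ are my own choice):

```python
from math import gamma

def beta(a, b):
    """Beta function: B(a, b) = Gamma(a)*Gamma(b)/Gamma(a+b)."""
    return gamma(a) * gamma(b) / gamma(a + b)

n1, n2 = 6, 10
# Intermediate step of the proof: E[X] = (n2/n1) * B(n1/2 + 1, n2/2 - 1) / B(n1/2, n2/2)
beta_form = (n2 / n1) * beta(n1 / 2 + 1, n2 / 2 - 1) / beta(n1 / 2, n2 / 2)
# Final simplification: E[X] = n2 / (n2 - 2)
closed_form = n2 / (n2 - 2)
```

Both expressions evaluate to $10/8=1.25$, matching the simplification via $\Gamma(a+1)=a\,\Gamma(a)$.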

## Variance

The variance of an F random variable $X$ is well-defined only for $n_2>4$ and it is equal to
$$\mathrm{Var}[X]=\frac{2n_2^2\,(n_1+n_2-2)}{n_1\,(n_2-2)^2\,(n_2-4)}$$

Proof

It can be derived thanks to the usual variance formula ($\mathrm{Var}[X]=E[X^2]-E[X]^2$) and to the integral representation of the Beta function. Proceeding as in the derivation of the expected value, we get
$$E[X^2]=\left(\frac{n_2}{n_1}\right)^2\frac{B(n_1/2+2,\,n_2/2-2)}{B(n_1/2,\,n_2/2)}=\left(\frac{n_2}{n_1}\right)^2\frac{(n_1/2)(n_1/2+1)}{(n_2/2-1)(n_2/2-2)}=\frac{n_2^2\,(n_1+2)}{n_1\,(n_2-2)(n_2-4)}$$
so that
$$\mathrm{Var}[X]=\frac{n_2^2\,(n_1+2)}{n_1\,(n_2-2)(n_2-4)}-\left(\frac{n_2}{n_2-2}\right)^2=\frac{2n_2^2\,(n_1+n_2-2)}{n_1\,(n_2-2)^2\,(n_2-4)}$$

In the above derivation we have used the properties of the Gamma function and the Beta function. It is also clear that the variance is well-defined only when $n_2>4$: when $n_2\le 4$, the above improper integrals do not converge (both arguments of the Beta function must be strictly positive).
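As with the expected value, the Beta-function steps can be checked numerically; a pure-Python sketch with illustrative values $n_1=6$, $n_2=12$ (my own choice, satisfying $n_2>4$):

```python
from math import gamma

def beta(a, b):
    """Beta function: B(a, b) = Gamma(a)*Gamma(b)/Gamma(a+b)."""
    return gamma(a) * gamma(b) / gamma(a + b)

n1, n2 = 6, 12
m1 = (n2 / n1) * beta(n1 / 2 + 1, n2 / 2 - 1) / beta(n1 / 2, n2 / 2)       # E[X]
m2 = (n2 / n1) ** 2 * beta(n1 / 2 + 2, n2 / 2 - 2) / beta(n1 / 2, n2 / 2)  # E[X^2]
variance = m2 - m1 ** 2                                                     # Var[X]
closed_form = 2 * n2 ** 2 * (n1 + n2 - 2) / (n1 * (n2 - 2) ** 2 * (n2 - 4))
```

For these degrees of freedom both routes give $\mathrm{Var}[X]=0.96$.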

## Higher moments

The $k$-th moment of an F random variable $X$ is well-defined only for $n_2>2k$ and it is equal to
$$\mu_X(k)=E[X^k]=\left(\frac{n_2}{n_1}\right)^k\frac{B(n_1/2+k,\,n_2/2-k)}{B(n_1/2,\,n_2/2)}$$

Proof

Using the definition of moment:
$$E[X^k]=\int_0^{\infty}x^k f_X(x)\,dx=\frac{(n_1/n_2)^{n_1/2}}{B(n_1/2,\,n_2/2)}\int_0^{\infty}x^{n_1/2+k-1}\left(1+\frac{n_1}{n_2}x\right)^{-(n_1+n_2)/2}dx$$
and, performing the change of variable $t=\frac{n_1}{n_2}x$,
$$E[X^k]=\left(\frac{n_2}{n_1}\right)^k\frac{1}{B(n_1/2,\,n_2/2)}\int_0^{\infty}t^{(n_1/2+k)-1}(1+t)^{-\left(\left(\frac{n_1}{2}+k\right)+\left(\frac{n_2}{2}-k\right)\right)}dt=\left(\frac{n_2}{n_1}\right)^k\frac{B(n_1/2+k,\,n_2/2-k)}{B(n_1/2,\,n_2/2)}$$

In the above derivation we have used the integral representation of the Beta function. It is also clear that the $k$-th moment is well-defined only when $n_2>2k$: when $n_2\le 2k$, the above improper integral does not converge (both arguments of the Beta function must be strictly positive).
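The closed form for the $k$-th moment can be checked against a direct numeric integration of $x^k f_X(x)$; a pure-Python sketch with illustrative values $n_1=8$, $n_2=20$, $k=3$ (my own choice, satisfying $n_2>2k$; helper names are likewise my own):

```python
from math import gamma

def beta(a, b):
    return gamma(a) * gamma(b) / gamma(a + b)

def f_pdf(x, n1, n2):
    """Density of the F distribution with n1 and n2 degrees of freedom."""
    c = (n1 / n2) ** (n1 / 2) / beta(n1 / 2, n2 / 2)
    return c * x ** (n1 / 2 - 1) * (1 + n1 * x / n2) ** (-(n1 + n2) / 2)

def moment(k, n1, n2):
    """Closed form E[X^k] = (n2/n1)^k * B(n1/2 + k, n2/2 - k) / B(n1/2, n2/2); needs n2 > 2k."""
    return (n2 / n1) ** k * beta(n1 / 2 + k, n2 / 2 - k) / beta(n1 / 2, n2 / 2)

# Midpoint-rule integration of x^k f(x) over a truncated support [0, 100].
n1, n2, k = 8, 20, 3
h = 0.001
numeric = sum(((i + 0.5) * h) ** k * f_pdf((i + 0.5) * h, n1, n2) * h for i in range(100_000))
```

For $k=1$ the formula reduces to the expected value $n_2/(n_2-2)$, as it should.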

## Moment generating function

An F random variable does not possess a moment generating function.

Proof

When a random variable $X$ possesses a moment generating function, the $k$-th moment of $X$ exists and is finite for any $k\in\mathbb{N}$. But we have proved above that the $k$-th moment of $X$ exists only for $k<n_2/2$. Therefore, $X$ cannot have a moment generating function.

## Characteristic function

There is no simple expression for the characteristic function of the F distribution. It can be expressed in terms of the Confluent hypergeometric function of the second kind (a solution of a certain differential equation, called confluent hypergeometric differential equation). The interested reader can consult Phillips (1982).

## Distribution function

The distribution function of an F random variable $X$ is $F_X(x)=0$ for $x<0$ and, for $x\ge 0$,
$$F_X(x)=\frac{1}{B(n_1/2,\,n_2/2)}\int_0^{\frac{n_1x}{n_1x+n_2}}t^{n_1/2-1}(1-t)^{n_2/2-1}\,dt$$
where the integral
$$\int_0^{z}t^{n_1/2-1}(1-t)^{n_2/2-1}\,dt$$
is known as the incomplete Beta function and is usually computed numerically with the help of a computer algorithm.

Proof

This is proved as follows: for $x\ge 0$,
$$F_X(x)=\int_0^x f_X(s)\,ds=\frac{(n_1/n_2)^{n_1/2}}{B(n_1/2,\,n_2/2)}\int_0^x s^{n_1/2-1}\left(1+\frac{n_1}{n_2}s\right)^{-(n_1+n_2)/2}ds$$
Performing the change of variable
$$t=\frac{n_1s}{n_1s+n_2}\quad\Longleftrightarrow\quad s=\frac{n_2t}{n_1(1-t)},\qquad ds=\frac{n_2}{n_1(1-t)^2}\,dt$$
and noting that $1+\frac{n_1}{n_2}s=\frac{1}{1-t}$, the integral becomes
$$F_X(x)=\frac{1}{B(n_1/2,\,n_2/2)}\int_0^{\frac{n_1x}{n_1x+n_2}}t^{n_1/2-1}(1-t)^{n_2/2-1}\,dt$$
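A numeric sketch of the distribution function along these lines, evaluating the incomplete Beta integral by the midpoint rule (the helper names are my own). A convenient check: when $n_1=n_2$, $X$ and $1/X$ have the same distribution, so $F_X(1)=1/2$:

```python
from math import gamma

def beta(a, b):
    return gamma(a) * gamma(b) / gamma(a + b)

def f_cdf(x, n1, n2, steps=100_000):
    """CDF of the F distribution: incomplete Beta integral up to z = n1*x / (n1*x + n2)."""
    if x <= 0:
        return 0.0
    z = n1 * x / (n1 * x + n2)
    a, b = n1 / 2, n2 / 2
    h = z / steps
    # Midpoint rule avoids evaluating the integrand at the endpoints.
    total = sum(((i + 0.5) * h) ** (a - 1) * (1 - (i + 0.5) * h) ** (b - 1) * h
                for i in range(steps))
    return total / beta(a, b)
```

In practice one would use a dedicated routine for the incomplete Beta function (for instance `scipy.special.betainc`) rather than brute-force quadrature; the sketch only illustrates the formula.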

## Solved exercises

Below you can find some exercises with explained solutions:

## References

Phillips, P. C. B. (1982) The true characteristic function of the F distribution, Biometrika, 69, 261-264.
