# Marginal probability mass function

Consider a discrete random vector, that is, a vector whose entries are discrete random variables. When one of these entries is taken in isolation, its distribution can be characterized by its probability mass function. This is called the marginal probability mass function, to distinguish it from the joint probability mass function, which instead characterizes the joint distribution of all the entries of the vector considered together.

## Definition

The following is a more formal definition.

**Definition** Let $X_1, \ldots, X_K$ be $K$ discrete random variables forming a random vector $X$. Then, for each $i = 1, \ldots, K$, the probability mass function of the random variable $X_i$, denoted by $p_{X_i}$, is called the marginal probability mass function of $X_i$.

Remember that the probability mass function $p_{X_i}$ is a function such that
$$p_{X_i}(x) = \mathrm{P}(X_i = x)$$
where $\mathrm{P}(X_i = x)$ is the probability that $X_i$ will be equal to $x$.

By contrast, the joint probability mass function of the vector $X$ is a function $p_X$ such that
$$p_X(x_1, \ldots, x_K) = \mathrm{P}(X_1 = x_1, \ldots, X_K = x_K)$$
where $\mathrm{P}(X_1 = x_1, \ldots, X_K = x_K)$ is the probability that $X_i$ will be equal to $x_i$, simultaneously for all $i = 1, \ldots, K$.
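In code, a joint probability mass function of a discrete random vector can be represented as a mapping from support points to probabilities. The following is a minimal sketch in Python, using illustrative numbers of our own choosing (they are not taken from this entry):

```python
from fractions import Fraction

# Hypothetical joint pmf of a 2-dimensional discrete random vector,
# stored as a mapping from support points (tuples) to probabilities.
joint_pmf = {
    (0, 0): Fraction(1, 4),
    (0, 1): Fraction(1, 4),
    (1, 1): Fraction(1, 2),
}

# A valid pmf assigns non-negative probabilities that sum to one.
assert all(p >= 0 for p in joint_pmf.values())
assert sum(joint_pmf.values()) == 1

# P(X1 = 0, X2 = 1) is read off directly; points outside the
# support have probability zero.
print(joint_pmf.get((0, 1), Fraction(0)))  # -> 1/4
print(joint_pmf.get((2, 2), Fraction(0)))  # -> 0
```

Using exact `Fraction` arithmetic (rather than floats) makes the normalization check exact.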

## How to derive it

Denote by $R_X$ the support of $X$ (i.e., the set of all values $X$ can take). The marginal probability mass function of $X_i$ is obtained from the joint probability mass function as follows:
$$p_{X_i}(x) = \sum_{(x_1, \ldots, x_K) \in A(x)} p_X(x_1, \ldots, x_K)$$
where the sum is over the set
$$A(x) = \left\{ (x_1, \ldots, x_K) \in R_X : x_i = x \right\}$$
In other words, the marginal probability mass function of $X_i$ at the point $x$ is obtained by summing the joint probability mass function over all the vectors that belong to the support of $X$ and are such that their $i$-th component is equal to $x$.
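This summation translates directly into code. Below is a minimal sketch (the function name `marginal_pmf` is ours, not a standard API) that marginalizes a joint pmf, stored as a dictionary keyed by support vectors, with respect to its $i$-th component:

```python
from collections import defaultdict
from fractions import Fraction

def marginal_pmf(joint_pmf, i):
    """Marginal pmf of the i-th component of a discrete random vector.

    joint_pmf maps each support vector (a tuple) to its probability;
    the marginal at a point x is the sum of the joint pmf over all
    support vectors whose i-th component equals x.
    """
    marginal = defaultdict(Fraction)
    for vector, prob in joint_pmf.items():
        marginal[vector[i]] += prob
    return dict(marginal)

# Illustrative joint pmf: uniform on three support points.
joint = {(1, 1): Fraction(1, 3), (2, 0): Fraction(1, 3), (0, 0): Fraction(1, 3)}
print(marginal_pmf(joint, 0))  # marginal of the first component
```

Since each support vector contributes its probability to exactly one point of the marginal, the marginal probabilities automatically sum to one whenever the joint ones do.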

## Example

Let $X = (X_1, X_2)$ be a discrete random vector with support
$$R_X = \left\{ (0,0), (0,1), (1,1) \right\}$$
and joint probability mass function
$$p_X(x_1, x_2) = \begin{cases} 1/4 & \text{if } (x_1, x_2) = (0,0) \\ 1/4 & \text{if } (x_1, x_2) = (0,1) \\ 1/2 & \text{if } (x_1, x_2) = (1,1) \\ 0 & \text{otherwise} \end{cases}$$

The marginal probability mass function of $X_1$ evaluated at the point $x_1 = 0$ is
$$p_{X_1}(0) = p_X(0,0) + p_X(0,1) = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}$$

When evaluated at the point $x_1 = 1$ it is
$$p_{X_1}(1) = p_X(1,1) = \frac{1}{2}$$

For all the other points, it is equal to zero. Therefore, we have
$$p_{X_1}(x_1) = \begin{cases} 1/2 & \text{if } x_1 = 0 \\ 1/2 & \text{if } x_1 = 1 \\ 0 & \text{otherwise} \end{cases}$$
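The arithmetic in a worked example like this is easy to check mechanically. The sketch below uses an illustrative support $\{(0,0), (0,1), (1,1)\}$ with probabilities $1/4$, $1/4$, $1/2$ (our own choice of numbers) and recovers the marginal of the first component by direct summation:

```python
from fractions import Fraction

# Illustrative joint pmf (our own numbers): each support point
# maps to its probability.
joint = {
    (0, 0): Fraction(1, 4),
    (0, 1): Fraction(1, 4),
    (1, 1): Fraction(1, 2),
}

# Marginal pmf of the first component: for each value x1 appearing
# in the support, sum the joint pmf over all support points whose
# first entry equals x1.
support_x1 = {x1 for (x1, _) in joint}
marginal = {
    x1: sum(p for (v1, _), p in joint.items() if v1 == x1)
    for x1 in support_x1
}

print(marginal)  # each value of the first component gets mass 1/2
```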

## More details

A more detailed discussion of the marginal probability mass function can be found in the lecture entitled Random vectors.
