Given a random vector, the probability distribution of all its components, considered together, is called the joint distribution, while the probability distribution of one of its components, considered in isolation, is called its marginal distribution.
More precisely, let X = (X_1, ..., X_n) be a random vector with joint distribution function F_X(x_1, ..., x_n) = P(X_1 ≤ x_1, ..., X_n ≤ x_n). The marginal distribution function of the i-th component is F_{X_i}(x) = P(X_i ≤ x), which can be obtained from the joint distribution function by letting all the arguments other than the i-th tend to plus infinity.
Marginal distribution functions play an important role in the characterization of independence between random variables: two random variables are independent if and only if their joint distribution function is equal to the product of their marginal distribution functions (see the lecture entitled Independent random variables).
Example. Let X and Y be two random variables having marginal distribution functions F_X(x) and F_Y(y) and joint distribution function F_{X,Y}(x,y). If it is possible to check that F_{X,Y}(x,y) = F_X(x) F_Y(y) for any x and y, then this implies that X and Y are independent.
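The factorization above can be illustrated numerically. The following sketch (my own illustration, not taken from the lecture) draws two independent Uniform(0,1) samples and checks that the empirical joint distribution function is approximately equal to the product of the empirical marginal distribution functions:

```python
import random

random.seed(0)
n = 100_000

# Two independent Uniform(0,1) samples (a hypothetical choice for illustration)
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

def joint_cdf(x, y):
    # Empirical joint distribution function F_{X,Y}(x, y) = P(X <= x, Y <= y)
    return sum(1 for a, b in zip(xs, ys) if a <= x and b <= y) / n

def marginal_cdf(sample, t):
    # Empirical marginal distribution function F(t) = P(X <= t)
    return sum(1 for a in sample if a <= t) / n

for x, y in [(0.3, 0.7), (0.5, 0.5), (0.8, 0.2)]:
    joint = joint_cdf(x, y)
    product = marginal_cdf(xs, x) * marginal_cdf(ys, y)
    print(f"F_XY({x},{y}) = {joint:.3f}  vs  F_X({x})*F_Y({y}) = {product:.3f}")
```

For independent variables the two columns of the output agree up to sampling error; for dependent variables (e.g. Y = X) the joint distribution function would differ markedly from the product of the marginals.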
A more detailed discussion of the marginal distribution function can be found in the lecture entitled Random vectors.