08. Joint Distribution
Definition
Given 2 Random Variables \(X\) and \(Y\), their joint distribution is the probability function \(P(X=x,\ Y=y)\) for all possible values of \(x\) and \(y\) respectively.
Example:
Balls numbered \(1, 2, 3\) are put in a jar. Two are selected without replacement. Let \(X =\) number on the first selected ball, \(Y =\) number on the second selected ball.
The joint dist. function can be written as a table:
X\Y | 1 | 2 | 3 |
---|---|---|---|
1 | 0 | \(\frac{1}{6}\) | \(\frac{1}{6}\) |
2 | \(\frac{1}{6}\) | 0 | \(\frac{1}{6}\) |
3 | \(\frac{1}{6}\) | \(\frac{1}{6}\) | 0 |
The upper right square represents \(P(X=1,\ Y=3)\), and by the table, we can see that it is equal to \(\frac{1}{6}\).
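Here's a minimal Python sketch of where these numbers come from, enumerating every equally likely ordered draw (storing the table as a dict from \((x, y)\) pairs to probabilities is a choice made here, not anything canonical):

```python
from itertools import permutations
from fractions import Fraction

# Every ordered draw of 2 balls from {1, 2, 3} without replacement;
# all 6 outcomes are equally likely.
draws = list(permutations([1, 2, 3], 2))

# Joint distribution as a dict: (x, y) -> P(X=x, Y=y).
P = {draw: Fraction(1, len(draws)) for draw in draws}

# Print the table row by row; impossible cells like (1, 1) get 0.
for x in [1, 2, 3]:
    print(*(P.get((x, y), Fraction(0)) for y in [1, 2, 3]))

# Sanity check: the probabilities sum to 1 (more on this below).
assert sum(P.values()) == 1
```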
The sum of all probabilities in a joint distribution function must be 1.
In case that was too easy to understand, here’s the same statement in MathSpeak:
\(\sum_{x}\sum_{y} P(X=x,\ Y=y) = 1\)
The expected value of \(XY\) is equal to the sum of a…
Ok I tried, but I can’t write this in English, here’s the algorithm to do it:
```python
from fractions import Fraction

# The joint table as a dict: (x, y) -> P(X=x, Y=y); zero cells omitted.
P = {(x, y): Fraction(1, 6)
     for x in [1, 2, 3] for y in [1, 2, 3] if x != y}

# E[XY]: weight each product x*y by the probability of its cell.
total = Fraction(0)
for (x, y), p in P.items():
    total += x * y * p

print(total)  # 11/3
```
And here’s the MathSpeak version:
\(E[XY] = \sum_{x}\sum_{y} x \cdot y \cdot P(X=x,\ Y=y)\)
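For the table in our example, that comes out to \(E[XY] = \frac{2 + 3 + 2 + 6 + 3 + 6}{6} = \frac{11}{3}\).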
Given the joint distribution function, we can find \(P(X=x)\) and \(P(Y=y)\) for any \(x\) or \(y\).
In the above example, \(P(X=1) = 0 + \frac{1}{6} + \frac{1}{6} = \frac{1}{3}\).
Marginal Probabilities
In general:
\(P(X=x) =\) the sum of entries in row \(x\) in the table.
\(P(Y=y) =\) the sum of entries in column \(y\) in the table.
These probabilities are called marginal probabilities and are typically written in the margin of the joint dist. table.
The sum of marginal probabilities for any random variable is always 1.
The above table, with added marginal probabilities:
X\Y | 1 | 2 | 3 | \(P(X=x)\) |
---|---|---|---|---|
1 | 0 | \(\frac{1}{6}\) | \(\frac{1}{6}\) | \(\frac{1}{3}\) |
2 | \(\frac{1}{6}\) | 0 | \(\frac{1}{6}\) | \(\frac{1}{3}\) |
3 | \(\frac{1}{6}\) | \(\frac{1}{6}\) | 0 | \(\frac{1}{3}\) |
\(P(Y=y)\) | \(\frac{1}{3}\) | \(\frac{1}{3}\) | \(\frac{1}{3}\) | 1 |
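Computing the margins in code is just row and column sums; here's a minimal sketch, assuming the same dict representation as before (`P_X` and `P_Y` are names introduced here):

```python
from collections import defaultdict
from fractions import Fraction

P = {(x, y): Fraction(1, 6)
     for x in [1, 2, 3] for y in [1, 2, 3] if x != y}

P_X = defaultdict(Fraction)  # P(X=x): sum over each row
P_Y = defaultdict(Fraction)  # P(Y=y): sum over each column
for (x, y), p in P.items():
    P_X[x] += p
    P_Y[y] += p

print(dict(P_X))          # {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}
print(sum(P_X.values()))  # 1 -- marginals always sum to 1
```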
\(X, Y\) are independent if and only if their joint distribution table forms a multiplication table, where each cell is equal to the product of the corresponding marginal probabilities: \(P(X=x,\ Y=y) = P(X=x) \cdot P(Y=y)\).
The variables in the above table, therefore, are not independent: the upper left cell is \(0\), but \(P(X=1) \cdot P(Y=1) = \frac{1}{3} \cdot \frac{1}{3} = \frac{1}{9}\).
If a 0 appears anywhere inside a joint dist. table, then \(X, Y\) must be dependent, because marginal probabilities are never zero, so no product of marginal probabilities can equal 0.
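The multiplication-table test is also easy to script; here's a sketch, again assuming the dict representation from the earlier snippets:

```python
from fractions import Fraction

P = {(x, y): Fraction(1, 6)
     for x in [1, 2, 3] for y in [1, 2, 3] if x != y}

# Marginals: row sums for X, column sums for Y.
P_X, P_Y = {}, {}
for (x, y), p in P.items():
    P_X[x] = P_X.get(x, Fraction(0)) + p
    P_Y[y] = P_Y.get(y, Fraction(0)) + p

# Independent iff every cell equals the product of its marginals.
independent = all(
    P.get((x, y), Fraction(0)) == px * py
    for x, px in P_X.items()
    for y, py in P_Y.items()
)

print(independent)  # False: P(X=1, Y=1) = 0, but 1/3 * 1/3 = 1/9
```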