Unlocking the Power of Joint Distributions: How to Analyze Multiple Random Variables
The concept of a joint distribution is useful when studying the outcomes and effects of multiple random variables in statistics. Joint distributions let us generalize probability theory to the multivariate case. Let me paint a story for you.
Joint Distributions
Today, the weather is nice. It's a fresh summer morning. You're out at a restaurant having breakfast with your in-laws and you want to impress. You're such a nice person, you think to yourself. You offer to pay the tip. You are aware that your wallet contains 3 twenty-dollar bills, 2 fifty-dollar bills, and 5 ten-dollar bills, but it hurts your heart too much to look. You decide to select 3 bills, but want to leave it up to the divine spirits of life to dictate which ones exactly.
This is where probability comes into the picture. We are going to formalize this scenario. We could choose to model this problem with a single random variable tracking how much money we tip, but we want to be a bit more fine-grained. We want to track how many fifty-dollar bills and ten-dollar bills we give up. Those hold sentimental value, for some reason. So, we use two random variables in our analysis.
This problem is ripe for a joint probability distribution.
Like we said, we have 2 fifty-dollar, 5 ten-dollar, and 3 twenty-dollar bills. Now we select 3 bills at random and let \(X\) be the number of fifty-dollar and \(Y\) the number of ten-dollar bills that are selected.
Definition: Joint Discrete Random Variables

Let \(X, Y\) be discrete random variables. Then their joint probability mass function is

\[ p(x, y) := P(X = x, Y = y). \]
We start by determining the joint probability mass function of \((X, Y)\). We draw 3 of the 10 bills uniformly at random, so counting the favorable selections gives

\[ p(x, y) = \frac{\binom{2}{x} \binom{5}{y} \binom{3}{3 - x - y}}{\binom{10}{3}}, \]

where the factors count the ways to pick \(x\) of the 2 fifties, \(y\) of the 5 tens, and the remaining \(3 - x - y\) of the 3 twenties, out of \(\binom{10}{3} = 120\) equally likely selections.
Sweet! Now we have a function that models our problem! Let's iterate through all the possible values of \((X, Y)\). Here they are:
|           | \(Y = 0\) | \(Y = 1\) | \(Y = 2\) | \(Y = 3\) |
|-----------|-----------|-----------|-----------|-----------|
| \(X = 0\) | \(\frac{1}{120}\) | \(\frac{15}{120}\) | \(\frac{30}{120}\) | \(\frac{10}{120}\) |
| \(X = 1\) | \(\frac{6}{120}\) | \(\frac{30}{120}\) | \(\frac{20}{120}\) | \(0\) |
| \(X = 2\) | \(\frac{3}{120}\) | \(\frac{5}{120}\) | \(0\) | \(0\) |
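If you'd rather let a computer do the counting, here is a minimal Python sketch (standard library only) that reproduces the table from the formula above:

```python
from math import comb

# Joint pmf of (X, Y): choose x of the 2 fifties, y of the 5 tens,
# and the remaining 3 - x - y of the 3 twenties, out of comb(10, 3) = 120
# equally likely selections.
def p(x: int, y: int) -> float:
    if x + y > 3:
        return 0.0
    return comb(2, x) * comb(5, y) * comb(3, 3 - x - y) / comb(10, 3)

# Reproduce the table and confirm the probabilities sum to 1.
total = 0.0
for x in range(3):
    row = [p(x, y) for y in range(4)]
    total += sum(row)
    print(f"X={x}:", [f"{q * 120:.0f}/120" for q in row])
assert abs(total - 1.0) < 1e-12
```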
There is a 25% chance (\(\frac{30}{120}\)) that you give up exactly one fifty-dollar bill and one ten-dollar bill. Huh, maybe we should have just been deliberate about choosing the bills…
As we are reaching into our wallet, time slows down to a pause. We think to ourselves, ‘Wow, we are doing this huh. Oh no, not the fifty…’. Quickly, our mind turns to marginal probabilities. Sweat is starting to form. We want to compute the probability of giving up some number \(x\) of fifty-dollar bills, regardless of what else gets drawn.
Definition: Discrete Random Variables Marginal Density
The marginal probability mass functions of \(X, Y\) are obtained by summing the joint mass function over the other variable:

\[ p_X(x) = \sum_{y} p(x, y), \qquad p_Y(y) = \sum_{x} p(x, y). \]
Using the definition of Marginal Density, we can evaluate this probability by summing across the rows of the table:

\[ p_X(0) = \frac{1 + 15 + 30 + 10}{120} = \frac{56}{120}, \qquad p_X(1) = \frac{6 + 30 + 20}{120} = \frac{56}{120}, \qquad p_X(2) = \frac{3 + 5}{120} = \frac{8}{120}. \]
Surely we can handle losing two fifty-dollar bills. We’re generous! That’s totally possible. In fact, I hope we do. You feel better about yourself. You compute the probability as follows:

\[ P(X = 2) = p_X(2) = \frac{8}{120} = \frac{1}{15} \approx 6.7\%. \]
Wait, that’s kinda unlikely. If only we were given a chance to be nice, we would. Dang!
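Continuing the sketch above (reusing the `p` defined there), the marginal of \(X\) is just a row sum:

```python
# Marginal pmf of X: sum the joint pmf over all values of Y
# (reuses p from the previous sketch).
def p_X(x: int) -> float:
    return sum(p(x, y) for y in range(4))

for x in range(3):
    print(f"P(X={x}) = {p_X(x) * 120:.0f}/120")
# -> 56/120, 56/120, 8/120; in particular, P(X=2) = 8/120 = 1/15.
```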
Some Other Cool (and Related) Calculations
Let's compute the probability of having selected at least as many fifty-dollar bills as ten-dollar bills. So, \(P(X \geq Y)\):

\[ P(X \geq Y) = p(0, 0) + p(1, 0) + p(1, 1) + p(2, 0) + p(2, 1) = \frac{1 + 6 + 30 + 3 + 5}{120} = \frac{45}{120} = \frac{3}{8}. \]
Lastly, what about giving up two fifty-dollar bills, given that we selected at least as many fifty-dollar bills as ten-dollar bills? That is, \(P(X = 2 \mid X \geq Y)\):

\[ P(X = 2 \mid X \geq Y) = \frac{P(X = 2,\, X \geq Y)}{P(X \geq Y)} = \frac{8/120}{45/120} = \frac{8}{45} \approx 17.8\%. \]

(Note that \(X = 2\) already forces \(X \geq Y\): with two fifties drawn, only one bill remains, so \(Y \leq 1\).)
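Both quantities are easy to check numerically, again reusing `p` from the earlier sketch:

```python
# P(X >= Y): add up the joint probabilities of every cell with x >= y.
p_ge = sum(p(x, y) for x in range(3) for y in range(4) if x >= y)
print(p_ge)  # 0.375 = 45/120 = 3/8

# Conditional probability: P(X = 2 | X >= Y) = P(X = 2, X >= Y) / P(X >= Y).
p_two_and_ge = sum(p(2, y) for y in range(4) if 2 >= y)
print(p_two_and_ge / p_ge)  # 0.1777... = 8/45
```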
Independence of Random Variables
In some cases, the distribution of one random variable is unaffected by the value of another random variable defined on the same sample space. We refer to such random variables as independent of each other.
Definition: Discrete and Independent Random Variables
Random variables \(X, Y\) are independent if for all \(A, B \subseteq \mathbb{R}\),

\[ P(X \in A,\, Y \in B) = P(X \in A)\, P(Y \in B). \]

If \(X, Y\) are discrete, then we say that they are independent if:

\[ p(x, y) = p_X(x)\, p_Y(y) \quad \text{for all } x, y. \]
So in the motivating example earlier, can we say that \(X, Y\) are independent?
Nope. In case you don’t believe me, consider the following: \(p(2, 2) = 0\) (we only draw 3 bills, so \(X = Y = 2\) is impossible), yet \(p_X(2) = \frac{8}{120} > 0\) and \(p_Y(2) = \frac{30 + 20}{120} = \frac{50}{120} > 0\). So \(p(2, 2) \neq p_X(2)\, p_Y(2)\), and independence fails.
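The same counterexample, checked numerically with the sketch from before:

```python
# Marginal pmf of Y, analogous to p_X (reuses p from the earlier sketch).
def p_Y(y: int) -> float:
    return sum(p(x, y) for x in range(3))

# Independence would require p(x, y) == p_X(x) * p_Y(y) in every cell;
# one failing cell is enough to rule it out.
print(p(2, 2))           # 0.0
print(p_X(2) * p_Y(2))   # (8/120) * (50/120) ~= 0.0278, not 0
```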
Continuous Jointly Distributed Random Variables
The same analysis carries over to the continuous setting.
Definition: Jointly Continuous Pair of Random Variables
\((X, Y)\) is a jointly continuous pair of random variables if there exists a joint density \(f(x, y) \geq 0\) so that for every (measurable) set \(A \subseteq \mathbb{R}^2\),

\[ P((X, Y) \in A) = \iint_A f(x, y)\, dx\, dy. \]
Definition: Continuous Random Variable Marginal Density

If the joint density of \((X, Y)\) is \(f\), then the two marginal densities, which are the densities of \(X\) and \(Y\), are computed by integrating out the other variable:

\[ f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx. \]
Definition: Continuous Random Variable Independence

In the continuous case, independence translates into: \(\forall A, B \subseteq \mathbb{R}\),

\[ P(X \in A,\, Y \in B) = P(X \in A)\, P(Y \in B), \]

which holds exactly when the joint density factors as \(f(x, y) = f_X(x)\, f_Y(y)\).
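As a sanity check on these definitions, here is a small sketch (assuming `numpy` and `scipy` are available; the density below is my own illustrative choice, not part of the breakfast story) using \(f(x, y) = e^{-(x + y)}\) for \(x, y \geq 0\):

```python
import numpy as np
from scipy import integrate

# Illustrative joint density (an assumption for this sketch):
# f(x, y) = exp(-(x + y)) for x, y >= 0, and 0 otherwise.
def f(x: float, y: float) -> float:
    return float(np.exp(-(x + y))) if x >= 0 and y >= 0 else 0.0

# Marginal densities: integrate out the other variable numerically.
def f_X(x: float) -> float:
    return integrate.quad(lambda y: f(x, y), 0, np.inf)[0]

def f_Y(y: float) -> float:
    return integrate.quad(lambda x: f(x, y), 0, np.inf)[0]

x0, y0 = 1.3, 0.7
print(f(x0, y0))           # exp(-2.0) ~= 0.1353
print(f_X(x0) * f_Y(y0))   # same value: the density factors
```

Here the marginals come out as \(f_X(x) = e^{-x}\) and \(f_Y(y) = e^{-y}\), and the joint density is exactly their product, so this pair is independent, unlike the bills from our breakfast story.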