Chapter 7: Multivariate Distributions
- State the definition of a continuous random vector.
- Define the joint probability density function \(f(x, y)\) for a bivariate random vector \((X, Y)\).
- Use the joint probability density function \(f(x, y)\) to find the probability that \((X, Y)\) falls in a given region of the plane.
- State the two properties that are sufficient for a function \(f(x, y)\) to be a joint probability density function.
- Define the joint cumulative distribution function \(F(x, y)\) for a bivariate continuous random vector \((X, Y)\).
- Define the marginal probability density functions \(f_{X}\) and \(f_{Y}\) for a bivariate continuous random vector \((X, Y)\).
- Explain why marginal probability density functions are called “marginal.”
- Compute marginal probability density functions from a joint probability density function.
- Compute the joint probability density function of a bivariate random vector \((X, Y)\) from its joint cumulative distribution function.
- Define the expected value \(E[g(X, Y)]\) of a transformation \(g\) of a continuous random vector \((X, Y)\), and compute the expected value given a probability density function for the random vector.
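The objectives above can be illustrated numerically. The sketch below (an assumption, not from the chapter) uses the concrete joint density \(f(x, y) = x + y\) on the unit square and a midpoint-rule grid to check that \(f\) integrates to 1, compute \(P(X \le 1/2, Y \le 1/2)\), recover the marginal \(f_X(x) = x + 1/2\), and approximate \(E[XY]\):

```python
import numpy as np

# Example joint pdf (hypothetical choice for illustration):
# f(x, y) = x + y on [0, 1]^2, and 0 elsewhere.
n = 1000
xs = (np.arange(n) + 0.5) / n          # midpoints of n subintervals of [0, 1]
X, Y = np.meshgrid(xs, xs, indexing="ij")
cell = (1.0 / n) ** 2                  # area of each grid cell

F = X + Y                              # pdf values on the grid

# Property check: the joint pdf must integrate to 1 over the plane.
total = F.sum() * cell                 # ~ 1.0

# Probability of a query region: P(X <= 1/2, Y <= 1/2).
# Exact value: integral of (x + y) over [0, 1/2]^2 = 1/8.
prob = F[(X <= 0.5) & (Y <= 0.5)].sum() * cell

# Marginal f_X(x): integrate the joint pdf over y.
# Exact value: f_X(x) = x + 1/2.
fX = F.sum(axis=1) / n                 # one value per grid point xs[i]

# Expected value of a transformation g(X, Y) = X * Y:
# E[XY] = double integral of x * y * f(x, y) = 1/3.
exy = (X * Y * F).sum() * cell

print(total, prob, fX[0], exy)
```

Swapping in a different density or region only requires changing `F` and the mask; the same grid-sum pattern approximates each double integral in the objectives.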