Chapter 7: Multivariate Distributions

  1. State the definition of a continuous random vector.
  2. Define the joint probability density function \(f(x, y)\) for a bivariate continuous random vector \((X, Y)\).
  3. Use the joint probability density function \(f(x, y)\) to find the probability that \((X, Y)\) falls in a given query set \(A \subseteq \mathbb{R}^2\) (see the numerical sketch after this list).
  4. State the two properties that are sufficient for a function \(f(x, y)\) to be a joint probability density function.
  5. Define the joint cumulative distribution function \(F(x, y)\) for a bivariate continuous random vector \((X, Y)\).
  6. Define the marginal probability density functions \(f_{X}\) and \(f_{Y}\) for a bivariate continuous random vector \((X, Y)\).
  7. Explain why marginal probability density functions are called “marginal.”
  8. Compute marginal probability density functions from a joint probability density function (also illustrated in the sketch after this list).
  9. Compute the joint probability density function of a bivariate continuous random vector \((X, Y)\) from its joint cumulative distribution function.
  10. Define the expected value \(E[g(X, Y)]\) of a transformation \(g\) of a continuous random vector \((X, Y)\), and compute that expected value from the joint probability density function of the random vector (see the sketch after this list).
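
The computational objectives above (items 3, 8, and 10) can be previewed numerically. The sketch below is only an illustration under stated assumptions: it assumes the hypothetical joint density \(f(x, y) = x + y\) on the unit square (chosen because the integrals are easy to check by hand, not because it appears in the chapter) and uses SciPy quadrature to approximate a probability over a query set, a marginal density value, and an expected value.

```python
from scipy import integrate

# Hypothetical joint pdf: f(x, y) = x + y on the unit square, 0 elsewhere.
# It is nonnegative and integrates to 1, so it satisfies the two pdf properties (item 4).
def joint_pdf(x, y):
    inside = (0.0 <= x <= 1.0) and (0.0 <= y <= 1.0)
    return x + y if inside else 0.0

# Item 3: P((X, Y) in A) for the query set A = [0, 1/2] x [0, 1/2].
# dblquad integrates func(y, x) with y as the inner variable.
prob, _ = integrate.dblquad(lambda y, x: joint_pdf(x, y), 0, 0.5, 0, 0.5)
print(prob)            # ~0.125 (exact value 1/8)

# Item 8: marginal density f_X(x), the integral of f(x, y) over y.
def marginal_X(x):
    value, _ = integrate.quad(lambda y: joint_pdf(x, y), 0, 1)
    return value
print(marginal_X(0.3))  # ~0.8 (exact value 0.3 + 1/2)

# Item 10: E[g(X, Y)] for g(x, y) = x*y, the double integral of g times the pdf.
e_xy, _ = integrate.dblquad(lambda y, x: x * y * joint_pdf(x, y), 0, 1, 0, 1)
print(e_xy)            # ~0.3333 (exact value 1/3)
```

The quadrature calls simply mirror the defining integrals; in the chapter itself the same quantities would be obtained by evaluating those integrals analytically.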