Chapter 7: Multivariate Distributions

  1. State the law of total expectation (aka iterated expectations).
  2. Explain why \(M_{Y \mid X} = \mu_{Y \mid X}(X)\) is a random variable, and describe ‘what remains random.’
  3. Use iterated expectations to determine the expected value of \(Y\) when we know how \(Y \mid X = x\) and \(X\) are distributed.
  4. State the law of total variance.
  5. Explain why \(V_{Y \mid X} = \sigma^{2}_{Y \mid X}(X)\) is a random variable, and describe ‘what remains random.’
  6. Define the covariance between two random variables \(X\) and \(Y\) in terms of an expectation.
  7. State the ‘covariance shortcut’ for computing the covariance between two random variables \(X\) and \(Y\) in terms of expectations.
  8. Compute the covariance between two discrete (continuous) random variables.
  9. Explain what the covariance characterizes about the distribution of a random vector \((X, Y)\).
  10. State the covariance between two random variables \(X\) and \(Y\) when \(X\) and \(Y\) are independent.
  11. Answer: does zero covariance imply independence?
  12. Determine \(\text{Var}(a X + b Y)\) when \(X\) and \(Y\) are not independent.
  13. Define the (Pearson) correlation between two random variables \(X\) and \(Y\) in terms of the covariance and variances of \(X\) and \(Y\).
  14. Define the mean vector \(\mu_{\mathbf{X}}\) and variance-covariance matrix \(\mathbf{V}\) for a random vector \(\mathbf{X} = (X_{1}, X_{2}, \ldots, X_{d})\).
  15. Use the variance-covariance matrix \(\mathbf{V}\) for a random vector \(\mathbf{X} = (X_{1}, X_{2}, \ldots, X_{d})\) to compute the variance of a linear combination of \(\mathbf{X}\) via linear algebra.
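The first five objectives (iterated expectations and the law of total variance) can be checked numerically on a small discrete example. Below is a minimal sketch in Python; the pmfs for \(X\) and \(Y \mid X = x\) are made-up numbers chosen for illustration, not taken from the text.

```python
# Hypothetical example: X is Bernoulli(0.3), and Y | X = x has the small
# conditional pmfs below. We verify E[Y] = E[E[Y | X]] and
# Var(Y) = E[Var(Y | X)] + Var(E[Y | X]).

p_x = {0: 0.7, 1: 0.3}          # marginal pmf of X
p_y_given_x = {                 # conditional pmfs of Y | X = x
    0: {0: 0.5, 1: 0.5},
    1: {1: 0.2, 2: 0.8},
}

def mean(pmf):
    return sum(v * p for v, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((v - m) ** 2 * p for v, p in pmf.items())

# Conditional mean and variance as functions of x: mu_{Y|X}(x), sigma^2_{Y|X}(x).
# Plugging in the random X (instead of a fixed x) is what makes M_{Y|X} and
# V_{Y|X} random variables: what remains random is X itself.
mu = {x: mean(p_y_given_x[x]) for x in p_x}
sig2 = {x: var(p_y_given_x[x]) for x in p_x}

# Right-hand sides: average over the distribution of X
e_y_iterated = sum(mu[x] * p_x[x] for x in p_x)                 # E[E[Y|X]]
var_y_total = (sum(sig2[x] * p_x[x] for x in p_x)               # E[Var(Y|X)]
               + sum((mu[x] - e_y_iterated) ** 2 * p_x[x]
                     for x in p_x))                             # Var(E[Y|X])

# Left-hand sides: build the marginal pmf of Y and compute directly
p_y = {}
for x, px in p_x.items():
    for y, py in p_y_given_x[x].items():
        p_y[y] = p_y.get(y, 0.0) + px * py

assert abs(mean(p_y) - e_y_iterated) < 1e-9
assert abs(var(p_y) - var_y_total) < 1e-9
print(e_y_iterated, var_y_total)
```

Note that both laws hold exactly here because the sums range over the full (finite) supports; no simulation is needed.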
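For the covariance objectives, the sketch below (with a hypothetical joint pmf) computes \(\text{Cov}(X, Y)\) two ways — from the definition \(E[(X - E X)(Y - E Y)]\) and from the shortcut \(E[XY] - E[X]\,E[Y]\) — then forms the Pearson correlation and checks \(\text{Var}(aX + bY) = a^2 \text{Var}(X) + b^2 \text{Var}(Y) + 2ab\,\text{Cov}(X, Y)\).

```python
from math import sqrt, isclose

# Hypothetical joint pmf of the discrete random vector (X, Y)
p = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}

ex = sum(x * q for (x, y), q in p.items())       # E[X]
ey = sum(y * q for (x, y), q in p.items())       # E[Y]
exy = sum(x * y * q for (x, y), q in p.items())  # E[XY]

# Covariance from the definition and from the shortcut agree
cov_def = sum((x - ex) * (y - ey) * q for (x, y), q in p.items())
cov_shortcut = exy - ex * ey
assert isclose(cov_def, cov_shortcut)

vx = sum((x - ex) ** 2 * q for (x, y), q in p.items())  # Var(X)
vy = sum((y - ey) ** 2 * q for (x, y), q in p.items())  # Var(Y)
rho = cov_def / (sqrt(vx) * sqrt(vy))                   # Pearson correlation

# Variance of a linear combination when X and Y are dependent
a, b = 2.0, -3.0
ez = sum((a * x + b * y) * q for (x, y), q in p.items())
var_z = sum((a * x + b * y - ez) ** 2 * q for (x, y), q in p.items())
assert isclose(var_z, a**2 * vx + b**2 * vy + 2 * a * b * cov_def)
print(cov_def, rho)
```

Here the covariance is positive, reflecting that large \(X\) tends to occur with large \(Y\) under this pmf; the converse question (whether zero covariance implies independence) is not settled by any single example, since uncorrelated but dependent pairs exist.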
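For the random-vector objectives, one way to see the quadratic-form identity \(\text{Var}(\mathbf{a}^{\top} \mathbf{X}) = \mathbf{a}^{\top} \mathbf{V} \mathbf{a}\) in action is to estimate \(\mu_{\mathbf{X}}\) and \(\mathbf{V}\) from a sample and compare the two computations. A NumPy sketch follows; the distribution parameters and coefficient vector are illustrative choices, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
# Sample a 3-dimensional random vector X (hypothetical Gaussian parameters)
X = rng.multivariate_normal(
    mean=[1.0, -2.0, 0.5],
    cov=[[2.0, 0.3, 0.0], [0.3, 1.0, -0.4], [0.0, -0.4, 0.5]],
    size=100_000,
)

mu_hat = X.mean(axis=0)            # estimate of the mean vector mu_X
V_hat = np.cov(X, rowvar=False)    # estimate of the variance-covariance matrix V

a = np.array([1.0, 2.0, -1.0])     # coefficients of the linear combination
var_direct = np.var(X @ a, ddof=1) # Var(a1*X1 + a2*X2 + a3*X3) from the sample
var_quad = a @ V_hat @ a           # a^T V a via linear algebra

# The two computations agree exactly (up to floating-point error), since
# the sample covariance identity holds for any dataset, not just in the limit
assert np.isclose(var_direct, var_quad)
print(var_direct, var_quad)
```

With the parameters above, the population value is \(\mathbf{a}^{\top} \mathbf{V} \mathbf{a} = 9.3\), so both estimates should land near 9.3 for a sample this large.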