Chapter 7: Multivariate Distributions

  1. Define the joint moment generating function for a random vector \(\mathbf{X}\), and compare it to the moment generating function of a random variable \(X\).
  2. Compute non-central product moments from a joint moment generating function.
  3. Determine marginal moment generating functions from a joint moment generating function.
  4. Recognize that the expectation \(E[g(X) h(Y)]\) factors as \(E[g(X)]\,E[h(Y)]\), for any functions \(g\) and \(h\), when \(X\) and \(Y\) are independent.
  5. Recognize that joint moment generating functions factor when components of the governing random vector are independent.
  6. Use the joint moment generating function of a random vector \(\mathbf{X}\) to determine the moment generating function of a linear combination of the components of \(\mathbf{X}\).
  7. Relate the probability density function for a Gaussian random vector to the probability density function for a Gaussian random variable.
  8. Recognize that a linear combination of Gaussian random variables is itself Gaussian, and determine the distribution of the linear combination from the first and second moments of the random variables.
  9. Evaluate probabilities such as \(P(a X + b Y > c)\) when \(X\) and \(Y\) are Gaussian random variables with known first and second moments.
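
The factorization in objective 5 can be checked numerically. The sketch below (a Monte Carlo illustration, not part of the chapter) draws independent standard normals and verifies that the joint moment generating function \(M_{X,Y}(t_1, t_2) = E[e^{t_1 X + t_2 Y}]\) approximately equals the product \(M_X(t_1) M_Y(t_2)\), and matches the closed form \(e^{(t_1^2 + t_2^2)/2}\) for standard normals.

```python
import math
import random

random.seed(0)
n = 200_000  # sample size for the Monte Carlo averages

# Independent standard normal samples for X and Y
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 1.0) for _ in range(n)]

t1, t2 = 0.3, -0.5  # arbitrary evaluation point of the MGF

# Joint MGF estimate: average of exp(t1*X + t2*Y)
joint = sum(math.exp(t1 * x + t2 * y) for x, y in zip(xs, ys)) / n

# Marginal MGF estimates
mx = sum(math.exp(t1 * x) for x in xs) / n
my = sum(math.exp(t2 * y) for y in ys) / n

# Closed form for a standard normal: M(t) = exp(t^2 / 2)
closed = math.exp((t1 * t1 + t2 * t2) / 2)

# Under independence, joint ≈ mx * my ≈ closed (up to sampling error)
```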
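Objectives 8 and 9 combine into a short computation: for independent Gaussians, \(aX + bY\) is Gaussian with mean \(a\mu_X + b\mu_Y\) and variance \(a^2\sigma_X^2 + b^2\sigma_Y^2\), so \(P(aX + bY > c)\) follows from the normal CDF. The helper below is a hypothetical illustration (the function name and the specific numbers are assumptions, not from the chapter).

```python
from statistics import NormalDist

def prob_linear_combo_exceeds(a, mu_x, var_x, b, mu_y, var_y, c):
    """P(aX + bY > c) for independent X ~ N(mu_x, var_x), Y ~ N(mu_y, var_y).

    Hypothetical helper: aX + bY is Gaussian with mean a*mu_x + b*mu_y
    and variance a^2 * var_x + b^2 * var_y, so the tail probability is
    one minus the normal CDF at c.
    """
    mean = a * mu_x + b * mu_y
    var = a * a * var_x + b * b * var_y
    return 1.0 - NormalDist(mean, var ** 0.5).cdf(c)

# Example: X ~ N(1, 4), Y ~ N(2, 9); P(2X + 3Y > 10)
# The combination has mean 2*1 + 3*2 = 8 and variance 4*4 + 9*9 = 97.
p = prob_linear_combo_exceeds(2, 1, 4, 3, 2, 9, 10)
```

Note that `NormalDist` takes the standard deviation, not the variance, which is why the variance is square-rooted before constructing the distribution.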