Chapter 7: Multivariate Distributions
- Define the joint moment generating function for a random vector \(\mathbf{X}\), and compare it to the moment generating function of a random variable \(X\) (the key identities used in the next several objectives are collected after this list).
- Compute non-central product moments from a joint moment generating function.
- Determine marginal moment generating functions from a joint moment generating function.
- Recognize that, when \(X\) and \(Y\) are independent, the expectation \(E[g(X) h(Y)]\) factors as \(E[g(X)]\,E[h(Y)]\) for any functions \(g\) and \(h\).
- Recognize that a joint moment generating function factors into the product of the marginal moment generating functions when the components of the underlying random vector are independent.
- Use the joint moment generating function of a random vector \(\mathbf{X}\) to determine the moment generating function of a linear combination of the components of \(\mathbf{X}\).
- Relate the probability density function for a Gaussian random vector to the probability density function for a Gaussian random variable (both densities appear in the sketch after this list).
- Recognize that a linear combination of Gaussian random variables is itself Gaussian, and determine the distribution of that linear combination from the first and second moments of the variables involved.
- Evaluate probabilities such as \(P(a X + b Y > c)\) when \(X\) and \(Y\) are Gaussian random variables with known first and second moments (a numerical sketch follows this list).
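
As a quick reference for the moment generating function objectives, the standard bivariate identities are collected below; the notation \(M_{X,Y}(t_1, t_2)\) for the joint moment generating function and \(M_{\mathbf{X}}(\mathbf{t})\) for the vector case is assumed here.

\[
M_{\mathbf{X}}(\mathbf{t}) = E\!\left[e^{\mathbf{t}^\top \mathbf{X}}\right],
\qquad
M_{X,Y}(t_1, t_2) = E\!\left[e^{t_1 X + t_2 Y}\right],
\]
\[
E\!\left[X^j Y^k\right] = \left.\frac{\partial^{\,j+k}}{\partial t_1^{\,j}\,\partial t_2^{\,k}}\, M_{X,Y}(t_1, t_2)\right|_{t_1 = t_2 = 0},
\qquad
M_X(t) = M_{X,Y}(t, 0),
\]
\[
M_{aX + bY}(t) = M_{X,Y}(at, bt),
\qquad
M_{X,Y}(t_1, t_2) = M_X(t_1)\, M_Y(t_2) \quad \text{when } X \text{ and } Y \text{ are independent.}
\]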
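
For the Gaussian objectives, the scalar and vector densities, and the distribution of a linear combination of jointly Gaussian variables, are sketched below; \(\boldsymbol{\mu}\) and \(\Sigma\) denote the mean vector and covariance matrix of the \(n\)-dimensional vector \(\mathbf{X}\).

\[
f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),
\qquad
f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}\,|\Sigma|^{1/2}}\exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^\top \Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\right),
\]
\[
aX + bY \sim N\!\left(a\mu_X + b\mu_Y,\; a^2\sigma_X^2 + 2ab\,\operatorname{Cov}(X, Y) + b^2\sigma_Y^2\right)
\quad \text{for jointly Gaussian } X, Y.
\]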
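
A minimal numerical sketch for the last objective, assuming \(X\) and \(Y\) are jointly Gaussian with known means, variances, and covariance; the function name, parameter names, and example numbers are illustrative, and SciPy's `norm.sf` is used for the Gaussian survival function.

```python
from scipy.stats import norm

def prob_linear_combo_exceeds(a, b, c, mu_x, mu_y, var_x, var_y, cov_xy=0.0):
    """P(aX + bY > c) for jointly Gaussian X, Y with the given moments.

    aX + bY is Gaussian with mean a*mu_x + b*mu_y and
    variance a^2*var_x + 2*a*b*cov_xy + b^2*var_y.
    """
    mean = a * mu_x + b * mu_y
    std = (a**2 * var_x + 2 * a * b * cov_xy + b**2 * var_y) ** 0.5
    return norm.sf(c, loc=mean, scale=std)  # sf(c) = 1 - CDF(c) = P(aX + bY > c)

# Example with hypothetical numbers: X ~ N(1, 4), Y ~ N(-2, 9), Cov(X, Y) = 1.
print(prob_linear_combo_exceeds(a=2, b=3, c=0,
                                mu_x=1, mu_y=-2, var_x=4, var_y=9, cov_xy=1))
```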