Chapter 7: Multivariate Distributions
- State the law of total expectation (aka iterated expectations).
- Explain why \(M_{Y \mid X} = \mu_{Y \mid X}(X)\) is a random variable, and describe ‘what remains random.’
- Use iterated expectations to determine the expected value of \(Y\) when we know how \(Y \mid X = x\) and \(X\) are distributed.
- State the law of total variance (aka law of total variation); see the summary formulas after this list.
- Explain why \(V_{Y \mid X} = \sigma^{2}_{Y \mid X}(X)\) is a random variable, and describe ‘what remains random.’
- Define the covariance between two random variables \(X\) and \(Y\) in terms of an expectation.
- State the shortcut formula for computing the covariance between two random variables \(X\) and \(Y\) (the analogue of the ‘variance shortcut’) in terms of expectations; see the formulas after this list.
- Compute the covariance between two discrete (or continuous) random variables.
- Explain what the covariance characterizes about the distribution of a random vector \((X, Y)\).
- State the covariance between two random variables \(X\) and \(Y\) when \(X\) and \(Y\) are independent.
- Answer: does zero covariance imply independence?
- Determine \(\text{Var}(a X + b Y)\) when \(X\) and \(Y\) are not independent.
- Define the (Pearson) correlation between two random variables \(X\) and \(Y\) in terms of the covariance and variances of \(X\) and \(Y\).
- Define the mean vector \(\mu_{\mathbf{X}}\) and variance-covariance matrix \(\mathbf{V}\) for a random vector \(\mathbf{X} = (X_{1}, X_{2}, \ldots, X_{d})\).
- Use the variance-covariance matrix \(\mathbf{V}\) for a random vector \(\mathbf{X} = (X_{1}, X_{2}, \ldots, X_{d})\) to determine the variance of a linear combination of \(\mathbf{X}\) using linear algebra (see the sketch after this list).
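For reference, the two laws behind the first group of objectives, written with the notation \(\mu_{Y \mid X}(\cdot)\) and \(\sigma^{2}_{Y \mid X}(\cdot)\) used above:

\[
E(Y) = E\!\left[\mu_{Y \mid X}(X)\right],
\qquad
\text{Var}(Y) = E\!\left[\sigma^{2}_{Y \mid X}(X)\right] + \text{Var}\!\left[\mu_{Y \mid X}(X)\right].
\]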
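The covariance and correlation objectives refer to the following standard identities:

\[
\text{Cov}(X, Y) = E\!\left[(X - \mu_{X})(Y - \mu_{Y})\right] = E(XY) - E(X)\,E(Y),
\]
\[
\text{Var}(aX + bY) = a^{2}\,\text{Var}(X) + b^{2}\,\text{Var}(Y) + 2ab\,\text{Cov}(X, Y),
\qquad
\rho_{XY} = \frac{\text{Cov}(X, Y)}{\sqrt{\text{Var}(X)\,\text{Var}(Y)}}.
\]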
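For the last two objectives, here is a minimal NumPy sketch of \(\text{Var}(\mathbf{w}^{\top}\mathbf{X}) = \mathbf{w}^{\top}\mathbf{V}\mathbf{w}\); the mean vector, variance-covariance matrix, and weights below are made-up illustrative values:

```python
import numpy as np

# Hypothetical mean vector and variance-covariance matrix for X = (X1, X2, X3)
mu = np.array([1.0, 2.0, 0.5])
V = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.5, -0.2],
              [0.1, -0.2, 0.8]])

# Coefficients of the linear combination w1*X1 + w2*X2 + w3*X3
w = np.array([1.0, -2.0, 3.0])

mean_wx = w @ mu    # E(w'X)   = w' mu
var_wx = w @ V @ w  # Var(w'X) = w' V w
print(mean_wx, var_wx)
```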