Section 5.8: Random Vectors and Matrices
- Explain how a random vector or random matrix is constructed from random variables.
- Define the expectation of a random vector, and compute the expectation of random vectors arising in simple linear regression.
- Compute inner and outer products between two vectors.
- Define the variance-covariance matrix of a random vector, and compute the variance-covariance matrices of random vectors arising in simple linear regression.
- Compute the expectation and variance-covariance matrix of a matrix-vector product \(\mathbf{A} \mathbf{Y}\) where \(\mathbf{A}\) is a constant matrix and \(\mathbf{Y}\) is a random vector with mean \(\boldsymbol{\mu}_{\mathbf{Y}}\) and variance-covariance matrix \(\boldsymbol{\Sigma}_{\mathbf{Y}}\).
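The objectives above can be checked numerically. Below is a minimal numpy sketch, using assumed illustrative values for \(\boldsymbol{\mu}_{\mathbf{Y}}\), \(\boldsymbol{\Sigma}_{\mathbf{Y}}\), and \(\mathbf{A}\), that computes inner and outer products and applies the rules \(\mathrm{E}[\mathbf{A}\mathbf{Y}] = \mathbf{A}\boldsymbol{\mu}_{\mathbf{Y}}\) and \(\mathrm{Var}(\mathbf{A}\mathbf{Y}) = \mathbf{A}\boldsymbol{\Sigma}_{\mathbf{Y}}\mathbf{A}^{T}\):

```python
import numpy as np

# Assumed (hypothetical) mean vector and variance-covariance matrix of Y
mu_Y = np.array([1.0, 2.0])
Sigma_Y = np.array([[2.0, 0.5],
                    [0.5, 1.0]])

# A constant matrix A
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# E[AY] = A mu_Y  and  Var(AY) = A Sigma_Y A^T
mean_AY = A @ mu_Y
cov_AY = A @ Sigma_Y @ A.T

# Inner and outer products of two vectors u and v
u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
inner = u @ v            # scalar u^T v
outer = np.outer(u, v)   # matrix u v^T
```

Note that `cov_AY` comes out symmetric, as any variance-covariance matrix must be.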
Section 5.9: Simple Linear Regression Model in Matrix Terms
- State the Simple Linear Regression model as a matrix-vector equation.
- State the assumptions of the Simple Linear Regression model using matrix-vector notation.
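For reference, the model and assumptions these objectives refer to can be written in matrix-vector form as:

```latex
\[
\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon},
\qquad
\mathbf{Y} = \begin{bmatrix} Y_1 \\ \vdots \\ Y_n \end{bmatrix},\;
\mathbf{X} = \begin{bmatrix} 1 & X_1 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix},\;
\boldsymbol{\beta} = \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix},\;
\boldsymbol{\varepsilon} = \begin{bmatrix} \varepsilon_1 \\ \vdots \\ \varepsilon_n \end{bmatrix},
\]
with the assumptions
\[
\mathrm{E}[\boldsymbol{\varepsilon}] = \mathbf{0}
\qquad \text{and} \qquad
\mathrm{Var}(\boldsymbol{\varepsilon}) = \sigma^{2}\mathbf{I}.
\]
```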
Section 5.10: Least Squares Estimation of Regression Parameters
- Relate minimizing the quadratic function \(L(b) = (y - xb)^{2}\) using scalar calculus to minimizing the quadratic form \(L(\mathbf{b}) = (\mathbf{y} - \mathbf{X} \mathbf{b})^{T} (\mathbf{y} - \mathbf{X} \mathbf{b})\) using matrix calculus.
- State the least squares estimator \(\mathbf{b}\) for a simple linear regression model via matrix algebra involving the design matrix \(\mathbf{X}\) and the vector of responses \(\mathbf{Y}\).
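As a numerical sketch of this objective, the snippet below (using a small assumed data set, for illustration only) computes \(\mathbf{b} = (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\mathbf{Y}\) and confirms it matches the familiar scalar-calculus formulas for the slope and intercept:

```python
import numpy as np

# Assumed small data set (hypothetical values, for illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 8.1])

# Design matrix X: a column of ones (intercept) and the predictor column
X = np.column_stack([np.ones_like(x), x])

# Least squares estimator b = (X^T X)^{-1} X^T y,
# computed by solving the normal equations (X^T X) b = X^T y
b = np.linalg.solve(X.T @ X, X.T @ y)

# Scalar-calculus SLR formulas for comparison
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
```

Using `np.linalg.solve` on the normal equations avoids forming the explicit inverse \((\mathbf{X}^{T}\mathbf{X})^{-1}\), which is both cheaper and numerically safer.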
Section 5.13: Inferences in Regression Analysis
- Derive the mean and variance-covariance matrix of the least squares estimator \(\mathbf{b}\) under the Simple Linear Regression model assumptions.
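The derivation this objective asks for follows directly from the rules for \(\mathbf{A}\mathbf{Y}\) in Section 5.8, with \(\mathbf{A} = (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\) treated as a constant matrix:

```latex
\begin{align*}
\mathrm{E}[\mathbf{b}]
  &= \mathrm{E}\!\left[(\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\mathbf{Y}\right]
   = (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\,\mathrm{E}[\mathbf{Y}]
   = (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\mathbf{X}\boldsymbol{\beta}
   = \boldsymbol{\beta}, \\[4pt]
\mathrm{Var}(\mathbf{b})
  &= (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\,\mathrm{Var}(\mathbf{Y})\,
     \mathbf{X}(\mathbf{X}^{T}\mathbf{X})^{-1}
   = (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\,\sigma^{2}\mathbf{I}\,
     \mathbf{X}(\mathbf{X}^{T}\mathbf{X})^{-1}
   = \sigma^{2}(\mathbf{X}^{T}\mathbf{X})^{-1},
\end{align*}
```

using \(\mathrm{E}[\mathbf{Y}] = \mathbf{X}\boldsymbol{\beta}\) and \(\mathrm{Var}(\mathbf{Y}) = \sigma^{2}\mathbf{I}\) from the model assumptions, and the symmetry of \((\mathbf{X}^{T}\mathbf{X})^{-1}\).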