Section 5.8: Random Vectors and Matrices

  1. Explain how a random vector or random matrix is constructed from random variables.
  2. Define the expectation of a random vector, and compute the expectation of random vectors arising in simple linear regression.
  3. Compute inner and outer products between two vectors.
  4. Define the variance-covariance matrix of a random vector, and compute the variance-covariance matrices of random vectors arising in simple linear regression.
  5. Compute the expectation and variance-covariance matrix of a matrix-vector product \(\mathbf{A} \mathbf{Y}\) where \(\mathbf{A}\) is a constant matrix and \(\mathbf{Y}\) is a random vector with mean \(\boldsymbol{\mu}_{\mathbf{Y}}\) and variance-covariance matrix \(\boldsymbol{\Sigma}_{\mathbf{Y}}\).
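The computations in these objectives can be sketched in a few lines of NumPy. The mean vector, variance-covariance matrix, and matrix \(\mathbf{A}\) below are made-up values for illustration; the key facts exercised are the inner/outer products and the linear-transformation rules \(\mathrm{E}[\mathbf{A}\mathbf{Y}] = \mathbf{A}\boldsymbol{\mu}_{\mathbf{Y}}\) and \(\mathrm{Var}(\mathbf{A}\mathbf{Y}) = \mathbf{A}\boldsymbol{\Sigma}_{\mathbf{Y}}\mathbf{A}^{T}\).

```python
import numpy as np

# Illustrative mean vector and variance-covariance matrix of a random vector Y
mu_Y = np.array([1.0, 2.0, 3.0])
Sigma_Y = np.array([[2.0, 0.5, 0.0],
                    [0.5, 1.0, 0.3],
                    [0.0, 0.3, 1.5]])

# Inner and outer products of two vectors
u = np.array([1.0, 0.0, 2.0])
v = np.array([3.0, 1.0, 1.0])
inner = u @ v           # scalar: sum of elementwise products
outer = np.outer(u, v)  # 3x3 matrix with (i, j) entry u_i * v_j

# For a constant matrix A, the linear-transformation rules give
# E[AY] = A mu_Y  and  Var(AY) = A Sigma_Y A^T
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, -1.0]])
mean_AY = A @ mu_Y
cov_AY = A @ Sigma_Y @ A.T
```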

Section 5.9: Simple Linear Regression Model in Matrix Terms

  1. State the Simple Linear Regression model as a matrix-vector equation.
  2. State the assumptions of the Simple Linear Regression model using matrix-vector notation.
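Written out for \(n\) observations, the matrix-vector statement these objectives refer to is

\[
\mathbf{Y} = \mathbf{X} \boldsymbol{\beta} + \boldsymbol{\varepsilon},
\qquad
\begin{pmatrix} Y_{1} \\ \vdots \\ Y_{n} \end{pmatrix}
=
\begin{pmatrix} 1 & X_{1} \\ \vdots & \vdots \\ 1 & X_{n} \end{pmatrix}
\begin{pmatrix} \beta_{0} \\ \beta_{1} \end{pmatrix}
+
\begin{pmatrix} \varepsilon_{1} \\ \vdots \\ \varepsilon_{n} \end{pmatrix},
\]

with the error assumptions expressed compactly as \(\mathrm{E}[\boldsymbol{\varepsilon}] = \mathbf{0}\) and \(\mathrm{Var}(\boldsymbol{\varepsilon}) = \sigma^{2} \mathbf{I}\).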

Section 5.10: Least Squares Estimation of Regression Parameters

  1. Relate minimizing the quadratic function \(L(b) = (y - xb)^{2}\) using scalar calculus to minimizing the quadratic form \(L(\mathbf{b}) = (\mathbf{y} - \mathbf{X} \mathbf{b})^{T} (\mathbf{y} - \mathbf{X} \mathbf{b})\) using matrix calculus.
  2. State the least squares estimator \(\mathbf{b}\) for a simple linear regression model via matrix algebra involving the design matrix \(\mathbf{X}\) and the vector of responses \(\mathbf{Y}\).
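A minimal NumPy sketch of the least squares solution \(\mathbf{b} = (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\mathbf{Y}\); the data values here are made up for illustration:

```python
import numpy as np

# Illustrative data for a simple linear regression
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Design matrix X: a column of ones (intercept) and the predictor values
X = np.column_stack([np.ones_like(x), x])

# Solve the normal equations (X^T X) b = X^T y;
# np.linalg.solve is preferred to forming the inverse explicitly
b = np.linalg.solve(X.T @ X, X.T @ y)

# Fitted values and residuals follow directly
y_hat = X @ b
residuals = y - y_hat
```

The residual vector is orthogonal to the columns of \(\mathbf{X}\), i.e. \(\mathbf{X}^{T}(\mathbf{y} - \mathbf{X}\mathbf{b}) = \mathbf{0}\), which is exactly the normal equations restated.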

Section 5.13: Inferences in Regression Analysis

  1. Derive the mean and variance-covariance matrix of the least squares estimator \(\mathbf{b}\) under the Simple Linear Regression model assumptions.
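The derivation in this objective applies the \(\mathbf{A}\mathbf{Y}\) rules with \(\mathbf{A} = (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\), giving \(\mathrm{E}[\mathbf{b}] = \boldsymbol{\beta}\) and \(\mathrm{Var}(\mathbf{b}) = \sigma^{2}(\mathbf{X}^{T}\mathbf{X})^{-1}\). A Monte Carlo sketch (design, parameters, and error variance below are hypothetical) checks both facts by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative design matrix and hypothetical true parameters
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X = np.column_stack([np.ones_like(x), x])
beta = np.array([1.0, 0.5])
sigma = 2.0

# Theory: E[b] = beta and Var(b) = sigma^2 (X^T X)^{-1}
cov_theory = sigma**2 * np.linalg.inv(X.T @ X)

# Monte Carlo check: simulate many response vectors and refit b each time
n_sims = 20000
E = rng.normal(0.0, sigma, size=(n_sims, len(x)))  # each row is one error draw
Y = X @ beta + E                                   # X @ beta broadcasts over rows
B = np.linalg.solve(X.T @ X, X.T @ Y.T).T          # each row is one estimate b

mean_b = B.mean(axis=0)          # close to beta (unbiasedness)
cov_b = np.cov(B, rowvar=False)  # close to cov_theory
```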