Chapter 6: Some Continuous Distributions

  1. State the probability density and cumulative distribution functions for a continuous uniform random variable \(X\) on \((a, b)\), and state its mean and variance in terms of \(a\) and \(b\) (a numerical check of these moments appears after this list).
  2. Compare and contrast the moments of the discrete and continuous uniform distributions.
  3. State the probability density, cumulative distribution, and moment generating functions for an exponential random variable \(X\) with parameter \(\lambda\), and state its mean and variance.
  4. Use the tabular method to perform integration by parts (a worked example follows this list).
  5. State the memoryless property for the exponential distribution, and recognize how this property can simplify computations that involve conditioning a random variable on an event defined by that same variable, such as \(P(X > s + t \mid X > s)\).
  6. State the probability density and moment generating functions for a Gamma random variable \(X\) with parameters \((\alpha > 0, \lambda > 0)\), and state its mean and variance.
  7. State the definition of the Gamma function \(\Gamma(z)\), and evaluate the Gamma function for positive integers.
  8. Relate the Gamma function to the factorial function.
  9. Give a constructive definition of a Gamma random variable in terms of exponential random variables, and use this constructive definition to compute the mean, variance, and moment generating function of a Gamma random variable (a simulation-based check appears after this list).
  10. Relate the cumulative distribution function for a Gamma random variable with \(\alpha = n \in \mathbb{N}_{> 0}\) to the survival function for a Poisson random variable with a related parameter (a numerical check appears after this list).
  11. Explain the correspondence between a Bernoulli process, a geometric random variable, and a negative binomial random variable (events occurring on a discrete clock) and a Poisson process, an exponential random variable, and a Gamma random variable (events occurring on a continuous clock).
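
The sketch below is a minimal numerical check, using `scipy.stats`, of the facts named in items 1, 3, and 5: the mean and variance of the continuous uniform and exponential distributions, and the memoryless property of the exponential. The particular values of \(a\), \(b\), \(\lambda\), \(s\), and \(t\) are illustrative choices, not part of the chapter.

```python
import numpy as np
from scipy import stats

# Illustrative parameter choices (not from the chapter).
a, b = 2.0, 5.0      # uniform on (a, b)
lam = 1.5            # exponential rate

# Uniform(a, b): mean (a + b)/2, variance (b - a)^2 / 12.
U = stats.uniform(loc=a, scale=b - a)
assert np.isclose(U.mean(), (a + b) / 2)
assert np.isclose(U.var(), (b - a) ** 2 / 12)

# Exponential(lambda): mean 1/lambda, variance 1/lambda^2.
X = stats.expon(scale=1 / lam)
assert np.isclose(X.mean(), 1 / lam)
assert np.isclose(X.var(), 1 / lam ** 2)

# Memoryless property: P(X > s + t | X > s) = P(X > t).
s, t = 0.7, 1.3
assert np.isclose(X.sf(s + t) / X.sf(s), X.sf(t))

print("uniform, exponential, and memoryless checks passed")
```

The survival-function ratio in the final check is exactly the conditional probability \(P(X > s + t \mid X > s)\) from item 5.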
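
For item 4, here is one common layout of the tabular method applied to \(\int x\,\lambda e^{-\lambda x}\,dx\), the integral that arises when computing the mean of an exponential random variable. This is a sketch of one standard presentation of the table; other layouts of the same bookkeeping exist. The middle column is repeatedly differentiated, the right column is repeatedly integrated, and products are taken diagonally (each differentiated entry times the integrated entry one row lower) with the alternating signs shown on the left.

\[
\begin{array}{c|c|c}
\text{sign} & \text{differentiate} & \text{integrate} \\
\hline
+ & x & \lambda e^{-\lambda x} \\
- & 1 & -e^{-\lambda x} \\
+ & 0 & \tfrac{1}{\lambda} e^{-\lambda x}
\end{array}
\qquad\Longrightarrow\qquad
\int x\,\lambda e^{-\lambda x}\,dx = -x e^{-\lambda x} - \tfrac{1}{\lambda} e^{-\lambda x} + C.
\]

Evaluating this antiderivative from \(0\) to \(\infty\) gives \(E[X] = \tfrac{1}{\lambda}\), matching the mean stated in item 3.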
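
The constructive definition in item 9, that a Gamma random variable with integer shape \(\alpha = n\) and rate \(\lambda\) is a sum of \(n\) independent Exponential(\(\lambda\)) random variables, and the factorial identity \(\Gamma(n) = (n - 1)!\) from items 7 and 8, can be checked with the short simulation sketch below; the sample size and parameter values are arbitrary illustrative choices.

```python
import math

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the chapter).
n, lam = 4, 2.0
num_samples = 200_000

# Gamma(n, lam) built constructively: a sum of n i.i.d. Exponential(lam) draws.
exp_draws = rng.exponential(scale=1 / lam, size=(num_samples, n))
gamma_draws = exp_draws.sum(axis=1)

# Sample moments should roughly match the mean n/lam and variance n/lam^2.
print("sample mean:", gamma_draws.mean(), " theory:", n / lam)
print("sample var: ", gamma_draws.var(), " theory:", n / lam**2)

# Gamma function at positive integers: Gamma(n) = (n - 1)!.
for k in range(1, 8):
    assert math.isclose(math.gamma(k), math.factorial(k - 1))
```

Because the exponential summands are independent, the moment generating function of the sum is the product \(\left(\tfrac{\lambda}{\lambda - t}\right)^n\) for \(t < \lambda\), which is the Gamma moment generating function referenced in item 6.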
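
The relationship in item 10 says that for integer shape \(n\), \(P(\mathrm{Gamma}(n, \lambda) \le t) = P(\mathrm{Poisson}(\lambda t) \ge n)\): the \(n\)th arrival of a rate-\(\lambda\) Poisson process has occurred by time \(t\) exactly when at least \(n\) events have occurred by time \(t\). A minimal numerical check with `scipy.stats`, again with illustrative parameter values:

```python
import numpy as np
from scipy import stats

# Illustrative parameters (not from the chapter).
n, lam, t = 3, 2.0, 1.5

# Waiting time for the n-th arrival of a rate-lam Poisson process: Gamma(n, lam).
gamma_cdf = stats.gamma.cdf(t, a=n, scale=1 / lam)

# At least n arrivals by time t: Poisson(lam * t) survival function at n - 1.
poisson_tail = stats.poisson.sf(n - 1, mu=lam * t)

assert np.isclose(gamma_cdf, poisson_tail)
print(gamma_cdf, poisson_tail)
```

The same identity, read across all \(n\), is the continuous-clock side of the correspondence described in item 11.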