Chapter 6: Some Continuous Distributions
- State the probability density and cumulative distribution functions for a continuous uniform random variable \(X\) on \((a, b)\), and state its mean and variance in terms of \(a\) and \(b\) (these results are summarized below the list).
- Compare and contrast the moments of the discrete and continuous uniform distributions.
- State the probability density, cumulative distribution, and moment generating functions for an exponential random variable \(X\) with parameter \(\lambda\), and state its mean and variance (summarized below the list).
- Use the tabular method to perform integration by parts (a worked example follows the list).
- State the memoryless property for the exponential distribution, and recognize how this property can simplify computations in which \(X\) is conditioned on an event involving \(X\) itself, such as \(\{X > s\}\).
- State the probability density and moment generating functions for a Gamma random variable \(X\) with parameters \(\alpha > 0\) and \(\lambda > 0\), and state its mean and variance (summarized below the list).
- State the definition of the Gamma function \(\Gamma(z)\), and evaluate the Gamma function for positive integers.
- Relate the Gamma function to the factorial function.
- Give a constructive definition of a Gamma random variable in terms of exponential random variables, and use this constructive definition to compute the mean, variance, and moment generating function of a Gamma random variable.
- Relate the cumulative distribution function for a Gamma random variable with \(\alpha = n \in \mathbb{N}_{> 0}\) to the survival function for a Poisson random variable with a related parameter (see the simulation sketch below the list).
- Explain the correspondence between a Bernoulli process, a geometric random variable, and a negative binomial random variable (events occurring on a discrete clock) and a Poisson process, an exponential random variable, and a Gamma random variable (events occurring on a continuous clock); the limiting argument sketched below makes this correspondence concrete.
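
For quick reference, the standard continuous uniform results named in the first objective, in the notation used above:

\[
f(x) = \frac{1}{b - a} \ \text{ for } a < x < b, \qquad
F(x) =
\begin{cases}
0, & x \le a,\\[4pt]
\dfrac{x - a}{b - a}, & a < x < b,\\[4pt]
1, & x \ge b,
\end{cases}
\]
\[
E[X] = \frac{a + b}{2}, \qquad \operatorname{Var}(X) = \frac{(b - a)^2}{12}.
\]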
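Likewise, the exponential results and the memoryless property (the moment generating function exists for \(t < \lambda\)):

\[
f(x) = \lambda e^{-\lambda x}, \ x \ge 0, \qquad
F(x) = 1 - e^{-\lambda x}, \ x \ge 0, \qquad
M_X(t) = \frac{\lambda}{\lambda - t}, \quad t < \lambda,
\]
\[
E[X] = \frac{1}{\lambda}, \qquad \operatorname{Var}(X) = \frac{1}{\lambda^2}, \qquad
P(X > s + t \mid X > s) = P(X > t) \ \text{ for all } s, t \ge 0.
\]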
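A worked instance of the tabular method for integration by parts, applied here to the mean of an \(\text{Exponential}(\lambda)\) random variable: repeatedly differentiate the \(D\) column, repeatedly integrate the \(I\) column, and attach alternating signs.

\[
\begin{array}{c|c|c}
\text{sign} & D \ (\text{differentiate}) & I \ (\text{integrate}) \\ \hline
+ & x & \lambda e^{-\lambda x} \\
- & 1 & -e^{-\lambda x} \\
+ & 0 & \tfrac{1}{\lambda} e^{-\lambda x}
\end{array}
\]

Multiplying each \(D\) entry by the \(I\) entry one row below it, with the indicated signs, gives

\[
\int_0^{\infty} \lambda x e^{-\lambda x}\,dx
= \Big[ -x e^{-\lambda x} - \tfrac{1}{\lambda} e^{-\lambda x} \Big]_0^{\infty}
= \frac{1}{\lambda},
\]

which is the mean of the \(\text{Exponential}(\lambda)\) distribution stated above.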
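The Gamma results, together with the Gamma function and its relation to the factorial (rate parameterization, so \(\lambda\) matches the exponential parameter above):

\[
f(x) = \frac{\lambda^{\alpha} x^{\alpha - 1} e^{-\lambda x}}{\Gamma(\alpha)}, \ x > 0, \qquad
M_X(t) = \left( \frac{\lambda}{\lambda - t} \right)^{\!\alpha}, \quad t < \lambda, \qquad
E[X] = \frac{\alpha}{\lambda}, \qquad \operatorname{Var}(X) = \frac{\alpha}{\lambda^{2}},
\]
\[
\Gamma(z) = \int_0^{\infty} t^{z - 1} e^{-t}\,dt, \qquad
\Gamma(z + 1) = z\,\Gamma(z), \qquad
\Gamma(n) = (n - 1)! \ \text{ for } n \in \mathbb{N}_{> 0}.
\]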
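Below is a minimal simulation sketch of the last two Gamma objectives, assuming NumPy and SciPy are available; the parameter values and variable names (`lam`, `n`, `t`) are illustrative choices, not from the chapter. It checks numerically that a sum of \(n\) i.i.d. \(\text{Exponential}(\lambda)\) variables behaves like a \(\text{Gamma}(n, \lambda)\) variable, and that \(P(\text{Gamma}(n, \lambda) \le t) = P(\text{Poisson}(\lambda t) \ge n)\).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam, n, t = 2.0, 3, 1.5          # rate, shape (number of arrivals), time horizon

# Constructive definition: a Gamma(n, lam) variable is the sum of n i.i.d.
# Exponential(lam) variables (the waiting time until the n-th arrival).
exp_samples = rng.exponential(scale=1 / lam, size=(100_000, n))
gamma_via_sums = exp_samples.sum(axis=1)

# Empirical CDF at t from the simulated sums.
empirical = (gamma_via_sums <= t).mean()

# Exact Gamma CDF; SciPy parameterizes by shape a and scale = 1/rate.
gamma_cdf = stats.gamma.cdf(t, a=n, scale=1 / lam)

# Poisson survival identity: P(Gamma(n, lam) <= t) = P(Poisson(lam * t) >= n).
poisson_survival = 1 - stats.poisson.cdf(n - 1, mu=lam * t)

print(empirical, gamma_cdf, poisson_survival)  # all three should be close
```

Note the design detail in the middle: SciPy's `gamma` uses a shape/scale parameterization, so the rate \(\lambda\) from the objectives enters as `scale = 1/lam`.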
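One worked limit that makes the discrete-to-continuous correspondence concrete (speeding up the Bernoulli clock while shrinking the success probability): suppose trials occur at times \(1/m, 2/m, \dots\) with success probability \(p = \lambda/m\) each (with \(m\) large enough that \(\lambda/m < 1\)), and let \(T\) be the time of the first success. Then

\[
P(T > t) = \left(1 - \frac{\lambda}{m}\right)^{\lfloor mt \rfloor} \xrightarrow{\ m \to \infty\ } e^{-\lambda t},
\]

which is the survival function of an \(\text{Exponential}(\lambda)\) random variable; summing independent copies of the waiting times gives the corresponding negative binomial \(\to\) Gamma analogue.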