This course will cover the probability theory necessary to solve problems on the Society of Actuaries' (SOA's) Exam P, the first exam in the credentialing process for both the SOA and the Casualty Actuarial Society (CAS). The exam covers material from a first course in calculus-based probability theory. A (non-exhaustive) list of those topics is given in the Topics, Notes, Readings section below. The course will alternate between lectures on topics and informal problem solving sessions. The problem solving sessions will be run by you, the students, with my assistance. Special emphasis will be given to solving questions in the cleverest way possible, since the 'right' or 'long' way may take too long given the time constraints of Exam P. In addition to covering the probability theory included on Exam P, we will learn about the career paths available to actuaries and have guest speakers discuss their experiences as actuaries.

MA 220

Dr. David Darmon | ddarmon [at] monmouth.edu

Howard Hall 241

This is currently a *tentative* listing of topics, in order.

- *Introduction:* What is an actuary? What is actuarial science? The credentialing process for becoming an actuary. Exam P. Tips and tricks for life-long learning.
- *Combinatorial probability:* Finding the probability of an outcome by counting. Combinatorics. Permutations, combinations, multisets, and all of the other ways to assign \(n\) items to \(k\) slots. How combinatorics relates to sampling.
- *Probability:* Probability as a measure on a set. Set operations: union, intersection, difference, and so on. Venn diagrams for reasoning about probabilities. Sample spaces and events. Assigning probabilities to events, in all the ways.
- *Discrete random variables:* How to talk about discrete random variables: probability mass functions, cumulative distribution functions, expectations, and moment generating functions. Discrete random vectors as multidimensional generalizations of discrete random variables.
- *Some parametric discrete distributions:* Uniform. Bernoulli. Descendants of Bernoulli: binomial, negative binomial, geometric. Like binomial, but without replacement: hypergeometric. Poisson.
- *Continuous random variables:* How to talk about continuous random variables: probability density functions, cumulative distribution functions, expectations, and moment generating functions. Mixed-type random variables.
- *Some parametric continuous distributions:* Exponential. Normal. Gamma. Beta. Connections between families of random variables. The Russian inequalities. The law of large numbers. The central limit theorem.
- *Random vectors and multivariate distributions:* How to talk about random vectors: joint probability mass / density functions, joint cumulative distribution functions, and joint moment generating functions. Expectations of random vectors. Conditional random variables and conditional distributions.
- *More techniques in probability theory:* Transformations of random variables. Order statistics. More on the moment generating function. More on variances and covariances via linear algebra. More on conditioning arguments.

I will have office hours at the following four times each week:

- Monday, 10:00—11:00 AM, Howard Hall 241
- Tuesday, 3:00—4:00 PM, Howard Hall 241
- Thursday, 10:00—11:00 AM, Howard Hall 241
- Thursday, 1:30—2:30 PM, Howard Hall 241

I have an open-door policy during those times: you can show up unannounced. If you cannot make the scheduled office hours, please e-mail me about making an appointment.

If you are struggling with the problem sets, having difficulty with any concepts, or just want to chat, please visit me during my office hours. I am here to help.

- 50% for in-class presentation of problem solutions
- 20% for investigation into and reflection on actuarial science
- 15% for inter-class skills work
- 15% for inter-class review

- \([90, 100] \to \text{A}\)
- \([80, 90) \,\,\, \to \text{B}\)
- \([70, 80) \,\,\, \to \text{C}\)
- \([60, 70) \,\,\, \to \text{D}\)
- \([0, 60) \,\,\,\,\,\, \to \text{F}\)
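For concreteness, the cutoffs above can be expressed as a small function (a sketch; the function name is my own):

```python
def letter_grade(score):
    """Map a course average in [0, 100] to a letter grade,
    using the interval cutoffs listed above."""
    if score >= 90:
        return "A"   # [90, 100]
    if score >= 80:
        return "B"   # [80, 90)
    if score >= 70:
        return "C"   # [70, 80)
    if score >= 60:
        return "D"   # [60, 70)
    return "F"       # [0, 60)
```

Note that each interval is closed on the left, so a 90.0 earns an A while an 89.9 earns a B.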

During the informal problem solving sessions, you will spend the first 30 minutes working individually on a collection of problems related to material we have covered in the class to date. I will then select three students (randomly and without replacement) to act as problem solving guides for that session. For the next 40 minutes, the selected students will take turns leading a class discussion on how to solve a problem of their choice from the collection. These discussions are meant to be interactive amongst **all** students, with the **main presentation** coming from the current lead student.
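As a minimal sketch of "randomly and without replacement" (with a placeholder roster), `random.sample` does exactly this kind of draw:

```python
import random

# Placeholder roster; the real class list would go here.
roster = ["Student A", "Student B", "Student C", "Student D", "Student E"]

# random.sample draws without replacement, so no one is
# selected to lead twice in the same session.
guides = random.sample(roster, k=3)
assert len(set(guides)) == 3  # three distinct guides
```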

Problem sets will be assigned at the end of every Thursday meeting, and listed in the Schedule section of this page. Problem sets are due at the beginning of the next class meeting. Problem sets will consist of **required** problems, as well as **suggested** problems.

**Note:** The solutions to the exercises in our textbook are listed in the back of the book, so be sure to write up how you arrived at an answer, not just the answer itself. Do, however, use the provided answers to check your solutions.

One of the most important skills needed to master mathematics is memory. To do mathematics, you need to make connections between concepts stored in your long-term memory, and before you can do that, you need to store those memories in the first place. One of the best methods for strengthening long-term memory is **retrieval practice** (think flash cards) combined with **spaced repetition** (think reviewing flash cards on an intelligent schedule). This is the exact opposite of how many students study, which typically takes the form of browsing notes (and thus skipping over retrieving the information from their own memories) immediately before the information is needed (i.e. 'cramming'). Unfortunately, this is one of the worst ways to commit information to long-term memory, despite the fact that cramming *feels* effective in the short-term. Retrieval practice with spaced repetition is more effective than the browse-and-cram approach, takes less time, and is more enjoyable!

All of this is true in general, and even more so when preparing for the SOA's Exam P. To solve the 30 questions on Exam P within the allotted 3 hours, you need to have a great deal of information at the ready: *e.g.*, How is the sum of \(n\) iid Poisson random variables, each with rate parameter \(\lambda\), distributed? How do I get moments from a moment generating function? What is the Darth Vader rule for quickly evaluating the mean of a random variable with non-negative support? Many of these shortcuts you can pick up, in the short term, by working problems. But for long-term retention, spaced retrieval practice will be both more efficient (less total time) and more effective (longer and stronger retention).
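Facts like these can also be sanity-checked numerically. The sketch below (standard library only; the parameter values are illustrative choices, not from the exam) checks the three facts just mentioned:

```python
import math
import random

random.seed(0)

# 1. The sum of n iid Poisson(lam) variables is Poisson(n*lam).
#    A Poisson's mean and variance both equal its rate, so the
#    simulated mean and variance of the sum should be near n*lam.
def poisson_draw(lam):
    # Knuth's algorithm: multiply uniforms until the running
    # product drops below e^{-lam}; the count of factors is the draw.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

n, lam = 4, 1.5
sums = [sum(poisson_draw(lam) for _ in range(n)) for _ in range(20000)]
mean = sum(sums) / len(sums)
var = sum((s - mean) ** 2 for s in sums) / len(sums)
assert abs(mean - n * lam) < 0.1
assert abs(var - n * lam) < 0.3

# 2. Moments from an MGF: E[X^k] is the k-th derivative of M at t = 0.
#    For Exponential(lam), M(t) = lam / (lam - t); check E[X] = 1/lam
#    with a central finite difference approximating M'(0).
lam = 2.0
M = lambda t: lam / (lam - t)
h = 1e-5
EX = (M(h) - M(-h)) / (2 * h)
assert abs(EX - 1 / lam) < 1e-6

# 3. Darth Vader rule: for X >= 0, E[X] is the integral of the
#    survival function S(x) = P(X > x). For Exponential(lam),
#    S(x) = e^{-lam x}, and the integral should come out to 1/lam.
S = lambda x: math.exp(-lam * x)
dx = 1e-3
integral = sum(S(i * dx) * dx for i in range(20000))  # Riemann sum to x = 20
assert abs(integral - 1 / lam) < 1e-2
```

Simulation and numerical differentiation are no substitute for the closed-form derivations we will do in class, but they are a quick way to catch a misremembered formula while studying.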

As part of inter-class review, you are required to regularly use Anki, and submit your Anki decks via eCampus. See below for details on Anki.

I will start out by providing you with Anki decks, but we will transition to you generating your own Anki decks. This will make the decks more personalized and more meaningful. In the process, you will (re-)learn how to typeset math using LaTeX.

See here for the instructions on submitting your Anki decks via eCampus.

The **required** textbook is:

- Leonard A. Asimow and Mark M. Maxwell,
*Probability & Statistics with Applications: A Problem Solving Text*, 2nd Edition (ACTEX Academic Series, 2015, ISBN: 9781625424723). Link to University Store

We will use Anki for spaced retrieval practice throughout the semester. Anki is open-source, free (as in both *gratis* and *libre*) software. You can download Anki to your personal computer from this link. If you have ever used flashcards, then Anki should be fairly intuitive. If you would like more details, you can find Anki's User Manual here.

**Note:** Anki has both desktop and mobile phone variants. Please use the desktop variant.

- Prior to January 22, Lecture 0:
  **Topics:** Spaced retrieval practice. Pre-class reflection.
  - Pre-class Assignments
- January 22, Lecture 1:
  **Topics:** Introduction to class. Introduction to the Society of Actuaries' Exam P. "Pre-treatment" exam. Study hacks.
  **Sections:** NA
- January 24, Lecture 2:
  **Topics:** General rules of probability. Sets, sample spaces, and events. Correspondence between set operations and Venn diagrams. \( P(A \cap B) \). \( P(A \cup B) \). Conditional probability and statistical independence. Repeated factoring of probabilities. The law of total probability. Bayes's theorem.
  **Sections:** Chapter 2
- January 29, Lecture 3:
  **Topics:** General rules of probability. Problem solving session.
  **Sections:** [Chapter 2]
- January 31, Lecture 4:
  **Topics:** Combinatorial probability. Counting methods and their classification. Sampling.
  **Sections:** Chapter 1
- February 5, Lecture 5:
  **Topics:** Combinatorial probability. Problem solving session.
  **Sections:** [Chapter 1, Chapter 2]
- February 7, Lecture 6:
  **Topics:** Discrete random variables. The probability mass function. The cumulative distribution function. Expectation of functions of discrete random variables. Summaries of central tendency. Summaries of dispersion. Conditional expectations and variances, and their manipulation. Discrete random vectors and their joint distributions. Time permitting: probability generating functions.
  **Sections:** Chapter 3
- February 12, Lecture 7:
  **Topics:** Discrete random variables. Problem solving session.
  **Sections:** [Chapter 1, Chapter 3]
- February 14, Lecture 8:
  **Topics:** Some parametric discrete distributions. Uniform. Bernoulli. Descendants of Bernoulli: binomial, negative binomial, geometric. Like binomial, but without replacement: hypergeometric. Poisson.
  **Sections:** Chapter 4
- February 19, Lecture 9:
  **Topics:** Some parametric discrete distributions. Problem solving session.
  **Sections:** [Chapter 1, Chapter 4]
- February 21, Lecture 10:
  **Topics:** Continuous random variables. The probability density function. The cumulative distribution function. Expectations of functions of continuous random variables. Mixed-type random variables. Applications to insurance. Moment generating functions.
  **Sections:** Chapter 5
- February 26, Lecture 11:
  **Topics:** Continuous random variables. Problem solving session.
  **Sections:** [Chapter 1, Chapter 5]
- February 28, Lecture 12:
  **Topics:** Some parametric continuous distributions. Exponential. Normal. Gamma. Beta. Connections between families of random variables. Red scare: Markov's and Chebyshev's inequalities. The law of large numbers. The central limit theorem. Application of the CLT to approximating a binomial random variable using a normal random variable.
  **Sections:** Chapter 6
- March 5, Lecture 13:
  **Topics:** Some parametric continuous distributions. Problem solving session.
  **Sections:** [Chapter 1, Chapter 6]
- March 7, Lecture 14:
  **Topics:** Random vectors and multivariate distributions. The joint probability mass / density function. The joint cumulative distribution function. Independence and factoring of joint probability functions. Conditional random variables and conditional distributions. Linear dependence: correlation and covariance. Using linear algebra to evaluate variances and covariances. Special cases: the multinomial and multivariate normal distributions. Moment generating functions for random vectors.
  **Sections:** Chapter 7
- March 12, Lecture 15:
  **Topics:** Random vectors and multivariate distributions. Problem solving session.
  **Sections:** [Chapter 1, Chapter 7]
- March 14, Lecture 16:
  **Topics:** More techniques in probability theory. Transformations of random variables. Order statistics. More on the moment generating function. More on variances and covariances via linear algebra. More on conditioning arguments.
  **Sections:** Chapter 8
- March 26, Lecture 17:
  **Topics:** More techniques in probability theory. Problem solving session.
  **Sections:** [Chapter 1, Chapter 8]