MATH 2030 3.00 (York University)
Elementary Probability
Lecture Notes
The lecture notes linked below cover the standard material treated
in MATH 2030 (Elementary Probability), though the final topics in the
course will vary from year to year. The text most often used for this
course at York is Probability, by Pitman (Springer Verlag).
When we follow that text, we do not necessarily cover every topic,
nor do we necessarily treat the topics in the strict order the textbook uses.
With this in mind, here is the basic list of topics I cover when I teach
this course, in the order I treat them.
Tom Salisbury

Part I
 The model for probabilities, random variables, and events
(Sections 1.1–1.3) [omit: empirical distributions]
 Counting (Appendix 1)

Part II
 Independence and Conditional Probability (Sections 1.4 and 1.6)
 Bayes' rule (Section 1.5)

Part III
 Representing discrete and continuous distributions (Sections 3.1 [pp. 140–141] and 4.1 [pp. 259–271])
 Cumulative distribution functions and their relation to densities
and discrete distributions (Section 4.5)
 Using cdfs to compute distributions
of transformed random variables (Section 4.5, pp. 320–323)

Part IV
 Binomial and Normal distributions (Sections 2.1 and 2.2)
 Normal approximations to the binomial distribution (Section 2.2) [omit: skew normal approximation]
 Expectations for discrete and continuous random variables (Section 3.2, and pp. 273–275 of Section 4.1), including the method of indicators.
 Variances (pp. 185–189 of Section 3.3), along with calculation of means and variances for distributions like the Binomial and Normal.
 The hypergeometric distribution (Sections 2.5 and 3.6)

Part V
 Independence of random variables (p. 151 of Section 3.1), and its consequences for expectations and variances (p. 193 of Section 3.3).
 The law of large numbers, Chebyshev's inequality, and the Central Limit Theorem (pp. 191–197 of Section 3.3)
 Normal approximations to more general
sums of independent random variables (p. 196 of Section 3.3).
 The Poisson distribution and Poisson approximation (Sections 2.4 and 3.5)
 The geometric distribution (Section 3.4)
 The exponential distribution and relation to the Poisson (Section 4.2)
 The negative binomial and gamma distributions (Sections 3.4 and 4.2; these were mentioned, but you are not responsible for them)
 Joint and marginal distributions, correlation and covariance (for discrete random variables only) (Sections 3.1 and 6.4)