This is your basic Introduction to Probability unit, and it can be separated into 5 topics, which I'm going to go over one by one:
1. Basics of Probability
Most of this chapter isn't really assessed (one question on both the mid-sem and the exam, and one or two questions on the first assignment), but it covers extremely important concepts. You go over Kolmogorov's axioms, which are the basis of probability theory, and learn some important notation along with De Morgan's laws. The most important parts of this section, though, are the law of total probability and Bayes' Law: LTP will come up again in future problems, and Bayes' Law is just important in its own right. Basic conditioning is also covered. This first section is very similar to what you'll have seen from your high school probability days.
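As a quick refresher (my notation, not necessarily the unit's): for events $A$ and $B$ with $P(B) > 0$, and a partition $B_1, \dots, B_n$ of the sample space,
\[
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}, \qquad
P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i).
\]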
2. Discrete Distributions
This chapter is all about common discrete distributions (from Methods, you may remember the binomial), including the Bernoulli, binomial, Poisson, geometric and discrete uniform distributions. You go over the PMFs of the distributions, what makes them valid PMFs (in reference to the axioms), and derive their means and variances through various methods. You also learn about basic concepts that apply to all distributions (such as moments, variance, etc.).
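To give a flavour of the sort of result you derive here (a standard fact, stated in my own notation): if $X \sim \mathrm{Bin}(n, p)$, then
\[
P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}, \quad k = 0, 1, \dots, n,
\qquad E[X] = np, \qquad \mathrm{Var}(X) = np(1-p).
\]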
3. Continuous Distributions
This chapter extends our knowledge into continuous distributions. You cover much more in this case, including continuous uniform, exponential, gamma (including the gamma function) and normal. Once again, you learn about the PDF and what makes them valid PDFs, and look at their CDFs. After this, you extend your knowledge into joint distributions (also applying this knowledge to discrete distributions), and look at conditional distributions.
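Again as an example (standard, not specific to the unit's notation): the exponential distribution with rate $\lambda > 0$ has
\[
f(x) = \lambda e^{-\lambda x}, \qquad
F(x) = \int_0^x \lambda e^{-\lambda t}\,dt = 1 - e^{-\lambda x}, \qquad x \ge 0.
\]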
4. Further Topics
This is where you no longer expand on year 12 knowledge, and instead learn all new material. Important topics include transformations of distributions (for example, finding the distribution of $Y = g(X)$ from the distribution of $X$), and combining distributions (for example, finding the distribution of a sum $X + Y$ of independent random variables). The latter you look at through the use of Moment Generating Functions (MGFs) and by convolution. You also look at the application of MGFs to find any moment of a distribution (the $n$th moment of $X$, $E[X^n]$, is the $n$th derivative of $M_X(t)$ evaluated at $t = 0$). Conditioning is also explored in more depth, including a proper look at the law of total expectation and the law of total variance (which are introduced in topics 2 and 3). You also explore the idea of the bivariate normal distribution, and then extend this to the multivariate normal distribution.
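To spell a couple of those out (standard statements, in my own notation): for independent $X$ and $Y$, and whenever the relevant expectations exist,
\[
M_{X+Y}(t) = M_X(t)\,M_Y(t), \qquad
E[X] = E\big[E[X \mid Y]\big], \qquad
\mathrm{Var}(X) = E\big[\mathrm{Var}(X \mid Y)\big] + \mathrm{Var}\big(E[X \mid Y]\big).
\]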
5. Limit Theorems
This part of the course is very analytical; if you've done Real Analysis, you may find it helpful. However, having only done 1030 before this (I did do 1035, but didn't understand most of the extra content...), I was able to understand the material fine. There are 5 results you work with, each of which is proven. Very little of this is assessed, just one question on the exam. They are:
- Markov's Inequality
- Chebyshev's Inequality
- The Weak Law of Large Numbers
- Central Limit Theorem
- The Strong Law of Large Numbers
You also go over the idea of convergence in probability compared to "true" (almost sure) convergence. Personally, I found this chapter very interesting, and the best part of the unit.
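For reference, the first two inequalities in the list above are quick to state (standard forms, not necessarily the exact versions from lectures): for a non-negative random variable $X$ and any $a > 0$, and for any random variable $Y$ with mean $\mu$ and variance $\sigma^2$,
\[
P(X \ge a) \le \frac{E[X]}{a}, \qquad
P(|Y - \mu| \ge a) \le \frac{\sigma^2}{a^2}.
\]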
Chapters 1-3 seem very fiddly and tedious; however, the assignments are super interesting, which made up for it. Chapters 4 and 5 were easily the most interesting, though it makes sense that they are left to the end.
There are also a LOT of parallels between this unit and the first 4 weeks of MTH2232 - in fact, everything in the first 3 topics (and some of the fourth and fifth) will be seen in 2232 before you get to it in 2222. So, if you're considering a stats major/minor, I highly suggest doing them together.
The overall difficulty isn't that high; however, the unit as a whole is not as interesting as MTH2232. If you're looking for a fun unit, do MTH2232; but if you want to go on to further stats units, do this one for obvious reasons. If you have to put either MTH2232 or MTH2222 off for a year, do this one first. It's not as fun, but it'll be much easier for you when you get to MTH2232 having seen all this before.