Probability spaces: Here we introduce the tools required to quantify a random experiment, such as different types of sample spaces, indicator functions, σ-algebras, and events. A few items on this list will be familiar to students of Probability, though most are new. We then rigorously introduce the function P(⋅), which we call a probability, and discuss its elementary and more advanced properties, such as monotonicity, continuity, and the Borel-Cantelli Lemma.
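To give you a rough idea (this is the standard definition, sketched a little loosely, not necessarily word-for-word how the course states it): a probability is a function P on a σ-algebra F satisfying

$$P(A) \ge 0, \qquad P(\Omega) = 1, \qquad P\Big(\bigcup_{n=1}^{\infty} A_n\Big) = \sum_{n=1}^{\infty} P(A_n) \ \text{ for pairwise disjoint } A_n \in \mathcal{F},$$

and properties like monotonicity and continuity are things you get to prove as consequences of these three axioms.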
Probabilities on R: Now that we know what a probability is, we can talk about a specific probability, one defined on R (it's actually defined on B(R), but I'm guessing you probably don't care, yet, right?).
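In case you do care, here's the punchline (in the standard treatment, anyway): a probability on B(R) is completely pinned down by its cumulative distribution function

$$F(x) = P\big((-\infty, x]\big),$$

and conversely, every non-decreasing, right-continuous F with the right limits at ±∞ determines a unique probability on B(R).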
Random Variables/Vectors: Arguably, one could say that probability is the study of random variables (RVs) and random vectors (RVecs). This is why we introduce them here, in a rigorous fashion of course, as they are what this subject revolves around. We will look at different properties of RVs and RVecs (which are just the higher-dimensional version of RVs). Once done, we can introduce the concept of independence between events. Now, this is probably the right time to introduce order statistics, as this subject, after all, is Probability For Inference (meaning statistics will be built on the probability foundation we've laid), and this is where statistics start to enter the game.
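If you're wondering what "rigorous" means here: in the standard measure-theoretic setup, a random variable is just a measurable function,

$$X : \Omega \to \mathbb{R} \quad \text{with} \quad X^{-1}(B) \in \mathcal{F} \ \text{ for every } B \in \mathcal{B}(\mathbb{R}),$$

and a random vector is the same thing with values in Rⁿ.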
Expectation: This is the first tool for playing around with RVs (and RVecs, of course).
Of course, it is just a repetition of expectation from second-year probability, and so students should probably just not worry about it and chill. Hey, maybe it's the perfect time to catch up on other subjects, is what I would like to say. Except that…
SORRY!! DO NOT MAKE THIS MISTAKE. What you've been introduced to in Probability are just computational formulas, not the definition of expectation. Here, we rigorously define expectation and discuss its properties and applications.
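To give a flavour of what "rigorously define" means (this is the standard construction, sketched loosely): expectation is the Lebesgue integral of X with respect to P, built up from simple RVs,

$$\mathbb{E}[X] = \int_{\Omega} X \, dP, \qquad \text{starting from} \quad \mathbb{E}\Big[\sum_{i=1}^{n} a_i \mathbf{1}_{A_i}\Big] = \sum_{i=1}^{n} a_i P(A_i),$$

then extended to non-negative and general X by taking limits. The familiar Σ x p(x) and ∫ x f(x) dx formulas drop out as special cases.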
Conditional Expectation: Similar to expectation, one should be very careful, as this is a very different animal compared to the one introduced in Probability. In fact, the problems you see here in PFI regarding conditional expectation are completely different from those in Probability.
Oh, and conditional expectation has a nice geometric interpretation as well, btw.
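A sneak peek (again, the standard definition, stated loosely): E[X | G] is defined not by a formula but by two requirements, that it is G-measurable and that

$$\mathbb{E}\big[\mathbb{E}[X \mid \mathcal{G}] \, \mathbf{1}_A\big] = \mathbb{E}[X \, \mathbf{1}_A] \quad \text{for every } A \in \mathcal{G}.$$

And the geometric interpretation: for square-integrable X, E[X | G] is the orthogonal projection of X onto the subspace of G-measurable RVs in L².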
Some applications to Statistics: Now that we've got all the tools we need, it is time to apply them, and application to statistics is our first stop. Topics from MAST20005 Statistics like Sufficient Statistics, the Neyman-Fisher Factorisation, and Maximum Likelihood Estimators will be introduced (don't worry, you do not need to have done MAST20005 Statistics beforehand). Here though, we will prove them, instead of just focusing on the computational side of things. Other topics introduced include estimator bias, Efficient Estimators and their uniqueness, and the Rao-Blackwell Theorem.
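To pick one example from that list (stating the theorem as it is usually given): the Neyman-Fisher Factorisation says a statistic T is sufficient for θ exactly when the joint density/pmf factors as

$$f(x; \theta) = g\big(T(x); \theta\big) \, h(x),$$

where h does not depend on θ. In PFI you actually prove this, rather than just using it to spot sufficient statistics.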
Convergence of random variables: To those of you who did not enjoy MAST20026 Real Analysis (it is a prerequisite), this topic can be a nightmare. This is probably the most analysis-heavy part of the course. Meaning, we analyse RVs as functions, just like how we did with regular functions in MAST20026 Real Analysis. Basically, one can have a sequence of RVs which converges in different ways to something as the sequence goes on forever. This is, however, just a tool. The main focus is the applications of these, namely the Laws of Large Numbers (yes, plural: weak and strong) and the Law of Small Numbers.
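For the curious, the main modes of convergence you will be juggling (standard definitions) are, for a sequence X₁, X₂, … and a limit X:

$$X_n \xrightarrow{\text{a.s.}} X \iff P\big(\lim_{n\to\infty} X_n = X\big) = 1, \qquad X_n \xrightarrow{P} X \iff P\big(|X_n - X| > \varepsilon\big) \to 0 \ \text{ for every } \varepsilon > 0,$$

$$X_n \xrightarrow{d} X \iff F_{X_n}(x) \to F_X(x) \ \text{ at every continuity point } x \text{ of } F_X,$$

and a good chunk of the topic is working out which of these implies which.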
Characteristic functions: If you enjoy computations then this topic is for you. Oh, but do not forget the rigour in your computations 😊 (whatever that means). Here we are introduced to the powerful tool of Characteristic Functions (ChFs), which are the older brother of Moment Generating Functions (MGFs) and Probability Generating Functions (PGFs). These ChF guys are guaranteed to exist (unlike MGFs) and work even if the RV is not discrete (unlike PGFs). The purpose of introducing these guys is to aid us in proving convergence of RVs, as RVs are very much married to their ChFs. They go hand-in-hand together, almost.
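Concretely (standard definition): the ChF of X is

$$\varphi_X(t) = \mathbb{E}\big[e^{itX}\big], \qquad t \in \mathbb{R},$$

which always exists because |e^{itX}| = 1. The "married" part is Lévy's Continuity Theorem: pointwise convergence of ChFs (to a limit continuous at 0) is equivalent to convergence in distribution of the corresponding RVs.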
Further applications to Statistics: Finally, we revisit the MLE, introduce the new concept of the Empirical Distribution Function (EDF), and discuss its properties. Not much to say here other than the fact that the last few slides aren't examinable.
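For completeness (standard definition): the EDF of a sample X₁, …, Xₙ is

$$\hat{F}_n(x) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{X_i \le x\},$$

and its headline property, the Glivenko-Cantelli Theorem, says that

$$\sup_{x \in \mathbb{R}} \big|\hat{F}_n(x) - F(x)\big| \xrightarrow{\text{a.s.}} 0.$$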