University Subjects

MAST30020: Probability and Statistical Inference

University
University of Melbourne
Subject Link
View Subject

Subject Reviews

huy8668

5 years ago

An Intuitive Feel For What The Subject Is About
Probability for Inference (PFI) can be thought of as a more (surprise surprise) rigorous and 'purer' (in a mathematical sense) version of MAST20004 Probability. Though both are introductory courses to probability, second year Probability focuses more on the computational and 'applied' side, while in third year PFI the topics are rigorously constructed from the ground up. Proofs, rather than computations, are the main focus of the subject. As a result, students often find PFI much more difficult due to the rigor that they are not used to seeing in a seemingly very 'applied' maths course. In terms of topics covered, some topics from Probability like Moment Generating Functions will not be covered in PFI, and conversely, PFI introduces some new topics such as Characteristic Functions. Of course, there are still overlapping topics as they are both introductory courses, just taught from different perspectives, but do not be fooled into thinking that you won't have to spend a good amount of effort on the topics you've already done in Probability, as the newly introduced rigor will really catch students off-guard on these seemingly elementary topics.
Assessment
30% Assignments and 70% Exam
Assignments And Tutorials
There are 10 weekly assignments in addition to the weekly tutorials. Regarding tutorials, some of the questions require experience to tackle from scratch while others are more manageable. Unlike in most other maths subjects, tutors go through all the questions during tutorials from start to finish.

Regarding assignments, each assignment consists of a couple of questions, which were not too lengthy. Together, the assignments account for 30% of the subject marks. Opinions on the difficulty were mixed: some students found them rather arduous to work through while others found them ok. Depending on your style, you may wish to tackle the assignments alone or in a group. Together with each assignment, you also need to complete a summary sheet, in which you summarise the weekly content of the lectures.

In saying this, I acknowledge that there is a popular opinion that the sheer volume of the assignments is too much work. There are just too many assignments to complete, and some even say that one assignment in Probability (of which there were only about 4 for the entire semester) did not take as long to complete as one assignment in PFI. I personally find this opinion to be almost always true, though in Probability we also had to attempt problems from the booklet, and there is no booklet in PFI, so it evens out. I think it is just Kostya's way of making sure that students work on the material regularly. Now, do these 10 weekly assignments really prove to be a huge workload for students? I honestly find that this is not the case. Tutorials are provided with solutions, so it is only a couple of questions per week that we have to complete. Most of the work comes from understanding the lecture material, I reckon, not the assignment questions.
Comments

This review is aimed towards those who have completed MAST20004 Probability or MAST20006 Probability for Statistics. I will try to inform readers as much as possible regarding the content of the subject while attempting to keep it relatable (I reckon just bombarding you with abstract and possibly unfamiliar terms is not so helpful). After all, this is an opinion piece as well as an objective review, so my opinion and experience in the subject will be ubiquitous throughout the review, too.

You will find that the words 'rigor' or 'rigorous' are used very often, partly to signify how the author simply has poor vocabulary and knows no better word. More importantly though, it is to emphasise the fact that Probability for Inference is much more rigorous than Probability and really should not be taken lightly as a 'repetition of Probability but a bit harder'. Please also note that the word 'probability' is used in several different ways: Probability (capitalised) refers to the subject MAST20004 Probability, while probability can also refer to a field of mathematics or to a function. It should be obvious to readers what the author means though (hopefully).
Exam
The exam is quite a typical pure maths exam, with lots of proof questions and some computation questions. I hate to say this, but one cannot really judge the difficulty of the exam because it really depends on the amount of effort you've put in during the semester. All the topics are examinable, except for maybe the last 5 – 10 slides. The first three questions follow a certain format, and the questions become less predictable from there onwards. It is a lengthy exam and, of course, you need good speed and accuracy in order to finish it without making too many silly mistakes.
One could find the exam quite fair if one spent quite a bit of time studying the material, while others may find it extremely difficult if they could not give a fair share of their time to the subject. Long story short, the harder and smarter you study, the better you do. The question is, what is studying smart? I've been trying to answer this question for a very long time now, and to avoid making this review too long, the short generic answer I can personally give is: do not just grind past exams for revision. Revise the lecture notes, tutorials and assignments instead. Exams should be employed, but only as a 'sharpening tool' and not as a replacement for the knife-making machine: lecture notes, tutorials, assignments. In addition, please do not make the mistake of trying to predict the exam. I'd love to go on but this is digressing. Please shoot me a pm should you like to discuss these studying techniques. I'm very interested!
Final Thoughts
All I can really say is that PFI is a very similar animal to Probability and yet, incredibly different. One could say that PFI is much more difficult, but it is best left for the current students of the subject to judge it for themselves. Like most subjects, if you put in the time and have good studying strategies, you'll find the subject ok. On the other hand, if you do not have a fairly strong maths background or good studying strategies, for example, you'll find that this is a nightmare. To do well in PFI, you'd need to put in a lot of work, but this is also the case for Probability. Likewise, it is not too difficult to score above a 70 in PFI either, provided that you put in an honest effort. Of course, one might initially find PFI seemingly more difficult due to the rigor but, like most things, one eventually gets the hang of it and things become much more manageable.
Personally, I put quite a bit of effort into this subject and, in hindsight, I found everything to be quite fair. Frankly, I knew the material (lecture slides, tutorials, assignments) quite well. However, it took me the first few weeks to get the hang of everything, which made my first few assignments suffer quite a bit. Thankfully, things eventually clicked and I worked even more diligently as the semester progressed, putting me at 27/30. On the exam, although frankly the questions were doable given enough time, my lack of exam experience did not allow me to complete them all within the time constraint, giving me a final grade of 88/100.
Lectopia Enabled
Yes, with screen capture etc.
Lecturer(s)
Konstantin (Kostya) Borovkov
Lectures And Lecturer
The lecturer, Kostya, has written his own set of slides for the entire course, which he makes available at the beginning of the semester. This means that we have access to the whole course's lecture material from the start. The slides are quite informative and really, almost all of what you need to know is on there.
The lectures follow a conventional format of the lecturer going through and discussing his slides. Recordings were fortunately available and I made extensive use of them.
Personally, I find that Kostya is quite a humorous and knowledgeable lecturer, and he stands out from other lecturers thanks to the humour he brings to the lectures. I also quite enjoy his philosophy on studying, which I was lucky enough to find out about through our conversations in his consultations. Both he and Ai Hua enjoy sharing their philosophies with students. Generally, they're pretty cool people to be around, especially for students.
Past Exams Available
Yes, 2 past exams with solutions. We were given the 2012 and 2013 exams, while the lecturer went through the 2017 exam with us in class.
Rating
5 Out of 5
Some Information On The Topics Covered In PFI
Probability spaces: Here we introduce the tools required to quantify a random experiment, such as different types of sample spaces, indicator functions, σ-algebras, and events. A few items in this list are familiar to students of Probability, though most are new. We then rigorously introduce the function P(·), which we call a 'probability', and discuss its elementary and more advanced properties, like continuity, monotonicity, and the Borel–Cantelli Lemma.
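Just to give a concrete picture (this is my own sketch of the standard Kolmogorov setup, not lifted from the slides), a probability space is a triple (Ω, F, P) where P is a countably additive set function on the σ-algebra F:
\[
P(\Omega) = 1, \qquad P(A) \ge 0 \ \text{for all } A \in \mathcal{F}, \qquad
P\Big(\bigcup_{n=1}^{\infty} A_n\Big) = \sum_{n=1}^{\infty} P(A_n) \ \text{for pairwise disjoint } A_1, A_2, \ldots \in \mathcal{F}.
\]
The (first) Borel–Cantelli Lemma is then the statement that if \(\sum_{n} P(A_n) < \infty\), then with probability 1 only finitely many of the events \(A_n\) occur.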
Probabilities on R: Now that we know what the function 'probability' is, we can talk about a specific kind of probability, one defined on R (it's actually defined on the Borel σ-algebra B(R), but I'm guessing you probably don't care, yet, right?).
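Concretely (a standard fact, my own wording), such a probability on (R, B(R)) is completely described by its distribution function:
\[
F(x) = P\big((-\infty, x]\big), \qquad F \ \text{non-decreasing and right-continuous}, \qquad \lim_{x\to-\infty} F(x) = 0, \qquad \lim_{x\to+\infty} F(x) = 1.
\]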
Random Variables/Vectors: Arguably, one could say that probability is the study of random variables (RVs) and random vectors (RVecs). This is why we introduce them here, in a rigorous fashion of course, as they are what this subject revolves around. We will look at different properties of RVs and RVecs (which are just the higher-dimensional version of RVs). Once done, we can introduce the concept of independence between events. Now, this is probably also the right time to introduce order statistics in this subject. After all, this is Probability for Inference (meaning statistics will be built on the probability foundation we've laid) and this is where statistics start to enter the game.
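If it helps to see what 'rigorous' means here (standard definitions, not quoted from the slides), a random variable is simply a measurable function from the sample space to R, and independence of events is defined through the product rule:
\[
X : \Omega \to \mathbb{R} \ \text{is an RV if} \ \{\omega \in \Omega : X(\omega) \le x\} \in \mathcal{F} \ \text{for every } x \in \mathbb{R}, \qquad
A, B \ \text{are independent if} \ P(A \cap B) = P(A)\,P(B).
\]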
Expectation: This is the first tool for playing around with RV’s (and Rvec’s of course).
‘Of course, it is just a repetition of expectation in second year Probability, so students should probably just not worry about it and chill. Hey, maybe it’s the perfect time to catch up on other subjects’ – is what I would like to say. Except that… SORRY!! DO NOT MAKE THIS MISTAKE. What you’ve been introduced to in Probability are just computational formulas, not the definition of expectation. Here, we rigorously define expectation and discuss its properties and applications.
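To give a taste of what 'the definition rather than the computational formula' looks like (my own sketch of the standard Lebesgue-integral construction, which the second review below also alludes to):
\[
E[X] = \int_{\Omega} X \, \mathrm{d}P, \qquad \text{defined first for simple RVs } X = \sum_{k=1}^{n} c_k \mathbf{1}_{A_k} \ \text{by} \ E[X] = \sum_{k=1}^{n} c_k P(A_k),
\]
then extended to non-negative and general RVs by taking limits. The familiar formulas \(\sum_x x\,p(x)\) and \(\int x f(x)\,\mathrm{d}x\) from second year drop out as special cases.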
Conditional Expectation: Similar to expectation, one should be very careful as this is a very different animal compared to the one introduced in Probability. In fact, the problems you see here in PFI regarding conditional expectation are completely different from those in Probability.
Oh and conditional expectation has a nice geometric interpretation as well, btw.
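For the curious, that geometric interpretation is (roughly, in my own words, and only a sketch) that for square-integrable X, the conditional expectation E[X | G] given a σ-algebra G is the orthogonal projection of X onto the space of G-measurable RVs, i.e. the G-measurable RV closest to X in mean square:
\[
E\big[(X - E[X \mid \mathcal{G}])^2\big] \le E\big[(X - Z)^2\big] \quad \text{for every } \mathcal{G}\text{-measurable } Z \text{ with } E[Z^2] < \infty.
\]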
Some applications to Statistics: Now that we’ve got all the tools we need, it is time to apply them, and application to statistics is our first stop. Topics from MAST20005 Statistics like Sufficient Statistics, the Neyman–Fisher Factorisation and Maximum Likelihood Estimators will be introduced (don’t worry, you do not need to have done MAST20005 Statistics beforehand). Here though, we will prove them, instead of just focusing on the computational side of things. Other topics introduced include the bias of estimators, efficient estimators and their uniqueness, and the Rao–Blackwell Theorem.
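As a small taster of the flavour (standard statements, my own wording, not quoted from the slides): the Neyman–Fisher Factorisation criterion says that a statistic T(X) is sufficient for a parameter θ exactly when the likelihood factorises as
\[
f(x; \theta) = g\big(T(x), \theta\big)\, h(x),
\]
and the Rao–Blackwell Theorem says that replacing an unbiased estimator by its conditional expectation given a sufficient statistic never increases the variance.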
Convergence of random variables: To those of you who did not enjoy MAST20026 Real Analysis (it is a prerequisite), this topic can be a nightmare. This is probably the most ‘analysis’ part of the course. Meaning, we analyse RVs as functions, just like how we did with ‘regular’ functions in MAST20026 Real Analysis. Basically, one can have a sequence of RVs which converges in different ways to something as the sequence goes on forever. This is, however, just a tool. The main focus is the applications of these, namely the Laws of Large Numbers (weak and strong) and the Law of Small Numbers.
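For the curious, the main modes of convergence (standard definitions, my own summary) are almost sure convergence, convergence in probability and convergence in distribution:
\[
X_n \xrightarrow{\text{a.s.}} X \iff P\big(X_n \to X\big) = 1,
\]
\[
X_n \xrightarrow{P} X \iff P\big(|X_n - X| > \varepsilon\big) \to 0 \ \text{for every } \varepsilon > 0,
\]
\[
X_n \xrightarrow{d} X \iff F_{X_n}(x) \to F_X(x) \ \text{at every continuity point of } F_X,
\]
and the Laws of Large Numbers then say that the sample mean of i.i.d. RVs with finite mean converges to that mean (in probability for the weak law, almost surely for the strong law).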
Characteristic functions: If you enjoy computations then this topic is for you. Oh, but do not forget the rigor in your computations 😊 (whatever that means). Here we are introduced to the powerful tool of Characteristic Functions (ChFs), which are the older brother of Moment Generating Functions (MGFs) and Probability Generating Functions (PGFs). These ChF guys are guaranteed to exist (unlike MGFs) and work even if the RV is not discrete (unlike PGFs). The purpose of introducing these guys is to aid us in proving convergence of RVs, as RVs are very much married to their ChFs. They go hand-in-hand together, almost.
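Concretely (standard definition and facts, not quoted from the slides), the characteristic function of an RV X is
\[
\varphi_X(t) = E\big[e^{itX}\big], \qquad t \in \mathbb{R},
\]
which always exists since \(|e^{itX}| = 1\). It determines the distribution uniquely, and the continuity theorem links pointwise convergence of ChFs to convergence in distribution, which is why they are such a handy tool for the previous topic.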
Further applications to Statistics: Finally, we revisit the MLE, introduce the new concept of the Empirical Distribution Function (EDF) and discuss its properties. Not much to say here other than the fact that the last few slides aren't examinable.
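For reference (standard definition, my own wording), the EDF of a sample X_1, …, X_n is
\[
\hat{F}_n(x) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{X_i \le x\},
\]
i.e. for each x it is simply the proportion of observations not exceeding x.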
Textbook Recommendation
Alan Karr - Probability. It is ok; it gives a different perspective and aids with independent learning.
Workload
3 lectures, 1 tutorial and 1 assignment per week (10 assignments in total). Assignments consist of problems to complete and a summary sheet to write up.
Year & Semester Of Completion
2019 Semester 1
Your Mark / Grade
Wow wow hold your horses man, we just met? Jk jk, marks are discussed later, please keep reading!

cameronp

10 years ago

Assessment
10 weekly assignments (20% total), one three-hour exam (worth 80%).
Comments
This is the toughest and most "pure" of the third year statistics subjects. The first half of the course is abstract and rigorous, building up probability theory from the basics of set theory and defining expectation of random variables using a new type of integral (the Lebesgue integral from measure theory). The second half of the course uses these formal definitions to prove a number of important results in statistics, as well as introducing new tools like characteristic functions that make certain calculations and proofs a lot easier.

Prof. Borovkov is an excellent lecturer and if you don't attend the lectures and workshops, you'll miss out on a lot of the benefit of the course. The PDF slides cover the technical details of the material but the lectures provide the intuition and mental pictures that will help you solve the homework problems. A typical lecture may only go through 6 slides as every point is explained in detail with proofs, examples or pictures drawn on the blackboard.

To do well in the weekly assignments will take a lot of time and effort, well out of proportion to the marks they're worth - but it will help you when it comes to the exam.

Highly recommended for pure mathematicians wanting to get a bit of stats in their diet, statisticians with a theoretical bent, or anyone intending to pursue a Masters degree in probability and statistics.
Lectopia Enabled
No, although lecture slides for the whole course in PDF form are made available at the start of semester.
Lecturer(s)
Prof. Kostya Borovkov
Past Exams Available
Yes, with solutions for some of them.
Rating
5/5
Textbook Recommendation
The recommended textbook can be downloaded from the University website. I wouldn't bother purchasing a paper copy; the lecturer's slides are usually clearer and cover more than the textbook(!)
Workload
3x 1-hour lectures, 1x 1-hour workshop.
Year & Semester Of Completion
2014, Semester 1
Your Mark / Grade
H1
