This is an optional unit for the Statistics and Stochastic Processes major, but if you intend to pursue statistics seriously, you should take it. At last you'll get to learn about statistical techniques from the latter half of the 20th century. (Want to see 21st-century techniques? You'll need to do a Masters!) This is the first year that Owen Jones has taught the course - previously it was Guoqi Qian - and the content has changed completely, from covering a lot of material in little depth to covering two methods in glorious detail.
The first half of the course is on Generalised Linear Models. These have a silly name, because General Linear Models (note the missing "ised") also exist and are a different thing. Generalised Linear Models provide a theory which encompasses a whole bunch of different regression techniques: linear regression, binomial regression, Poisson regression, multinomial regression... The material is quite technical, with a lot of formulas and methods you need to know for dealing with different types of data.
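To give a flavour of the unifying idea, here's a minimal sketch in Python using the statsmodels library (this isn't from the course - the data below is made up for illustration). The same GLM machinery handles all the regressions listed above; you just swap the "family":

```
import numpy as np
import statsmodels.api as sm

# Made-up data: counts of some event against a single predictor.
rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=100)
y = rng.poisson(np.exp(0.5 + 1.2 * x))  # true model: log(mean) = 0.5 + 1.2x

# A GLM is specified by a distribution family and a link function.
# Poisson regression = Poisson family with a log link (the default here).
X = sm.add_constant(x)
result = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(result.summary())

# Swapping the family gives a different regression from the same theory:
# sm.families.Binomial() for logistic regression,
# sm.families.Gaussian() for ordinary linear regression.
```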
The second half of the course is on Bayesian statistics. I really enjoyed this part of the course, because it was totally unlike anything I'd seen before. Rather than treating the parameter values in your model as fixed and the data as subject to random errors, you treat the data as fixed and the parameter values as subject to random variation. This provides a completely different but mathematically elegant approach to statistics, which can be expressed intuitively as "I am updating my beliefs about the model based on observed data". The resulting equations are usually impossible to solve exactly, but we went into detail on the methods used to approximate them numerically (the Metropolis-Hastings algorithm and the Gibbs sampler). These numerical methods are quite general and allow you far more flexibility with your statistical models than traditional methods (which have assumptions like "this thing has a normal distribution" already baked into the formulas you use). It's a shame that nobody at Melbourne Uni does much Bayesian stats, because I would love to learn more about this.
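For the curious, the Metropolis-Hastings algorithm is surprisingly short when you write it down. Here's a minimal random-walk version in Python (my own sketch with a made-up target, not code from the course). The key trick is that you only ever need the posterior up to a normalising constant, because the acceptance step uses a ratio:

```
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: draw samples from a density known only
    up to a normalising constant, via its log (for numerical stability)."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()  # symmetric proposal
        # Accept with probability min(1, target(proposal) / target(x)).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x  # if rejected, the chain stays where it was
    return samples

# Made-up unnormalised log-posterior: an equal mixture of two normals.
log_post = lambda t: np.logaddexp(-0.5 * (t - 2)**2, -0.5 * (t + 2)**2)

draws = metropolis_hastings(log_post, x0=0.0, n_samples=50_000)
print(draws.mean(), draws.std())  # posterior summaries from the samples
```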
There were also a couple of interludes on computational methods, covering numerical optimisation and simulating random variables from different distributions. None of it is particularly tricky, but if you haven't done any computer programming at all it might catch you out. I slept through these lectures because I (thought I) had seen it all before. But don't think "oh, this computer stuff will never turn up in the exam" - we had questions asking us to describe an algorithm in words and to explain what a piece of code would do.
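As an example of the kind of method in question, here's inverse transform sampling, one standard way to simulate a random variable: if U is uniform on (0, 1) and F is a cumulative distribution function, then F⁻¹(U) has distribution F. A sketch in Python (my own illustration, not course code):

```
import numpy as np

# Inverse transform sampling for the exponential distribution:
#   F(x) = 1 - exp(-lam * x),  so  F^{-1}(u) = -log(1 - u) / lam.
def exponential_sample(lam, size, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=size)       # U ~ Uniform(0, 1)
    return -np.log(1 - u) / lam      # F^{-1}(U) ~ Exponential(lam)

draws = exponential_sample(lam=2.0, size=100_000)
print(draws.mean())  # should be close to 1 / lam = 0.5
```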
Lectures: Most of the subject is taught in the old-fashioned "copy down everything from the blackboard" way, which I like. It means that the course can't go any faster than the speed of handwriting, which matters when tricky mathematical proofs are involved. Unfortunately, Owen's handwriting is terrible. So are his jokes (at one point he tried to make a bilingual pun in Latin and English).
Computer labs: Not particularly exciting, but some of them introduce new material as well as providing a more practical, applied take on the course. Some of the exam questions bore a striking resemblance to problems from the labs.
Exam: You're allowed to bring in a page of handwritten notes, which means there's no memory work. The exam had a little bit of plugging things into equations but mostly tested conceptual understanding. So that's nice. Which is not to say that the exam was easy!