This is one of the third-year electives for a Statistics major. Completing this course along with the three core courses earns accreditation with the Statistical Society of Australia.
Bayesian inference stems from a probabilistic approach to inference - it falls straight out of Bayes' rule. In the classical frequentist approach, the parameters to be estimated are treated as fixed, but Bayesian approaches treat the parameter itself as a random variable, which opens up a lot more probabilistic machinery (credible intervals, hypothesis tests, the posterior expectation of the parameter, the predictive distribution, etc.).
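For what it's worth, the whole thing hangs off one identity. Writing θ for the parameter and x for the data (my notation, not necessarily the course's):

```latex
% Bayes' rule applied to the parameter: posterior is proportional to likelihood times prior.
p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{p(x)}
                 \propto p(x \mid \theta)\, p(\theta)

% The predictive distribution mentioned above is then just an average over the posterior:
p(x_{\text{new}} \mid x) = \int p(x_{\text{new}} \mid \theta)\, p(\theta \mid x)\, d\theta
```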
This course also introduced simulation techniques. Basic methods (inverse transform, accept/reject) were covered, but a lot of depth went into Markov chain Monte Carlo.
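To give a flavour of the MCMC side, here's a minimal random-walk Metropolis sketch in Python. It's my own toy example (normal likelihood with known variance, normal prior on the mean), not code from the course:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=50)   # made-up data for the toy example

def log_post(mu):
    # log-likelihood (sigma known = 1) plus log-prior N(0, 10^2), up to a constant
    return -0.5 * np.sum((data - mu) ** 2) - 0.5 * (mu / 10.0) ** 2

def metropolis(n_iter=10_000, step=0.5, start=0.0):
    samples = np.empty(n_iter)
    current, current_lp = start, log_post(start)
    for i in range(n_iter):
        proposal = current + rng.normal(scale=step)       # symmetric random-walk proposal
        prop_lp = log_post(proposal)
        if np.log(rng.uniform()) < prop_lp - current_lp:  # accept/reject step
            current, current_lp = proposal, prop_lp
        samples[i] = current
    return samples

draws = metropolis()
print(draws[2000:].mean())   # posterior mean estimate after discarding burn-in
```

The chain's stationary distribution is the posterior, so after burn-in the draws can be averaged like a sample from it.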
The computations in this course are quite interesting. On one hand, some of them are fairly straightforward thanks to the shortcuts you're introduced to in weeks 1 and 2. But at other times they get completely chaotic and it feels like a war trying to fight through all of it (cough, Bayes factors). Part of the course was recognising distributions, because that let you simplify nasty integrals (including multivariate ones).
Those tricks were so convenient though - they trivialised pretty much half of the computations you saw in this course.
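As one generic illustration of the kernel-recognition trick (my example, not necessarily one from the course): if you can spot a Gamma(a, b) kernel inside an integral, the integral comes for free, because the Gamma density integrates to 1:

```latex
% The Gamma(a, b) density is \frac{b^{a}}{\Gamma(a)} \theta^{a-1} e^{-b\theta} on (0, \infty),
% so its kernel integrates to the reciprocal of the normalising constant:
\int_0^{\infty} \theta^{a-1} e^{-b\theta} \, d\theta = \frac{\Gamma(a)}{b^{a}}, \qquad a, b > 0.
```

The same idea works with Beta, normal and other kernels, which is why recognising distributions saves you from doing any actual calculus.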
The simulation material was examined by having you do a few computations beforehand and then write pseudocode. For example, with standard rejection sampling you needed some high-school optimisation to find the optimal enveloping constant. Beyond that, you pretty much just had to adapt your distributions/values/etc. to the algorithm itself when writing out the pseudocode, and there was no strict style guide for it either.
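Here's roughly what that looks like in practice - a Python sketch with a target I picked myself (Beta(2, 2) under a Uniform(0, 1) envelope), where the "high school optimisation" step is just maximising 6x(1 − x) at x = 1/2 to get the enveloping constant M = 1.5:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 1.5   # optimal constant: max of 6x(1-x) over (0, 1), attained at x = 1/2

def target_pdf(x):
    return 6.0 * x * (1.0 - x)   # Beta(2, 2) density on (0, 1)

def rejection_sample(n):
    samples = []
    while len(samples) < n:
        y = rng.uniform()                  # draw from the envelope g = Uniform(0, 1), g(y) = 1
        u = rng.uniform()                  # uniform for the accept test
        if u <= target_pdf(y) / M:         # accept with probability f(y) / (M * g(y))
            samples.append(y)
    return np.array(samples)

draws = rejection_sample(10_000)
print(draws.mean())   # should be close to 0.5, the mean of Beta(2, 2)
```

The expected acceptance rate is 1/M (about 2/3 here), which is exactly why finding the tightest enveloping constant matters.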
Much like with combinatorics last sem, I found I actually liked this course despite its various difficult concepts. It helped that the tutorials/assignments/exam were all made fairer by the new lecturer (this used to be a 5/5 difficulty course). It was still pretty easy to get lost in the lectures though, because the lecture examples were much harder to grasp (a lot of multivariate computations).
You did need to know all the definitions, techniques and tricks the course teaches to do well in the exam - a bit of everything was asked.