Objectives:

The goals of the course are:
- To present the basic principles and techniques in Bayesian statistics.
- To show that problems tackled in an ad hoc way in a frequentist setting can be solved systematically in a Bayesian framework.
- To understand and be able to use Monte Carlo algorithms to sample from a joint posterior distribution.
- To show how problems that are difficult to tackle in a frequentist setting can be solved in a Bayesian framework.

Course content:

This course is an introduction to Bayesian statistics. After subjective probability has been defined, the basic principles underlying Bayesian inference are presented through the estimation of a proportion. The same principles are then used to compare proportions and rates. The estimation of the mean (respectively, the variance) of a normal distribution is also studied when the variance (respectively, the mean) is unknown.
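For instance, with a Beta(1, 1) prior and hypothetical data of 12 successes out of 40 trials (both choices purely illustrative), the conjugate update for a proportion takes only a few lines of R:

    # Conjugate Beta-Binomial update for a proportion theta (illustrative prior and data)
    a <- 1; b <- 1        # Beta(1, 1) prior, i.e. uniform on [0, 1]
    y <- 12; n <- 40      # hypothetical data: y successes out of n trials
    a_post <- a + y       # posterior is Beta(a + y, b + n - y) by conjugacy
    b_post <- b + n - y
    a_post / (a_post + b_post)                  # posterior mean of theta
    qbeta(c(0.025, 0.975), a_post, b_post)      # 95% equal-tailed credible interval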
Inference in multiparameter models is also addressed. The concepts of marginal and conditional posterior distributions, credible regions and predictive distributions are defined. These concepts are first illustrated with the joint estimation of the mean and the variance of a normal distribution. The comparison of two normal means with known or unknown variance(s) is then considered. When the variances cannot be assumed equal, a solution is obtained by simulating a random sample from the joint posterior distribution. The multiple regression model and the one-way ANOVA (ANOVA I) model are also studied in a Bayesian framework.
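As a sketch of the unequal-variances case, assuming the usual noninformative prior p(mu, sigma^2) proportional to 1/sigma^2 in each group (an illustrative choice, not necessarily the prior used in the course) and simulated data standing in for real observations, draws of the difference of means can be obtained in R as follows:

    # Comparison of two normal means with unknown, possibly unequal variances:
    # simulate from each group's joint posterior and look at the difference of means.
    set.seed(1)
    y1 <- rnorm(15, mean = 5.0, sd = 1.0)   # hypothetical data, group 1
    y2 <- rnorm(20, mean = 5.8, sd = 2.0)   # hypothetical data, group 2
    sim_mu <- function(y, nsim) {
      n <- length(y)
      # sigma^2 | y ~ (n - 1) s^2 / chi^2_{n-1};  mu | sigma^2, y ~ N(ybar, sigma^2 / n)
      sigma2 <- (n - 1) * var(y) / rchisq(nsim, df = n - 1)
      rnorm(nsim, mean = mean(y), sd = sqrt(sigma2 / n))
    }
    delta <- sim_mu(y1, 10000) - sim_mu(y2, 10000)   # draws of mu1 - mu2
    quantile(delta, c(0.025, 0.975))                 # 95% credible interval
    mean(delta < 0)                                  # posterior probability that mu1 < mu2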
The basic algorithms for generating a random sample from the posterior distribution are presented, as they are fundamental for inference in complex models.
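One such algorithm, chosen here only as an example since no particular method is singled out above, is the random-walk Metropolis sampler; the R sketch below targets the posterior of the log-odds of a proportion under a flat prior on the log-odds, using the same hypothetical data as before:

    # Random-walk Metropolis sampler for eta = log-odds of a proportion (flat prior on eta)
    set.seed(2)
    y <- 12; n <- 40                                            # hypothetical data
    log_post <- function(eta) y * eta - n * log(1 + exp(eta))   # log posterior up to a constant
    n_iter <- 5000
    eta <- numeric(n_iter)
    for (t in 2:n_iter) {
      prop <- eta[t - 1] + rnorm(1, sd = 0.5)                   # random-walk proposal
      accept <- log(runif(1)) < log_post(prop) - log_post(eta[t - 1])
      eta[t] <- if (accept) prop else eta[t - 1]
    }
    theta_draws <- plogis(eta[-(1:1000)])   # back to the proportion scale, burn-in discarded
    quantile(theta_draws, c(0.025, 0.975))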

Prerequisite:

It is assumed that students have basic training in probability, in statistical inference and in the use of the statistical software R.

Planned activities:

Practicals will be organized to illustrate the concepts and techniques studied in the theoretical course. Some exercises will require the use of R and possibly of more specialized software such as JAGS or WinBUGS.
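For example, the proportion model sketched earlier could be fitted with JAGS through the rjags package (assuming both are installed); the model code and data below are illustrative only:

    # Illustrative JAGS analysis of the Beta-Binomial model via the rjags package
    library(rjags)
    model_string <- "
    model {
      y ~ dbin(theta, n)       # binomial likelihood
      theta ~ dbeta(1, 1)      # uniform prior on the proportion
    }"
    m <- jags.model(textConnection(model_string),
                    data = list(y = 12, n = 40), n.chains = 2, quiet = TRUE)
    samples <- coda.samples(m, variable.names = "theta", n.iter = 5000)
    summary(samples)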