The field of high-dimensional statistics deals with data whose effective dimension (that is, the number of parameters to be estimated) is much larger than the dimensions considered in classical multivariate analysis. Moreover, the number of parameters may exceed the available sample size, rendering classical low-dimensional techniques unusable in such settings: they cannot produce consistent estimators when the sample size is small relative to the number of parameters.
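
To make this breakdown concrete, the following minimal sketch (illustrative only; it uses NumPy and scikit-learn's `Lasso`, and the dimensions and penalty level are arbitrary choices, not taken from the course) contrasts ordinary least squares, whose normal equations become singular once the number of parameters exceeds the sample size, with a regularized estimator that remains well defined.

```python
# Minimal sketch: with p > n the Gram matrix X^T X is singular, so OLS has
# no unique solution, while a regularized (lasso) estimator is well defined.
# Dimensions and penalty are illustrative assumptions, not course settings.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 50, 200                      # sample size much smaller than dimension
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0                      # sparse true coefficient vector
y = X @ beta + 0.5 * rng.standard_normal(n)

# Classical OLS: the p x p Gram matrix has rank at most n < p, hence the
# normal equations X^T X b = X^T y admit infinitely many solutions.
gram = X.T @ X
print("rank(X^T X) =", np.linalg.matrix_rank(gram), "out of", p)

# Regularized estimation: the lasso penalty makes the problem well posed
# and (approximately) recovers the small set of nonzero coefficients.
lasso = Lasso(alpha=0.1).fit(X, y)
print("nonzero lasso coefficients:", np.count_nonzero(lasso.coef_))
```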

In this course we introduce the theory of regularized methods in high-dimensional statistics. We first address estimation and prediction problems, then move to feature selection in high dimensions, and then concentrate on tuning-parameter selection procedures. We next present the challenges of statistical inference in this framework. With this background in place, we tackle the high-dimensional graphical modeling framework.