The general objective of this course is to introduce relevant tools for the analysis and processing of high-dimensional or large-scale data, such as "sketching" approaches, random projections, randomized principal component analysis, the Nyström approximation, and high-dimensional linear algebra methods. These tools will also be integrated into suitable optimization schemes, including regularized methods with sparse and low-rank models (for instance, when solving inverse problems), derivative-free iterative schemes, and optimization on differentiable manifolds.
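As a flavor of the kind of tool listed above, here is a minimal, purely illustrative sketch (not course material) of a randomized low-rank approximation built from a Gaussian random projection, in the spirit of randomized PCA / randomized SVD. All sizes and variable names are hypothetical, and only NumPy is assumed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data matrix: n samples in dimension d, approximately rank k.
n, d, k = 5000, 300, 10
A = rng.standard_normal((n, k)) @ rng.standard_normal((k, d))  # low-rank part
A += 0.01 * rng.standard_normal((n, d))                        # small noise

# Randomized range finder: sketch A with a Gaussian test matrix,
# then orthonormalize to capture (approximately) its dominant subspace.
p = 5                                   # oversampling parameter
Omega = rng.standard_normal((d, k + p))
Y = A @ Omega                           # n x (k+p) sketch of the range of A
Q, _ = np.linalg.qr(Y)                  # orthonormal basis of the sketch

# Project onto the small subspace and run an exact SVD there.
B = Q.T @ A                             # (k+p) x d reduced matrix
U_tilde, s, Vt = np.linalg.svd(B, full_matrices=False)
U = Q @ U_tilde                         # approximate left singular vectors of A

# Compare the approximate and exact leading singular values.
s_exact = np.linalg.svd(A, compute_uv=False)[:k]
print("max relative error on top-k singular values:",
      np.max(np.abs(s[:k] - s_exact) / s_exact))
```

The point of the sketch is that the expensive SVD is only ever applied to the small (k+p) x d matrix, which is the typical gain sought by such randomized methods.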
The course will also delve into specific algorithms, such as stochastic gradient descent and the Adam method, as well as some key theoretical elements of deep learning (such as the expressiveness of neural networks), thereby avoiding a "black box" view of these approaches.
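As a purely illustrative aside (not a statement of the course's actual exercises), the following compact sketch compares plain mini-batch stochastic gradient descent with Adam on a toy least-squares problem; all names, sizes, and hyperparameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy least-squares problem: minimise (1/m) ||X_b w - y_b||^2 over mini-batches.
n, d = 1000, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

def sgd(lr=0.01, epochs=50, batch=32):
    w = np.zeros(d)
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(n), n // batch):
            g = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)  # mini-batch gradient
            w -= lr * g
    return w

def adam(lr=0.05, epochs=50, batch=32, b1=0.9, b2=0.999, eps=1e-8):
    w, m, v, t = np.zeros(d), np.zeros(d), np.zeros(d), 0
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(n), n // batch):
            t += 1
            g = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
            m = b1 * m + (1 - b1) * g                 # first-moment estimate
            v = b2 * v + (1 - b2) * g**2              # second-moment estimate
            m_hat = m / (1 - b1**t)                   # bias corrections
            v_hat = v / (1 - b2**t)
            w -= lr * m_hat / (np.sqrt(v_hat) + eps)  # per-coordinate step
    return w

for name, w in [("SGD", sgd()), ("Adam", adam())]:
    print(name, "parameter error:", np.linalg.norm(w - w_true))
```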
This course will be supported by lectures, exercise sessions, and hands-on computer sessions in Python. It will be taught by two pairs of professors in alternating years: L. Jacques and G. Grapiglia in 2024-25, and E. Massart and P.A. Absil in 2025-26. It is part of the MAP program under the course code LINMA2300, whose current title, "Analysis and control of distributed parameter systems", will be changed to "High-Dimensional Data Analysis and Optimization" (the title change will, however, only take effect in 2025-2026; the course will also probably be moved to the second semester in that year).
- Teacher: Absil Pierre-Antoine
- Teacher: Jacques Laurent
- Teacher: Massart Estelle
- Teacher: Massion Bastien
- Teacher: Mil-Homens Cavaco Nicolas
- Teacher: Nunes Grapiglia Geovani