Optimization methods for Data Science

Course objectives

General goals: The aim of the course is to introduce students to the theory and applications of optimization techniques for machine learning problems. Students are also expected to acquire knowledge of standard models used in machine learning, such as Deep Neural Networks and Support Vector Machines.

Specific goals: The course puts special emphasis on convex optimization techniques, which play a key role in data science. Its objective is to provide the basic tools and methods at the core of modern nonlinear convex optimization. Starting from the gradient descent method (an illustrative sketch is given at the end of this section), we will cover state-of-the-art algorithms, including proximal gradient methods, accelerated methods, the stochastic subgradient method, and randomized block-coordinate descent methods, which are nowadays very popular techniques for solving machine learning and inverse problems. The course will also cover topics in high-dimensional statistics, which is the typical setting of machine learning problems.

Knowledge and understanding: the student will learn (1) the mathematical formulation of machine learning problems, (2) the latest optimization algorithms for solving machine learning problems and extracting information from data, (3) the convergence theory of such algorithms, and will (4) develop computational skills for handling several important problems in data science.

Applying knowledge and understanding: through several examples from the applied sciences and lab sessions, the student will appreciate the importance of optimization techniques and will understand which algorithm is most appropriate in each context.

Critical and judgmental skills: the student will be able to tackle with rigor a number of significant optimization problems and algorithms, so as to become fully aware of the technicalities and main ideas behind the various approaches. This will stimulate the student's independent judgment.

Communication skills: by studying the theoretical and practical aspects of optimization techniques, the student will gradually learn to communicate with rigor and clarity. The student will also learn that a proper understanding of the mathematical aspects of data science is one of the main skills needed for effective communication.

Learning skills: students will have the opportunity to explore selected topics in further detail.
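As an informal illustration of the starting point mentioned above, the short sketch below applies plain gradient descent to a least-squares objective in Python/NumPy. Everything in it is an illustrative assumption rather than course material: the synthetic data, the step size 1/L, and the function name gradient_descent.

import numpy as np

def gradient_descent(grad, x0, step_size=0.1, n_iters=100):
    """Plain gradient descent: x_{k+1} = x_k - step_size * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - step_size * grad(x)
    return x

# Illustrative problem (assumed data): least squares 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
b = rng.standard_normal(50)
grad = lambda x: A.T @ (A @ x - b)

# Step size 1/L, where L = ||A||^2 is the Lipschitz constant of the gradient.
L = np.linalg.norm(A, 2) ** 2
x_star = gradient_descent(grad, np.zeros(5), step_size=1.0 / L, n_iters=500)
print(x_star)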

Channel 1
SAVERIO SALZO


Course program
1. Basics on convex sets and functions.
2. Differentiability and convexity.
3. Smooth convex optimization.
4. Differential theory for nonsmooth convex functions.
5. Duality theory (part 1).
6. The projected subgradient method.
7. The Frank-Wolfe algorithm.
8. Proximity operators and averaged operators.
9. The proximal gradient algorithm (see the sketch after this list).
10. Accelerated proximal gradient algorithms.
11. Elements of sparse estimation/recovery.
12. Proximal methods for convex spectral functions.
13. Stochastic optimization algorithms: SGD and randomized methods.
14. Duality theory (part 2).
15. Dual algorithms.
16. Applications in machine learning: support vector machines and concentration inequalities.
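As an informal illustration of the items on proximity operators, the proximal gradient algorithm, and sparse estimation/recovery, here is a minimal sketch of the proximal gradient method (ISTA) for the l1-regularized least-squares problem. The synthetic data, the regularization parameter lam, and the helper names soft_threshold and proximal_gradient_lasso are assumptions made for this example, not course material.

import numpy as np

def soft_threshold(v, tau):
    """Proximity operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iters=500):
    """Proximal gradient (ISTA) for  min_x 0.5*||A x - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the Lipschitz constant of the smooth part
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                          # gradient step on the smooth term
        x = soft_threshold(x - step * grad, step * lam)   # prox step on the l1 term
    return x

# Illustrative sparse-recovery example with synthetic data (assumed, not course material).
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(100)
print(np.round(proximal_gradient_lasso(A, b, lam=1.0), 3))

The update alternates a gradient step on the smooth term with the proximity operator of the l1 norm (soft-thresholding), which is the general pattern analyzed in the course.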
Prerequisites
Calculus in several variables, Linear Algebra and Probability.
Books
S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.
Y. Nesterov, Introductory Lectures on Convex Optimization, Kluwer, 2004.
S. Salzo and S. Villa, Proximal Gradient Methods for Machine Learning and Imaging, 2022.
Lecture notes by Prof. Salzo.
Frequency
Attendance is not mandatory
Exam mode
Continuous assessment (40%). Continuous assessment monitors students' progress throughout the course and may include:
- homework assignments / mini-projects;
- implementation and analysis of optimization algorithms (e.g., gradient descent, Newton's method, stochastic optimization) on real or synthetic data;
- short written reports or notebooks documenting the formulation, results, and discussion of findings;
- in-class exercises or quizzes.
Written/oral exam (60%):
- a 2-hour written exam including both theoretical questions (proof-based or conceptual) and applied problems (algorithmic exercises, short computations);
- an optional oral discussion to verify understanding and depth of reasoning.
Lesson mode
Lectures will be held in person only.
  • Lesson code: 10606725
  • Academic year: 2025/2026
  • Course: Data Science
  • Curriculum: Single curriculum
  • Year: 1st year
  • Semester: 2nd semester
  • SSD: MAT/09
  • CFU: 6