Signal Processing for Machine Learning

Course objectives

The goal of the course is to teach basic methodologies of signal processing and to show their application to machine learning and data science. The methods include: (i) standard tools for processing time series and images, such as frequency analysis, filtering, and sampling; (ii) sparse and low-rank data models, with applications to high-dimensional data processing (e.g., sparse recovery, matrix factorization, tensor completion); (iii) graph signal processing tools, suited to analyzing and processing data defined over non-metric domains (e.g., graphs, hypergraphs, topologies), with the aim of performing graph machine learning tasks such as graph filtering, spectral clustering, topology inference from data, and graph neural networks. Finally, the course shows how to formulate and solve machine learning problems in a distributed fashion, suitable for big data applications in which learning and data processing must necessarily be carried out over multiple machines. Homework and exercises on real-world data will be carried out using Python and/or Matlab.

Specific objectives:
1. Knowledge and understanding: learn the basics of signal processing for machine learning and be able to apply these concepts to real data science problems.
2. Application: apply signal processing and machine learning techniques to real-world data sets, using programming languages such as Python and Matlab.
3. Autonomy of judgement: analyze the benefits and limitations of different signal processing tools and models, and determine the best methodology to use for a given data set.
4. Communication: communicate effectively about signal processing for machine learning, including design constraints, solutions, and potential applications.
5. Learning skills: develop the ability to pursue further study and research in the field of signal processing for machine learning.

Channel 1
Sergio Barbarossa


Course program
Part 1: Signal Processing Methods and Applications
Definition of signals, signal properties, discrete representations, Fourier transforms, filtering, sampling theory; applications to audio signals and images. Basics of convex optimization: convex sets, convex functions, convex optimization problems. Sparse representations and compressive sensing, with application to image recovery. Matrix completion, with application to recommendation systems. Sparse plus low-rank models, with application to traffic prediction over networks.

Part 2: Graph Signal Processing and Learning
Algebraic graph theory: graph properties, connectivity, degree centrality, eigenvector centrality, PageRank, betweenness, modularity, spectral clustering. Independence graphs: Markov networks, Bayesian networks, Gaussian Markov random fields, inference of graph topology from data; application to brain functional connectivity inference. Graph signal processing: graph Fourier transform, graph filtering, sampling and interpolation of graph signals; diffusion processes, graph convolutional filters; application to distributed filtering and signal interpolation. Deep learning on graphs: design of graph neural architectures, pooling, attention mechanisms; applications to graph and node classification problems.

Part 3: Distributed Optimization and Learning
Distributed consensus optimization: consensus over networks, distributed gradient descent, convergence analysis, communication architectures, adding structure (e.g., constraints, sparsity) via proximal operators; application to distributed target localization in wireless networks. Federated learning: federated averaging, examples of federated learning problems; application to federated support vector machines. Challenges in federated learning: expensive communications, systems and statistical heterogeneity, privacy. Federated convex learning: basics of duality theory, primal-dual optimization methods (dual ascent, method of multipliers, ADMM); distributed ADMM with splitting across examples and/or features; application to federated convex learning problems.
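As an illustration of the kind of Python exercise the program refers to, the following is a minimal sketch of the graph Fourier transform and an ideal low-pass graph filter from Part 2. The ring graph, the cutoff, and all variable names are illustrative assumptions, not taken from the course material.

# Minimal sketch: graph Fourier transform (GFT) and low-pass graph filtering
# using only NumPy. The 6-node ring graph and the cutoff are illustrative.
import numpy as np

N = 6
A = np.zeros((N, N))
for i in range(N):                      # adjacency matrix of a ring graph
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1

L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian L = D - A

# Eigenvectors of L form the GFT basis; eigenvalues act as graph frequencies.
eigvals, U = np.linalg.eigh(L)

rng = np.random.default_rng(0)
x = U[:, 1] + 0.3 * rng.standard_normal(N)   # smooth graph signal plus noise

x_hat = U.T @ x                         # GFT: project onto the eigenbasis
h = (np.arange(N) < 2).astype(float)    # ideal low-pass: keep 2 lowest frequencies
x_filtered = U @ (h * x_hat)            # filter in the graph frequency domain

print("graph frequencies:", np.round(eigvals, 3))
print("filtered signal:  ", np.round(x_filtered, 3))

Similarly, the sketch below illustrates distributed consensus gradient descent from Part 3 on a least-squares problem shared by three agents. The local data, the mixing matrix, and the step size are again illustrative assumptions, not the course's official exercise.

# Minimal sketch: distributed gradient descent with consensus averaging (DGD)
# for least squares over a 3-node network. All data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0])

agents = []                             # each agent holds local data (A_i, b_i)
for _ in range(3):
    A_i = rng.standard_normal((10, 2))
    b_i = A_i @ w_true + 0.1 * rng.standard_normal(10)
    agents.append((A_i, b_i))

# Doubly stochastic mixing matrix for a fully connected 3-node network.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

X = np.zeros((3, 2))                    # one local estimate per agent
step = 0.01
for _ in range(500):
    grads = np.array([A_i.T @ (A_i @ x - b_i)
                      for (A_i, b_i), x in zip(agents, X)])
    X = W @ X - step * grads            # consensus mixing + local gradient step

print("local estimates:\n", np.round(X, 3))
print("true parameters: ", w_true)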
Prerequisites
Basic knowledge of calculus, probability theory and algebra
Books
M. Vetterli, J. Kovačević, and V. K. Goyal, Foundations of Signal Processing, Cambridge University Press, 2014.
S. Foucart and H. Rauhut, A Mathematical Introduction to Compressive Sensing, Basel: Birkhäuser, 2013.
S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.
M. E. J. Newman, Networks: An Introduction, Oxford, UK: Oxford University Press.
S. Boyd et al., Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, Foundations and Trends in Machine Learning, 3(1):1–122, 2011.
Frequency
Attendance is not mandatory, but it is highly recommended.
Exam mode
Oral exam (typically two open questions), plus a computer project on one of the topics of the course.
Paolo Di Lorenzo
  • Lesson code: 10610252
  • Academic year: 2025/2026
  • Course: Data Science
  • Curriculum: Single curriculum
  • Year: 2nd year
  • Semester: 1st semester
  • SSD: ING-INF/03
  • CFU: 6