Applied mathematics for sustainability: models, calculations, AI
Channel 1
DOMENICO VITULANO
Course program
The course unit consists of 60 hours for a total of 6 ECTS. The topics of the teaching modules are the following.
1) LINEAR ALGEBRA (review):
Vector spaces. Geometric representation of vectors and vector operations. Sets of linearly dependent and independent vectors. Rank of a set of vectors. Dimension and basis of a vector space. Vector subspaces. Determinant of a square matrix. Characteristic or rank of a matrix, Rouché–Capelli theorem, and Cramer’s theorem. Numerical and parametric systems. Homogeneous linear systems. Change of basis in a vector space and its consequences on the components of a vector.
2) LINEAR APPLICATIONS BETWEEN TWO FINITE-DIMENSIONAL VECTOR SPACES:
General concepts: kernel and image. Association of a matrix with a linear map. Change of basis: overview. Eigenvalues and eigenvectors. Theorem on the relationship between eigenvalues and the (in)dependence of associated eigenvectors. Diagonalization of a linear map.
3) QUADRATIC FORMS:
Characterization of a quadratic form. Method of minors. Eigenvalue method. Theorem on the necessary and sufficient condition for the sign of a quadratic form. Canonical form of a quadratic form. Constrained quadratic form.
4) VECTOR CALCULUS:
Functions of several variables. Domain of definition. Directional derivatives. Partial derivatives. Higher-order partial derivatives. Schwarz’s theorem. Differentiability. Gradient and total differential. Gradient formula. Tangent hyperplane. Hessian matrix. Jacobian matrix. Homogeneous functions. Euler’s theorem. Taylor’s formula for functions of several variables.
5) STATIC OPTIMIZATION:
Extrema, local extrema. Stationary points. Saddle points. Unconstrained optimization. Second-order necessary and sufficient conditions (methods of minors, eigenvalues, and characteristic polynomial). Convex functions. Constrained optimization. Explicit method. Lagrange multipliers method. Bordered Hessian matrix. Second-order necessary and sufficient conditions. General case: n functions and p constraints.
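By way of illustration (the matrix below is a made-up placeholder, not course material), a minimal Matlab sketch of the eigenvalue method for classifying the sign of a quadratic form q(x) = x'Ax, which is also the second-order test at a stationary point when A is the Hessian:

A = [2 -1 0; -1 2 -1; 0 -1 2];   % hypothetical symmetric matrix
lambda = eig(A);                 % eigenvalues of A
if all(lambda > 0)
    disp('positive definite: local minimum if A is the Hessian at a stationary point')
elseif all(lambda < 0)
    disp('negative definite: local maximum if A is the Hessian at a stationary point')
elseif any(lambda > 0) && any(lambda < 0)
    disp('indefinite: saddle point if A is the Hessian at a stationary point')
else
    disp('semidefinite: the second-order test is inconclusive')
end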
Introduction to Matlab with its main functions.
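A few of the basic Matlab functions the course relies on, shown as a minimal sketch with placeholder numbers:

A = [1 2; 3 4];          % define a 2x2 matrix
b = [5; 6];              % column vector
x = A \ b;               % solve the linear system A*x = b
r = rank(A);             % rank of A
d = det(A);              % determinant of A
[V, D] = eig(A);         % eigenvectors (columns of V) and eigenvalues (diagonal of D)
plot(1:10, (1:10).^2);   % basic plotting
xlabel('x'); ylabel('x^2');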
Fundamentals of machine learning
History. The main pillars of a machine learning system: data, models, learning.
Data as vectors. Models as functions.
Three main phases: prediction (or inference), training (or parameter estimation), and model selection (or hyperparameter tuning).
Least squares from a numerical point of view and from a machine learning point of view.
Training, validation, and testing.
Practical examples in economics and exercises in Matlab
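A minimal least-squares sketch in Matlab with synthetic data (the numbers are placeholders, not the economic examples used in class), showing the numerical solution and its reading as the training step of a linear model:

n = 50;
x = linspace(0, 10, n)';
y = 2*x + 1 + 0.5*randn(n, 1);       % synthetic data: line plus noise

X = [ones(n, 1), x];                 % design matrix with intercept column

% Training (parameter estimation): least-squares fit
theta    = X \ y;                    % numerically stable solution
theta_ne = (X'*X) \ (X'*y);          % same solution via the normal equations

% Prediction (inference) on new inputs
x_new = [0; 5; 10];
y_hat = [ones(3, 1), x_new] * theta;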
Dimensionality reduction: history and Principal Component Analysis. Maximum variance perspective.
Covariance matrix, eigenvalues and eigenvectors. MNIST dataset. Examples in Matlab.
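A minimal Matlab sketch of PCA from the maximum-variance perspective, run on random placeholder data rather than MNIST:

X  = randn(200, 5);                        % placeholder data: 200 samples, 5 features
Xc = X - mean(X, 1);                       % center the data
C  = cov(Xc);                              % sample covariance matrix
[V, D] = eig(C);                           % eigenvectors and eigenvalues
[lambda, idx] = sort(diag(D), 'descend');  % order components by variance
V = V(:, idx);
k = 2;                                     % number of principal components to keep
Z = Xc * V(:, 1:k);                        % data projected onto the first k components
explained = lambda(1:k) / sum(lambda);     % fraction of variance explained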
Classification. Unsupervised learning, Supervised learning, Reinforcement learning.
Support Vector Machine. Hyperplanes. Traditional derivation of the margin, soft margin, dual Support Vector Machine. Simulations in Matlab.
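A minimal linear SVM sketch in Matlab on two synthetic Gaussian clusters; fitcsvm assumes the Statistics and Machine Learning Toolbox is available, and all data below are placeholders:

X = [randn(50, 2) + 2; randn(50, 2) - 2];   % two roughly separable clusters
y = [ones(50, 1); -ones(50, 1)];            % class labels +1 / -1

% Linear soft-margin SVM; BoxConstraint plays the role of the soft-margin parameter C
model = fitcsvm(X, y, 'KernelFunction', 'linear', 'BoxConstraint', 1);

labels = predict(model, [0 0; 3 3]);        % prediction on new points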
Prerequisites
Fundamentals of linear algebra
Books
Sergio Bianchi, “Appunti di Algebra lineare”.
Charu C. Aggarwal, “Neural Networks and Deep Learning: A Textbook”, Springer.
Marc Peter Deisenroth, A. Aldo Faisal, Cheng Soon Ong, “Mathematics for Machine Learning”, Cambridge University Press.
Available material: https://elearning.uniroma1.it/ (search for MML2DV)
Frequency
Attendance is not mandatory
Exam mode
Exercises solved in the classroom and in Matlab, with a final written report.
Lesson mode
Lectures delivered through slides and laboratory practice in Matlab.
SILVIA MARCONI
- Lesson code: 10616812
- Academic year: 2025/2026
- Course: Economics and management
- Curriculum: Management delle imprese e della sostenibilità
- Year: 1st year
- Semester: 2nd semester
- SSD: SECS-S/06
- CFU: 9