Machine learning and statistical learning
Academic year 2024/2025
Learning objectives
The course introduces students to the most important algorithmic and statistical machine learning tools. The first part of the course focuses on statistical foundations and methodological aspects. The second part is more hands-on, with laboratories to help students develop their software skills.
Expected learning outcomes
Upon completion of the course students will be able to:
1. understand the notion of overfitting and its role in controlling the statistical risk
2. describe some of the most important machine learning algorithms and explain how they avoid overfitting
3. run machine learning experiments using the correct statistical methodology
4. provide statistical interpretations of the results.
Period: Second semester
Assessment method: Exam
Grading: mark out of thirty, officially recorded
Single course
This course can be attended as a single course.
Syllabus and teaching organisation
Single edition
Coordinator
Period
Second semester
Prerequisites
The course requires basic knowledge of calculus, linear algebra, programming, and statistics.
Assessment methods and evaluation criteria
For the Machine Learning module, the exam consists of two parts:
1. Writing a paper of about 10-15 pages containing either a report describing experimental results (experimental project) or an in-depth analysis of a theoretical topic (theory project).
2. Taking a written test on all the topics covered in class.
For the Statistical Learning module, the exam consists of preparing two individual projects in R, one on supervised and one on unsupervised learning. The projects, code, and datasets must be sent to the professor 5 days before the exam. The projects are discussed in an oral test, in which students are asked to explain and defend their methodological choices, code, and results. Communication skills and the ability to interpret the results critically are evaluated. The grade combines the project evaluations and the oral examination.
The final exam grade is the average of the grades obtained in the two modules.
Module Machine Learning
Syllabus
1. Introduction
2. The Nearest Neighbour algorithm
3. Tree predictors
4. Statistical learning
5. Hyperparameter tuning and risk estimates
6. Risk analysis of Nearest Neighbour
7. Risk analysis of tree predictors
8. Consistency, surrogate functions, nonparametric algorithms
9. Linear predictors
10. Online gradient descent
11. From sequential risk to statistical risk
12. Kernel functions
13. Support Vector Machines
14. Stability bounds and risk control for SVM
15. Boosting and ensemble methods
16. Neural networks and deep learning
Teaching methods
Lectures
The goal of this course is to provide a methodological foundation for machine learning. The emphasis is on the design and analysis of learning algorithms with theoretical performance guarantees.
Reference material
The main reference is the set of lecture notes available at ncesa-bianchismml.ariel.ctu.unimi.it/
The course makes heavy use of probability and statistics. A good textbook on these topics is:
Dimitri P. Bertsekas and John N. Tsitsiklis, Introduction to Probability (2nd edition). Athena Scientific, 2008.
Some good machine learning textbooks:
Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.
Mehryar Mohri, Afshin Rostamizadeh and Ameet Talwalkar, Foundations of Machine Learning, MIT Press, 2012.
L. Devroye, L. Györfi, and G. Lugosi, A Probabilistic Theory of Pattern Recognition, Springer, 1996.
Module Statistical Learning
Syllabus
1. Introduction to Statistical Learning
2. Cross Validation and Bootstrap
3. Variable Selection, Ridge and Lasso Regression
4. Linear Models
5. Nonlinear Models
6. Logistic Regression and Classification Methods
7. Classification and Regression Trees, Bagging, Boosting and Random Forests
8. Unsupervised Learning (Clustering, PCA)
9. Brief notes on neural networks (tentative)
10. Brief notes on association rules (tentative)
Teaching methods
Lectures and Lab sessions
The goal of this module is to provide a methodological and practical overview of statistical learning methods. The emphasis is on applications.
Optional group work will be offered to help students become familiar with the software and strengthen their practical skills.
Materiale di riferimento
James, G., Witten, D., Hastie, T., & Tibshirani, R. (2021). An Introduction to Statistical Learning. Springer.
A further reference is the textbook:
Hastie, T., Tibshirani, R., & Friedman, J. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer.
Modules or teaching units
Module Machine Learning
INF/01 - Computer Science - CFU: 6
Lectures: 40 hours
Instructor:
Cesa Bianchi Nicolo' Antonio
Module Statistical Learning
SECS-S/01 - Statistics - CFU: 6
Lectures: 40 hours
Instructor:
Salini Silvia
Instructors
Office hours:
Student office hours are held in person, by appointment, on Fridays from 9:30 to 11:00, and via Teams, by appointment, on Mondays from 15:00 to 16:30.
DEMM, room 30, 3rd floor