Machine Learning and Statistical Learning
A.Y. 2024/2025
Learning objectives
The course introduces students to the most important algorithmic and statistical machine learning tools. The first part of the course focuses on the statistical foundations and on methodological aspects. The second part is more hands-on, with laboratories that help students develop their software skills.
Expected learning outcomes
Upon completion of the course students will be able to:
1. understand the notion of overfitting and its role in controlling the statistical risk
2. describe some of the most important machine learning algorithms and explain how they avoid overfitting
3. run machine learning experiments using the correct statistical methodology
4. provide statistical interpretations of the results.
Lesson period: Second semester
Assessment methods: Exam
Assessment result: grade out of thirty, recorded in the transcript
Single course
This course can be attended as a single course.
Course syllabus and organization
Single session
Prerequisites for admission
The course requires basic knowledge in calculus, linear algebra, programming and statistics.
Assessment methods and Criteria
For the module Machine Learning, the exam consists of two parts:
1. Writing a paper of about 10-15 pages containing either a report describing experimental results (experimental project) or an in-depth analysis of a theoretical topic (theory project).
2. Taking a written test on all the topics covered in class.
For the module Statistical Learning, the exam consists of preparing two individual projects using the R software, one on supervised and one on unsupervised learning. The projects, code, and datasets must be sent to the professor five days before the exam. The projects will be discussed in an oral test, in which students will be asked to explain and discuss the methodological choices, the code, and the results. Communication skills and the critical ability to interpret the results will be evaluated. The grade is computed by combining the project evaluations and the oral examination.
The final grade of the exam is the average of the grades obtained in each module.
Machine Learning and Statistical Learning-Module Machine Learning
Course syllabus
1. Introduction
2. The Nearest Neighbour algorithm
3. Tree predictors
4. Statistical learning
5. Hyperparameter tuning and risk estimates
6. Risk analysis of Nearest Neighbour
7. Risk analysis of tree predictors
8. Consistency, surrogate functions, nonparametric algorithms
9. Linear predictors
10. Online gradient descent
11. From sequential risk to statistical risk
12. Kernel functions
13. Support Vector Machines
14. Stability bounds and risk control for SVM
15. Boosting and ensemble methods
16. Neural networks and deep learning
Teaching methods
Lectures
The goal of this course is to provide a methodological foundation for machine learning. The emphasis is on the design and analysis of learning algorithms with theoretical performance guarantees.
Teaching Resources
The main reference is the lecture notes, available at ncesa-bianchismml.ariel.ctu.unimi.it/
The course makes heavy use of probability and statistics. A good textbook on these topics is:
Dimitri P. Bertsekas and John N. Tsitsiklis, Introduction to Probability (2nd edition). Athena Scientific, 2008.
Some good machine learning textbooks:
Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.
Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, Foundations of Machine Learning, MIT Press, 2012.
L. Devroye, L. Györfi, and G. Lugosi, A Probabilistic Theory of Pattern Recognition, Springer, 1996.
Machine Learning and Statistical Learning-Module Statistical Learning
Course syllabus
1. Introduction to Statistical Learning
2. Cross Validation and Bootstrap
3. Variable Selection, Ridge and Lasso Regression
4. Linear Models
5. Nonlinear Models
6. Logistic Regression and Classification Methods
7. Classification and Regression Trees, Bagging, Boosting, and Random Forests
8. Unsupervised learning (Clustering, PCA)
9. Brief notes on neural networks (tentative)
10. Brief notes on association rules (tentative)
Teaching methods
Lectures and Lab sessions
The goal of this module is to provide a methodological and practical overview of statistical learning methods. The emphasis is on applications.
Optional group work will be offered to help students get familiar with the software and build practical skills.
Teaching Resources
James, G., Witten, D., Hastie, T., & Tibshirani, R. (2021). An Introduction to Statistical Learning. Springer.
A further reference is the textbook:
Hastie, T., Tibshirani, R., & Friedman, J. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer.
Machine Learning and Statistical Learning-Module Machine Learning
INF/01 - INFORMATICS - University credits: 6
Lessons: 40 hours
Professor:
Cesa Bianchi Nicolò Antonio
Machine Learning and Statistical Learning-Module Statistical Learning
SECS-S/01 - STATISTICS - University credits: 6
Lessons: 40 hours
Professor:
Salini Silvia
Professor(s)
Reception:
Student reception is held in person, by appointment, on Fridays from 9:30 to 11:00, and via Teams, by appointment, on Mondays from 15:00 to 16:30.
DEMM, room 30, 3rd floor