Machine Learning
A.Y. 2024/2025
Learning objectives
The goal of the course is to discuss automatic methods to make predictions and build models starting from available data. The course will teach the student the theoretical bases of machine learning (fundamentals of statistical learning theory, classification, regression) and common methods for typical tasks (e.g., clustering and dimensionality reduction).
Expected learning outcomes
The student will be able to analyse data by choosing the most appropriate method among well-established ones. Moreover, they will be familiar with the notions and language common to the disciplines that employ such methods (e.g., computer science, economics, mathematics).
Lesson period: First semester
Assessment methods: Exam
Assessment result: grade recorded on a scale of thirty (voto in trentesimi)
Single course
This course can be attended as a single course.
Course syllabus and organization
Single session
Course syllabus
The topics to be covered are shown here together with a (partial) list of the most important machine learning techniques introduced during lectures:
- Elements of statistics (covariance, correlation, Bayes' theorem)
- Regression (linear, kernels and regularisation)
  - Maximum likelihood
  - Nearest neighbours (k-NN)
  - Statistical interpretation of regression
- Bias-variance decomposition and resampling methods
- Classification (linear, quadratic, logistic; sigmoid and softmax; cross entropy)
- Mercer's kernels
- Optimisation methods, stochastic gradient descent, natural gradients
- Basic neural networks (perceptron, feed-forward, universal approximation, back-propagation)
- Basic notions about advanced neural networks (solution of ODEs, convolutional, recurrent)
- Restricted Boltzmann machines
- Dimensionality reduction (curse of dimensionality, PCA)
- Further classification and regression methods (decision trees, random forests, support vector machines)
- Other stochastic and adaptive learning methods (pasting, bagging, boosting, etc.)
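As a taste of the kind of method covered in the lectures, here is a minimal sketch of the k-nearest-neighbours (k-NN) classifier from the syllabus above, in pure Python; the toy dataset, labels, and choice of k are made up for illustration.

```python
# Minimal k-NN classifier sketch (pure Python), illustrating the
# "Nearest neighbours (k-NN)" topic; the dataset and k are illustrative.
from collections import Counter
from math import dist

def knn_predict(train, labels, x, k=3):
    """Return the majority label among the k training points closest to x."""
    # Rank training points by Euclidean distance to the query point x.
    nearest = sorted(range(len(train)), key=lambda i: dist(train[i], x))[:k]
    # Majority vote among the labels of the k nearest points.
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

# Toy 2-D dataset: two well-separated clusters with labels "a" and "b".
train = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (1.0, 1.0), (0.9, 1.1), (1.1, 0.9)]
labels = ["a", "a", "a", "b", "b", "b"]

print(knn_predict(train, labels, (0.15, 0.15)))  # query near the first cluster -> "a"
print(knn_predict(train, labels, (1.05, 0.95)))  # query near the second cluster -> "b"
```

The same algorithm, along with the other listed techniques, is available in production form in libraries such as scikit-learn (see the Géron textbook below).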
Prerequisites for admission
Basic mathematical knowledge (statistics, calculus, linear algebra).
Object-oriented computing at an introductory level.
Teaching methods
Teaching will be based on lectures delivered in the classroom, plus a few tutorial sessions where the student will see examples of applications and coding techniques.
Teaching Resources
The main reference material consists of the lecture notes taken by the students during the course, plus additional notes and computing examples provided by the teacher.
The material taught during lectures is also covered by different textbooks, depending on the topic. For reference, the most relevant monographs are:
- T. Hastie, R. Tibshirani, J. Friedman, "The Elements of Statistical Learning: Data Mining, Inference, and Prediction".
- A. Géron, "Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems".
- K. P. Murphy, "Machine Learning: a Probabilistic Perspective".
The teaching material is available through the Ariel website for the course.
Assessment methods and Criteria
Assigned coursework to be handed in before the exam plus oral examination.
The coursework may be either a computational project or the presentation of an advanced theoretical topic.
FIS/03 - PHYSICS OF MATTER - University credits: 3
FIS/04 - NUCLEAR AND SUBNUCLEAR PHYSICS - University credits: 3
Lessons: 42 hours
Professor:
Barbieri Carlo
Professor(s)
Reception:
Tue 14:00-15:00 (during the semester), or email me anytime for an appointment
My office is on floor 1 of LITA building, Phys. Dept., Via Celoria 16