AI: Models and Applications
S1 · 3 ECTS · 24h · Optional · Taught in English · Instructor: E. Formenti
- Instructor Office: Bureau 222, Laboratoire I3S
- Contact: appointments by email
Course description
The main goal of the course is to introduce students to the main concepts of machine learning with neural networks (several other models will also be considered). For most topics, the formal aspects and concepts are first introduced and explained, and then illustrated through a series of programming exercises and experiments.
Course structure
The course consists of 8 lectures of 3 hours each (2 hours of teaching and 1 hour of exercises). The exercises are mostly programming sessions that illustrate concepts from the first part of the lecture.
Prerequisites
There are no special prerequisites for this course beyond basic Python programming.
Course Goals
Students who complete this course successfully will be able to:
- analyse, clean and preprocess data to prepare them for successful learning;
- choose an adapted learning model and algorithm;
- run a learning algorithm using the most popular libraries (Scikit-learn, Keras, TensorFlow);
- evaluate the results of a learning algorithm and compare them to those of other approaches (a minimal end-to-end sketch of this workflow is given after this list).
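The sketch below is only illustrative and not course material: it assumes scikit-learn with its built-in Iris dataset and a logistic regression model (both arbitrary choices) and walks through the four goals above in sequence: preprocessing, choosing a model, running the learning algorithm, and evaluating the result.

```python
# Illustrative sketch of the workflow targeted by the course goals.
# Dataset and model are assumptions, not course material.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# 1. Load and partition the data.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

# 2. Preprocess: scale features to zero mean and unit variance.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# 3. Choose and train a model.
model = LogisticRegression(max_iter=200).fit(X_train, y_train)

# 4. Evaluate the result.
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```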
The lectures
Lecture 1: the fundamentals of Input/Output systems. This is a standard way of studying and explaining evolving systems which react according to their inputs. In particular, we will focus on additive systems.
Lecture 2: automata networks as I/O systems. Invariance theorems. Renormalisation. Topological conjugacy. Exercises and examples.
Lecture 3: the formal neuron of McCulloch-Pitts. Learning: moving from a modelling to an optimisation problem. The perceptron. The XOR problem. Feed-forward networks. The multilayer perceptron and the backpropagation algorithm. Stochastic gradient methods.
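As an illustration of the XOR problem mentioned above, the sketch below (not course material; the architecture and hyperparameters are arbitrary choices) trains a small multilayer perceptron with scikit-learn on a task that a single perceptron cannot solve, since XOR is not linearly separable.

```python
# Illustrative sketch only: a small multilayer perceptron learning XOR.
# Hidden-layer size, activation and solver are assumptions, not course choices.
import numpy as np
from sklearn.neural_network import MLPClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR truth table: not linearly separable

mlp = MLPClassifier(hidden_layer_sizes=(4,), activation="tanh",
                    solver="lbfgs", max_iter=1000, random_state=1)
mlp.fit(X, y)
print(mlp.predict(X))  # a successful run predicts [0 1 1 0];
                       # such a tiny network may need another random seed
```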
Lecture 4: sensitivity to input data, or the necessity of having good input data. Solving the missing-data problem. Categorical data. Scaling and partitioning. Feature selection algorithms.
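A possible sketch of these preprocessing steps with scikit-learn is given below; the toy data frame and the choice of imputation strategy are made up for illustration only.

```python
# Illustrative sketch: imputing missing values, encoding categorical data,
# scaling numerical features and partitioning into train/test sets.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "age":  [25, np.nan, 47, 35],               # numerical, one value missing
    "city": ["Nice", "Paris", "Nice", "Lyon"],  # categorical
})
y = pd.Series([0, 1, 1, 0])

preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="mean")),
                      ("scale", StandardScaler())]), ["age"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["city"]),
])

X_train, X_test, y_train, y_test = train_test_split(df, y, test_size=0.25,
                                                    random_state=0)
print(preprocess.fit_transform(X_train))  # fit on training data only
```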
Lecture 5: more on data dimension reduction. Principal component analysis. Linear discriminant analysis. Kernel methods for non-linear problems.
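The sketch below (illustrative assumptions only: a scikit-learn toy dataset and an arbitrary RBF kernel width) contrasts linear PCA, LDA and kernel PCA on a non-linearly separable problem.

```python
# Illustrative sketch of dimension reduction on the two-moons toy dataset.
from sklearn.datasets import make_moons
from sklearn.decomposition import PCA, KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = make_moons(n_samples=200, noise=0.05, random_state=0)

X_pca = PCA(n_components=2).fit_transform(X)      # unsupervised, linear
X_lda = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)  # supervised, linear
X_kpca = KernelPCA(n_components=2, kernel="rbf",
                   gamma=15).fit_transform(X)     # non-linear

print(X_pca.shape, X_lda.shape, X_kpca.shape)     # (200, 2) (200, 1) (200, 2)
```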
Lecture 6: more learning paradigms. Support vector machines. Decision trees. K-nearest neighbors.
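A possible side-by-side comparison of these three model families with scikit-learn is sketched below; the dataset and hyperparameters are arbitrary illustrative choices.

```python
# Illustrative sketch: SVM, decision tree and k-NN on the same data.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

models = {
    "SVM (RBF kernel)": SVC(kernel="rbf", gamma="scale"),
    "Decision tree":    DecisionTreeClassifier(max_depth=4, random_state=0),
    "k-NN (k=5)":       KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    accuracy = model.fit(X_train, y_train).score(X_test, y_test)
    print(f"{name}: test accuracy {accuracy:.3f}")
```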
Lecture 7: evaluating learning models and their results. Overfitting and underfitting. Hold-out vs. k-fold cross-validation methods. Optimizing hyperparameters by grid search. The confusion matrix explained. ROC curves.
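The sketch below shows one possible way to combine these tools in scikit-learn (the dataset, model and hyperparameter grid are illustrative assumptions): a 5-fold cross-validated grid search, followed by a confusion matrix and a ROC-based score on held-out data.

```python
# Illustrative sketch: k-fold grid search, confusion matrix and ROC AUC.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

grid = GridSearchCV(SVC(probability=True),
                    param_grid={"C": [0.1, 1, 10], "gamma": ["scale", 0.01]},
                    cv=5)                       # 5-fold cross-validation
grid.fit(X_train, y_train)
print("best hyperparameters:", grid.best_params_)

y_pred = grid.predict(X_test)                   # hard predictions for the matrix
y_score = grid.predict_proba(X_test)[:, 1]      # scores for the ROC curve
print(confusion_matrix(y_test, y_pred))
print("ROC AUC:", roc_auc_score(y_test, y_score))
```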
Lecture 8: introduction to deep learning. What is 'deep' in deep learning? Convolutional neural networks. Recurrent neural networks. Limits and perspectives of deep learning.
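As a taste of what such architectures look like in Keras, the sketch below defines (without training it) a small convolutional network; the input shape, layer sizes and number of classes are assumptions made only for illustration.

```python
# Illustrative sketch: a small, untrained convolutional network in Keras.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),         # e.g. 28x28 grey-scale images
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),  # e.g. 10 output classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```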
Suggested books and materials
There is no strict need for books: the slides of each lecture will be available online. However, students are invited to deepen the topics of the course by consulting the following books:
- Stuart J. Russell and Peter Norvig, Artificial Intelligence: A Modern Approach, Pearson, 3rd edition, 1132 pages, 2010.
- François Chollet, Deep Learning with Python, Manning Publications, 363 pages, 2017.
More material will be distributed during the lectures. In particular, all the slides will be available online.
Important: the practice sessions (1 hour per lecture) require programming in Python. For a better experience, you are invited to bring your personal laptop if possible; this way you will have a working installation for developing your project.
Course grading
During the practice sessions (programming) there will be one or two assignments; their average mark will provide a mark T. Another mark P will come from a project assignment, which will be given and explained in Lecture 5. The overall grade is obtained by the formula (T + 3P)/4, where T and P are in the range [0, 20].
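For illustration only, with made-up marks (T = 12 and P = 16 are not real examples), the formula gives (12 + 3×16)/4 = 15:

```python
# Illustrative computation of the grading formula; the marks are made up.
def final_grade(T: float, P: float) -> float:
    """Combine the assignment average T and the project mark P, both on [0, 20]."""
    return (T + 3 * P) / 4

print(final_grade(T=12.0, P=16.0))  # (12 + 48) / 4 = 15.0
```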
Requests for instructor feedback
Sometimes the notions or exercises given are a little bit cumbersome or hard to understand. Whenever this happens, students are invited to apply the following algorithm:
- try hard for some (finite) time;
- if Step 1 fails, then ask your classmates;
- if Step 2 fails, then ask former students of the course (Master 2 students), if you know any of them;
- if Step 3 fails, then ask the instructor.
In any case, avoid being blocked on an assignment or on a poorly understood notion. There is no penalty, and no shame, in asking for help.
Attendance policy
Students are strongly encouraged to attend the lectures. However, attendance is not strictly checked and there is no penalty for absences.
Penalties for late work and requests for extensions
Late work will be penalized. The exact penalties will be announced when the project assignment is distributed. No extension is allowed, except in truly exceptional cases.
Disclaimer
The instructor reserves the right to make modifications to this information throughout the semester.