Neural Networks for Data Science Applications (2020/2021)

Master Degree in Data Science (6 credits)

NOTE: For the year 2019/2020, please refer to this page.


+ This course is the recipient of a Google Faculty Award to Support Machine Learning Courses, Diversity, and Inclusion.

+ A Google Group (neural-networks-for-data-science-applications-20202021) is active, where you can receive all info on the course, ask questions, and discuss with other students.

General overview

The course will introduce neural networks in the context of data science applications. After an overview of supervised learning and numerical optimization, we will describe recent techniques and algorithms (going under the broad name of “deep learning”, or differentiable programming) that allow neural networks to be applied successfully to a wide range of problems, e.g., in computer vision and natural language processing.

Students will be introduced to convolutional networks (e.g., for image analysis), to recurrent neural networks (for sequential problems), and to recent attention-based models. We will also discuss problems of robustness, fairness, and interpretability. Optional topics include graph-based models and generative architectures.

Theory will be supplemented by practical laboratories, where all concepts will be developed on realistic use cases using the TensorFlow 2.0 library.

Slides and notebooks

1. Intro to the course (date: TBD, materials: TBD)

Environment setup

Students are invited to bring their own laptops for the lab sessions. To have a working Python installation with all prerequisites, you can install the Anaconda distribution.

We will use TensorFlow 2.0 in the course, which you can install by following the instructions on the website.
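As a rough sketch of the setup, assuming you already have Anaconda installed, the following commands create an isolated environment and install TensorFlow 2.0 via pip (the environment name is illustrative; always defer to the official TensorFlow instructions for your platform and for GPU support):

```shell
# Create and activate an isolated conda environment for the course
# (the name "nnds" is just an example)
conda create -n nnds python=3.7
conda activate nnds

# Install the CPU version of TensorFlow 2.0 via pip
pip install tensorflow==2.0.0

# Sanity check: import the library and print its version
python -c "import tensorflow as tf; print(tf.__version__)"
```

If the final command prints a 2.x version number, the installation is working and the lab notebooks should run locally.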

Alternatively, you can run all notebooks for free using the Google Colaboratory service (which you can access with a standard Gmail account or the account).

Reading material

The main reference book for the course is Dive into Deep Learning. Each slide will mention the corresponding sections in the book.