Neural Networks – A.A. 2013 / 2014
This page contains the material for the lectures that I held in the Neural Networks course taught by Prof. Aurelio Uncini for the M.Sc. in Artificial Intelligence and Robotics. For general information about the course, please refer to the original web pages.
This section of the course covers the ideas behind kernel methods in machine learning, especially Support Vector Machines. To this end, we will also explore fundamental concepts of Statistical Learning Theory, such as the consistency of learning algorithms and their generalization capabilities, as well as Tikhonov regularization theory. We will conclude by looking at other views on learning, such as the Bayesian and information-theoretic frameworks.
For additional material on each topic, every set of slides concludes with a reference section listing a selection of articles, books, and other resources.
Click on the lecture title to download the slides:
|Lecture 0-A||Elements of Unconstrained Optimization (definitions of unconstrained optimization; characterization of minima; first and second-order methods; conjugate gradient methods.)|
Additional material: Example of Unconstrained Optimization in Matlab
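The additional material for this lecture is in Matlab; a minimal Python sketch of the same first-order idea is given below. It runs gradient descent on a hand-picked quadratic; the objective, step size, and tolerance are illustrative choices, not taken from the slides.

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=1000):
    # First-order method: repeatedly step against the gradient
    # until its norm is (numerically) zero.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - lr * g
    return x

# Minimize f(x, y) = (x - 1)^2 + 2 * (y + 2)^2; gradient computed by hand.
grad_f = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 2)])
x_star = gradient_descent(grad_f, [0.0, 0.0])
```

For this strictly convex quadratic the iteration is a contraction, so it converges to the unique minimizer (1, -2).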
|Lecture 0-B||Elements of Constrained and Convex Optimization (definitions of constrained and convex optimization; characterization of minima of convex OP; duality theory; KKT conditions; penalty and barrier methods.)|
Additional material: Example of Constrained Optimization in Matlab
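As with the previous lecture, the worked example is in Matlab; here is a small Python sketch of the quadratic penalty method on a toy problem of my own choosing (minimize x1² + x2² subject to x1 + x2 = 1, whose solution is (0.5, 0.5)).

```python
import numpy as np

def penalty_method(mu_values):
    # Quadratic penalty: replace the constrained problem with
    #   Q(x; mu) = x1^2 + x2^2 + (mu / 2) * (x1 + x2 - 1)^2
    # and let mu grow. Each penalized problem is an unconstrained
    # quadratic, so its stationarity condition grad Q = 0 is the
    # linear system (2*I + mu*ones) x = mu*ones, solved exactly here
    # (no warm start is needed in this closed-form setting).
    x = np.zeros(2)
    for mu in mu_values:
        A = 2 * np.eye(2) + mu * np.ones((2, 2))
        b = mu * np.ones(2)
        x = np.linalg.solve(A, b)
    return x

x_pen = penalty_method([1.0, 10.0, 100.0, 1000.0])
```

As mu grows, the penalized minimizer approaches the constrained optimum (0.5, 0.5) from the infeasible side, which is the characteristic behaviour of exterior penalty methods.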
|Lecture 1||Learning Problem and Loss Functions (definition of a learning algorithm; Empirical Risk Minimization; examples of loss functions)||17/10/2013|
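The losses discussed in this lecture are one-liners in code. The sketch below (my own illustrative data, not from the slides) writes three standard loss functions pointwise and evaluates an empirical risk, i.e. the average loss over a training sample, as in Empirical Risk Minimization.

```python
import numpy as np

# Pointwise loss functions: y is the true label/target, f the model output.
square_loss = lambda y, f: (y - f) ** 2                        # regression
hinge_loss = lambda y, f: np.maximum(0.0, 1 - y * f)           # margin loss, y in {-1, +1}
zero_one_loss = lambda y, f: (np.sign(f) != y).astype(float)   # misclassification

# Empirical risk = average loss over the sample.
y = np.array([1, -1, 1, -1])
f = np.array([0.8, -1.2, -0.3, 0.4])
emp_risk_hinge = hinge_loss(y, f).mean()   # (0.2 + 0 + 1.3 + 1.4) / 4 = 0.725
```

Note how the hinge loss penalizes the two correctly signed but insufficiently confident predictions would-be margins, while the 0-1 loss only counts outright sign errors.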
|Lecture 2||Consistency of Learning Algorithms (key theorem of Statistical Learning Theory; VC Dimension; Structural Risk Minimization)||24/10/2013|
|Lecture 3||Geometrical Derivation of Support Vector Machines (geometric margin; maximal margin hyperplane; primal and dual optimization problems)|
Additional material: Example of an SVM in Matlab
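The Matlab example above solves the SVM as an optimization problem; as a language-neutral complement, here is a Python sketch that trains a linear SVM by subgradient descent on the regularized hinge loss (a primal view, not the dual QP derived in the lecture). The toy data, regularization constant, and schedule are all illustrative assumptions.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.1, epochs=200):
    # Subgradient descent on  lam/2 * ||w||^2 + mean(max(0, 1 - y * (X @ w))),
    # i.e. the soft-margin SVM objective without a bias term
    # (the toy data below is symmetric around the origin, so none is needed).
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, epochs * n + 1):
        i = (t - 1) % n            # cycle through the sample deterministically
        eta = 1.0 / (lam * t)      # decreasing step size
        if y[i] * (X[i] @ w) < 1:  # margin violated: pull towards this example
            w = (1 - eta * lam) * w + eta * y[i] * X[i]
        else:                      # margin satisfied: only shrink (regularize)
            w = (1 - eta * lam) * w
    return w

# Linearly separable toy data: class +1 around (2, 2), class -1 around (-2, -2).
X = np.array([[2.0, 2.0], [2.5, 1.5], [-2.0, -2.0], [-1.5, -2.5]])
y = np.array([1, 1, -1, -1])
w = train_linear_svm(X, y)
preds = np.sign(X @ w)
```

On separable data like this the learned hyperplane classifies every training point correctly; the dual formulation from the slides would additionally expose which points are support vectors.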
|Lecture 4||Kernels, Feature Spaces and Reproducing Kernel Hilbert Spaces (elements of functional analysis; definitions of a kernel; Reproducing Kernel Hilbert Spaces)||07/11/2013|
|Lecture 5||Regularized Learning Methods (regularization theory; kernel ridge regression; Bayesian learning; model selection)||21/11/2013|
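Kernel ridge regression, covered in Lecture 5, reduces to a single linear solve thanks to the representer theorem. The sketch below is a minimal Python illustration with an RBF kernel; the dataset, bandwidth, and regularization constant are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||a_i - b_j||^2).
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_ridge_fit(X, y, lam=1e-3, gamma=1.0):
    # By the representer theorem the regularized solution lives in the span
    # of the training points, f(x) = sum_i alpha_i k(x, x_i), and the
    # coefficients solve the linear system (K + lam * I) alpha = y.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Fit a noisy sine curve.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 50)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(50)
alpha = kernel_ridge_fit(X, y)
y_hat = kernel_ridge_predict(X, alpha, X)
```

Increasing lam trades training fit for smoothness, which is exactly the Tikhonov regularization trade-off discussed in the lecture; choosing lam and gamma is the model-selection problem.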
The course requires a small project, carried out individually or in groups of up to three people. Several projects are available on the topics discussed in class; please contact me if you are interested.