Machine Learning (SS 2018)
- 18.05.2018: There will be no tutorials next week. The new exercise sheet handed out this week will be discussed the week after.
- 02.05.2018: Please note that the lecture on May 17th will start an hour earlier: it will run from 8 am to 10 am.
- 25.04.2018: There will be no tutorials next week. The new exercise sheet handed out this week will be discussed the week after.
- 13.04.2018: The lecture will be recorded. You can find the lecture videos at VideoOnline.
- 16.02.2018: The registration for this course via UniWorx will be open from March 1st, 2018 onwards.
- Course: 3+2 hours weekly (equals 6 ECTS)
- Lecture: Prof. Dr. Volker Tresp
- Assistants: Christian Frey, Julian Busch
- Required: proficiency in at least one programming language
- Audience: The course is directed towards master's students in Informatics, Bioinformatics, and Media Informatics.
Time and Locations
All times are c.t. (cum tempore), i.e. sessions start 15 minutes after the listed hour.
| Event | Time | Room | First session |
|---|---|---|---|
| Lecture | Thu, 9:00 - 12:00 | S 002 (Schellingstr. 3) | 12.04.2018 |
| Tutorial 1 | Tue, 14:00 - 16:00 | B 015 (Geschwister-Scholl-Platz 1) | 17.04.2018 |
| Tutorial 2 | Tue, 16:00 - 18:00 | B 015 (Geschwister-Scholl-Platz 1) | 17.04.2018 |
| Tutorial 3 | Wed, 14:00 - 16:00 | B 015 (Geschwister-Scholl-Platz 1) | 18.04.2018 |
| Tutorial 4 | Wed, 16:00 - 18:00 | B 015 (Geschwister-Scholl-Platz 1) | 18.04.2018 |
Machine Learning is a data-driven approach to developing technical solutions. Initially motivated by the adaptive capabilities of biological systems, machine learning has an increasing impact in many fields, such as vision, speech recognition, machine translation, and bioinformatics, and is a technological basis for the emerging field of Big Data.
The lecture will cover:
- Supervised learning: the goal here is to learn functional dependencies for classification and regression. We cover linear systems, basis function approaches, kernel approaches, and neural networks, including recent developments in deep learning that have led to exciting applications in speech recognition and vision.
- Unsupervised learning: the goal here is to compactly describe important structures in the data. Typical representatives are clustering and principal component analysis.
- Graphical models (Bayesian networks, Markov networks), which permit a unified description of high-dimensional probabilistic dependencies
- Reinforcement Learning as the basis for the learning-based optimization of autonomous agents
- Some theoretical aspects: frequentist statistics, Bayesian statistics, statistical learning theory
The technical topics will be illustrated with a number of real-world applications.
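As a taste of two of the topics listed above, the following sketch fits a least-squares linear regression (supervised learning) and computes a principal component analysis (unsupervised learning) with NumPy. The toy data is illustrative only and not taken from the course material.

```python
import numpy as np

# --- Supervised learning: least-squares linear regression ---
# Fit y ~ X w on a noisy 1-D toy dataset (true slope 3.0, intercept 0.5).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = 3.0 * X[:, 0] + 0.5 + 0.1 * rng.standard_normal(50)

# Append a bias column and solve the least-squares problem.
X_aug = np.hstack([X, np.ones((50, 1))])
w, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
print("slope, intercept:", w)  # should be close to (3.0, 0.5)

# --- Unsupervised learning: principal component analysis ---
# Find the directions of largest variance in centered 2-D data.
Z = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])
Z_centered = Z - Z.mean(axis=0)
cov = np.cov(Z_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
order = np.argsort(eigvals)[::-1]            # sort descending
components = eigvecs[:, order]
scores = Z_centered @ components[:, :1]      # first principal component
print("explained variance ratio:", eigvals[order] / eigvals.sum())
```

In the exercises, such models are typically built with higher-level libraries (e.g. TensorFlow, introduced in the tutorial notebooks); the point here is only the underlying linear algebra.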
| Lecture date | Lecture | Tutorial date | Tutorial / Material |
|---|---|---|---|
| 12.04.18 | Lecture 1: Introduction | 17.04.18 | Python Introduction (.ipynb); Suggested Solutions (.ipynb) |
| 19.04.18 | Lecture 2: Linear Algebra (Review), Perceptron, Linear Regression | 24.04.18 | Exercise Sheet 1; Exercise 1-3 (.ipynb); Exercise 1-3 Solutions (.ipynb) |
| 26.04.18 | Lecture 3: Basis Functions, Neural Networks | 01.05.18 | no tutorials (May Day) |
| 03.05.18 | Lecture 4: Deep Learning, Manifolds | 08.05.18 | Exercise Sheet 2; Tensorflow Introduction (.ipynb); Exercise 2-5 (.ipynb); Exercise 2-5 Solutions (.ipynb) |
| 10.05.18 | no lecture (Ascension Day) | 15.05.18 | Exercise Sheet 3; Exercise 3-3 (.ipynb); Exercise 3-3 Solutions (.ipynb) |
| 17.05.18 | Lecture 5: Kernels | 22.05.18 | no tutorials (Whit Tuesday); Exercise Sheet 4; Exercise 4-1 (.ipynb) |
| 31.05.18 | no lecture (Corpus Christi) | 05.06.18 | |