Lecture: Statistical Machine Learning

Quick Facts

Lecturer: Jan Peters
Teaching Assistants: Dorothea Koert, Svenja Stark, Hany Abdulsamad
Lectures: Wednesday, 13:30-17:00 in Room S103/226
 Friday, 03.05.2019, 09:50-11:30 in Room S202/C205 instead of Wednesday, 24.04.2019
 Friday, 21.06.2019, 13:30-15:10 in Room S202/C205 instead of Wednesday, 03.07.2019
 Monday, 22.07.2019, 16:00-18:00 in Room S202/C110 Discussion HW4
Office Hours: Monday, 14:00-15:00 in Room S202/E202 starting 13.05.2019
 Tuesday, 11.06.2019, 14:00-15:00 in Room S202/E203 instead of Monday, 10.06.2019 (Whit Monday)
 Or by request via email
TU-CAN: 20-00-0358-iv Statistisches Maschinelles Lernen
Credits: 6.0 ECTS (4 SWS)
Exam: Friday, 26.07.2019, 13:00-15:00
 Room according to the first letter of your last name:
 A - G: S1 05/122
 H - M: S1 01/A01
 N - Z: S2 06/030
Exam Review: Friday, 04.10.2019, 13:00-15:30 in Room S202/A126
Exam Repetition: Thursday, 12.03.2020, 14:30-16:30 in Room S1|03 07


As the World Wide Web keeps growing, computer science keeps evolving from its traditional form, gradually becoming the art of creating intelligent software and hardware systems that draw relevant information from the enormous amount of available data.

Why? Let's look at the facts: billions of web pages are at our disposal, 20 hours of video are uploaded to YouTube every minute, and the supermarket chain Walmart alone performs more than one million transactions per hour, feeding a database of more than 2.5 petabytes of information. John Naisbitt stated the problem very clearly:

"We are drowning in information and starving for knowledge."

In the future of computer science, machine learning will therefore be an important core technology. Not only that, machine learning is already the technology that promises the best computer science jobs. Hal Varian, Google's Chief Economist, depicted it like this in 2009:

"I keep saying the sexy job in the next ten years will be statisticians and machine learners. People think I am joking, but who would have guessed that computer engineers would have been the sexy job of the 1990s? The ability to take data, to be able to understand it, to process it, to extract value from it, to visualize it, to communicate it, that is going to be a hugely important skill in the next decades."

Accordingly, this lecture serves as an introduction to machine learning. Special emphasis is placed on a clear presentation of the lecture's contents, supplemented by small sample problems for each of the topics. The teacher pays particular attention to his interactions with the participants of the lecture, asking many questions and appreciating enthusiastic students.


The course gives an introduction to statistical machine learning methods. The following topics are expected to be covered throughout the semester:

  • Probability Distributions
  • Linear Models for Regression and Classification
  • Kernel Methods, Graphical Models
  • Mixture Models and EM
  • Approximate Inference
  • Continuous Latent Variables
  • Hidden Markov Models


Prerequisites: math classes from the bachelor's degree, basic programming skills, and introductory computer science classes.

Moodle Class

All further information and announcements regarding the lecture will be made public over the Moodle system of the computer science department: https://moodle.informatik.tu-darmstadt.de/course/view.php?id=595

Academic Honesty Policy

We grade homework so that you get early feedback on your performance. If you copy, you are being dishonest, you waste our time and effort, you will do worse on the exam, and you violate proper academic conduct. We will report you to the dean of studies office (Studiendekanat) and none of your bonus points will count.

Other Information

  • All course announcements go through Moodle -- make sure to subscribe to the forums so that you do not miss homework releases and important announcements
  • The programming guidelines for Python may help you get started with Python


The most important books for this class are:

  1. C.M. Bishop. Pattern Recognition and Machine Learning, Springer (free online copy)
  2. K.P. Murphy. Machine Learning: A Probabilistic Perspective, MIT Press

Additionally, the following books might be useful for specific topics:

  1. D. Barber. Bayesian Reasoning and Machine Learning, Cambridge University Press (free online copy)
  2. T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning, Springer (free online copy)
  3. R.O. Duda, P.E. Hart, and D.G. Stork. Pattern Classification, Wiley-Interscience
  4. D. MacKay. Information Theory, Inference, and Learning Algorithms, Cambridge University Press (free online copy)
  5. T.M. Mitchell. Machine Learning, McGraw-Hill
  6. R. Sutton and A. Barto. Reinforcement Learning: An Introduction, MIT Press (free online copy)
  7. M. Jordan. An Introduction to Probabilistic Graphical Models (free online copy)

Additional Material

Here are some tutorials on relevant topics for the lecture.

  1. An overview of gradient descent optimization algorithms (Sebastian Ruder)
  2. Covariant/natural gradient notes (Marc Toussaint)
  3. Bayesian linear regression, video (Jeff Miller)
  4. Gaussian processes, video (John Cunningham)
  5. Gaussian process regression, video (Jeff Miller)
  6. SVMs, pdf (Andrew Ng)
  7. SVMs, video (Patrick Winston)

Some additional resources:


Homework

  • The homeworks will be handed out during the course.
  • Homeworks are not compulsory, but we strongly suggest that you complete them, as they provide hands-on experience with the topics.
  • Homework should be done in groups of two. These groups should remain unchanged for the rest of the semester.
  • Homeworks can be handed in at any time before the deadline by dropping them in the mailbox outside Room E314@S2|02.
  • Homeworks will only be graded if they are submitted in time, that is
- Handed in before the delivery date in moodle, AND
- Left in the mailbox before the delivery date.
  • Only one copy per group has to be submitted.
  • Email submission will be accepted only in exceptional cases, if a good reason exists.
  • We provide a LaTeX template for submitting your solutions (you need to use the TUD LaTeX template). We highly recommend using it!
  • If the exercise includes programming, then code is expected to be handed in (except if noted differently).
  • We use Python for programming assignments. You are allowed to use numpy and any plotting library you like (we recommend matplotlib). You are not allowed to use scikit-learn and scipy, unless explicitly stated otherwise.
  • We encourage the creation of nice plots (different symbols for different lines) as showing the results clearly is as important as obtaining them.
  • Include only the important snippets of your code, not entire files. You can include snippets in the LaTeX template using the 'listings' package. Alternatively, if you use an IPython/Jupyter notebook, you can directly export your notebook (code + plots) as HTML and print it.
  • Your code must be briefly documented.
  • The assignment may be updated after its release if any issue is detected.
  • After the deadline there will be a presentation of the solutions.
  • Partial solutions (including only plots and numerical results, no code or explanation) will be published after the above presentation.
  • After the homeworks are corrected, they will be given back to you.
  • Successfully completing the homeworks will provide up to one additional bonus point for the final grade (only if you pass the exam), according to the following rule:

{$ \min\left\lbrace \frac{\text{your homework points (including bonus)}}{\text{homework points (excluding bonus)}} \, , \; 1 \right\rbrace $}
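As a small sketch in Python (the course's homework language), the rule above simply caps the ratio of earned points to regular points at one; the function and variable names here are illustrative, not part of the course material:

```python
def homework_bonus(your_points: float, regular_points: float) -> float:
    """Bonus added to the final grade (only awarded if the exam is passed).

    your_points:    homework points you earned, including any bonus points.
    regular_points: total homework points available, excluding bonus points.
    Bonus points can push the ratio above 1, but the result is capped at 1,
    i.e. at most one full bonus point.
    """
    return min(your_points / regular_points, 1.0)

print(homework_bonus(95, 100))   # 0.95
print(homework_bonus(110, 100))  # 1.0 (capped)
```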

Final Exam

  • The final exam date will be announced during the semester.
  • The exam will cover all material presented in the lectures, unless specified otherwise.
  • The exam will consist of roughly 30 questions and will take 90 minutes.
  • Students are allowed to bring a cheat sheet consisting of a single A4 sheet. It must be handwritten (not printed), and you can write on both sides.
  • Students are allowed to use a non-electronic dictionary and a non-programmable calculator.
  • You have to identify yourself with both a personal ID and the TU-ID.
  • This pdf shows you what the exam will look like.

Teaching Staff

Lectures will be held by Jan Peters and additionally supervised by Hany Abdulsamad, Dorothea Koert and Svenja Stark.

Jan Peters heads the Intelligent Autonomous Systems Lab at the Department of Computer Science at the TU Darmstadt. Jan has studied computer science, electrical, control, mechanical and aerospace engineering. You can find Jan Peters in the Robert-Piloty building S2 | 02 room E314. You can also contact him through mail@jan-peters.net.

Dorothea Koert joined the IAS Lab as a PhD student in May 2016. She is working on autonomous skill learning within the SKILLS4ROBOTS project. You can contact her via email at doro@robot-learning.de.

Svenja Stark joined the IAS Lab as a PhD student in December 2016. She is working on adaptive skill libraries and skill comparison. You can contact her via email at svenja@robot-learning.de.

Hany Abdulsamad joined the Intelligent Autonomous Systems Lab in April 2016 as a PhD student. His research interests include optimal control, trajectory optimization, reinforcement learning, and robotics. During his PhD, Hany is working on the SKILLS4ROBOTS project, with the aim of enabling humanoid robots to acquire and improve a rich set of motor skills. You can contact him by email at hany@robot-learning.de.

For further inquiries, do not hesitate to contact us!