Algorithms for Machine Learning & Inference

Course code: TDA231/DIT380


  • We provide two slots for checking the final exam grades: Thursday, June 28th, 11-12 and Thursday, June 28th, 14-15. Location: EDIT 6446 (Morteza Chehreghani).
  • June 1st: The exam solution is uploaded here.
  • May 29: Please download the exam here (to be done individually).
  • May 22: Lecture starts at 10:30.
  • May 25: Guest lecture by Hannes Ericsson (Zenuity) on reinforcement learning for autonomous driving.
  • May 22: Guest lecture by Hans Salomonsson (Machine Intelligence Sweden) on reinforcement learning for AlphaGo and AlphaGo Zero.
  • May 18: Guest lecture by Richard Sproat (Google Research) on “Neural Models for Speech” in Palmstedtsalen, 10:30-12.
  • May 11: All homework solutions so far (up to hw3) can be downloaded from this link.
  • Exam date: released May 29, 10 AM; due May 30, 4 PM.
  • April 9: Please note that all assignments/homeworks have to be done in the IPython Notebook environment (using Python), with the exception of homework 3, which has to be done in Matlab.
  • Doodle poll for the exam date; please mark dates that absolutely DO NOT work for you: Exam date.
  • March 26: Please note that technical questions about your solution (for example, your code or theoretical work) will not be answered by email or on Piazza; they will only be discussed with you during consultation sessions. By email you may only ask clarification questions about the assignments, or report mistakes or ambiguities in the assignment instructions.
  • HW0: The theoretical question “Setting Hyperparameters” is now a bonus question. Ignore its last line, “Confirm that this gives the same values claimed in the lecture”, since nothing was claimed in the lecture.
  • Please direct all questions about homework assignments to the TAs. It is recommended to post your questions on Piazza for faster answers and to reach all TAs simultaneously.
  • If you’re not intending to continue with the course, please drop out officially – there are many on the waiting list who would like to get a place in the course!
  • March 21: If you are still looking for a teammate, please join the class discussion groups on Piazza here (use the access code: suttl), then go to the “Search for teammates” discussions and either create a new post or reply to an existing one. Once you have found a teammate, you can mark your search as Done. If you have any issues joining the discussion board, email Aristide.
  • March 20: The assignment for the first week is now online. See below.
  • March 19: The link to FIRE is now updated and live. Please create an account on it as a student and team up in pairs to solve the assignments.
  • March 7. Web page for 2018 is live. Stay tuned for updates.

What It’s About

Today we have entered the era of “Big Data”: science, engineering and technology are producing increasingly large data streams, with petabyte and exabyte scales becoming increasingly common. We are flooded with data from the internet, from social networks like Facebook and Twitter, and from high-throughput experiments in Biology and Physics labs. Machine Learning is an area of Computer Science that deals with designing algorithms which allow computers to automatically make sense of this data tsunami by extracting interesting patterns and insights from raw data. The goal of this course is to introduce some of the fundamental concepts, techniques and algorithms in modern Machine Learning, with special emphasis on Statistical Pattern Recognition. The first few lectures will introduce fundamental concepts, in particular the Bayesian approach, and in the rest we will see them applied to paradigm topics including:

  • Supervised Learning: Bayes Classifier, Support Vector Machines, Regression.
  • Unsupervised Learning: Clustering Algorithms, EM algorithm, Mixture models, Kernel methods.
  • Deep Learning: Artificial neural networks, Back-propagation, Convolutional NNs, Recurrent NNs, Deep reinforcement learning
  • Graphical Models: Hidden Markov models, Belief propagation, variational methods, MCMC.



Lecturers

  • Devdatt Dubhashi
  • Morteza Chehreghani


Teaching assistants

  • Mikael Kågebäck (kageback (at), lectures, consultation)
  • Divya Grover (grover (at), consultation, grading)
  • Vasileios Athanasiou (vasath (at), grading)
  • Aristide Tossou (aristide (at), grading)

Course literature

The course book is S. Rogers and M. Girolami, A First Course in Machine Learning, 2nd edition, Chapman & Hall/CRC 2016, ISBN: 9781498738484.

Student Representatives

  • Sandra Viknander <>
  • Shruthi Dinakaran <>


Schedule

  • Lectures: Tuesdays and Fridays 10-12, mostly in HA4.
    See the schedule in Timeedit for details.
  • Consultation: Tuesdays and Fridays 13:15-14:00 when scheduled (see “consultation time” in Timeedit).


Grading

For the final grade, the points are normalized and the regular passing grades apply. The maximum total score over all homework assignments is 120; the maximum score on the exam is 60.

Weighting of the scores:
total_score = total_homework_scores/4 + total_on_exam/2

Grade levels:
28 (3, G), 36 (4), 48 (5, VG)
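As a sanity check on the weighting above, here is a short Python sketch. The thresholds are the grade levels listed; the example scores are made up for illustration:

```python
def total_score(homework_points, exam_points):
    """Combine homework (max 120) and exam (max 60) into the final score.

    Weighting as stated above: homework divided by 4, exam by 2,
    so each part contributes at most 30 points (60 in total).
    """
    return homework_points / 4 + exam_points / 2


def grade(score):
    """Map a total score to the grade levels 28 (3, G), 36 (4), 48 (5, VG)."""
    if score >= 48:
        return "5 (VG)"
    if score >= 36:
        return "4"
    if score >= 28:
        return "3 (G)"
    return "fail"


# Made-up example: 100/120 on the homework, 50/60 on the exam.
print(total_score(100, 50))         # 100/4 + 50/2 = 50.0
print(grade(total_score(100, 50)))  # 5 (VG)
```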

Take-home exam:

  • Exam release date: May 29 10 AM
  • Exam due date: May 30, 4 PM. Hand in the exam to Morteza Chehreghani (office no. 6446) at 16:00 on May 30th. Direct any questions or requests for clarification about the exam by email to Morteza Chehreghani.

Practice exams:


Prerequisites

Elementary probability, linear algebra and multivariate calculus. You should be able to program in Python and MATLAB. Previous algorithms courses are valuable, though not strictly necessary. Here are some refreshers:

See also [Bar Ch. 29] for a refresher on linear algebra and multivariate calculus, and [Bar Ch.1] for a probability refresher.

Python resources: Python tutorial using IPython.

Matlab resources: Matlab tutorial.


Lectures

Lecture slides will appear here as the course progresses, together with reading recommendations.

Note that the lecture slides are subject to change until the day of the lecture.

Day | Main lecture topics | Slides | Recommended reading | Room for consultation
Mar. 20 | Machine Learning: What, Why and How? Linear Modeling Introduction | Lecture 1 | [RG 1.1] [Bar 17.1] | N/A
Mar. 23 | Non-linear models and model selection | Lecture 2 | [RG 1.2-1.5] [Bar 17.2] | N/A
Mar. 26 | Linear Regression: Modelling the noise | Lecture 3a, 3b, 3c | [RG 1.2-1.5, 2.10.3, 3.8] [Bar 8.8, 10.1-10.3] | N/A
April 10 | Conjugate priors cont'd, Classification I | Lecture 4a, 4b | [RG 3.8, 5.1, 5.2.1] [Bar 8.8, 10.1-10.2] | EL43
April 13 | Classification I cont'd | Lecture 4a, 4b | [RG 5.1, 5.2.1] [Bar 10.1-10.2] | EL43
April 17 | Classification II: Logistic Regression | Lecture 5 | [RG 5.2.2] [Bar 17.4.1] | EL43
April 20 | Softmax Regression and Feed-Forward Neural Networks | Lecture 6a, 6b | [GBC Ch. 6, 8] | EL43
April 24 | CNNs and RNNs | Lecture 7a, 7b | [GBC Ch. 9, 10] | EL43
April 27 | SVM I: Large Margin | Lecture 8 | [RG 5.3.2] [Bar 17.5] | EL43
May 4 | SVM II: Kernel Methods | Lecture 8 | [RG 5.3.2] [Bar 17.5] | ML13
May 8 | Clustering and K-means | Lecture 9 | [RG 6.1-6.2] [Bar 20.3.5] |
May 15 | Mixture Models and EM | Lecture 10 | [RG 6.3] [Bar 20.1-20.3] |
May 18 | Guest Lecture: Richard Sproat (Google Research), Palmstedtsalen 10:30-12 | | |
May 21 | AlphaGo: Reinforcement Learning | Alpha Go Lecture | |
May 25 | Reinforcement Learning for Autonomous Driving at Zenuity | | |

Homework assignments

Link to FIRE: .

  • All assignments will be posted here.
  • We will use either Python3 or Matlab.
  • Note that all homeworks are Jupyter/IPython notebooks (not PDFs). The notebook environment will be used for all homework assignments except the neural network assignment, which is based on Matlab. It is installed in the halls ES61-ES62, E-studio and MT9. You can also use Google Colab to open and run these notebooks.
  • All assignments are to be solved in pairs. If you don’t have a partner, join the class discussion board here (access code: suttl) and create a new post in “Search for teammate” or reply to an existing one. If you have any issues with this, please contact Aristide.
  • Each homework consists of both theoretical and practical problems.
  • You will need to upload two things to FIRE,
    • One .pdf file containing the solutions to the theoretical problems. Alternatively, you can write these solutions in the IPython notebook itself, using LaTeX math mode for equations etc.; in that case you don’t need to upload the .pdf file.
    • The updated IPython notebook with discussions/results of practical questions (including results and/or plots and outputs of your code).
  • Assignment PDFs may be subject to minor changes (such as spelling corrections) until one week before the deadline.
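For those choosing the in-notebook option, a markdown cell with LaTeX math could look like the following. The problem statement and equation are made up purely for illustration:

```latex
% A notebook markdown cell; Jupyter renders $...$ and $$...$$ via MathJax.
Problem 1: The maximum-likelihood estimate of the mean of a Gaussian is

$$\hat{\mu} = \frac{1}{N}\sum_{n=1}^{N} x_n$$
```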
Homework | Due date | Datasets | Code skeletons | Grader | Solution sketch
hw0 | Mar. 26 | dataset0.txt | | Aristide and Mikael |
hw1 | April 16 | dataset0.txt | | Vasileios |
hw2 | April 23 | dataset2.txt | | Divya |
hw3 | May 07 | data.mat | net.m | Mikael |
hw4 | May 14 | d1.txt, d2.txt | | Vasileios | Solution Sketch
hw5 | May 21 | hw5_p1a.mat, hw5_p1b.mat | | Aristide | Solution Sketch

[1] The homework is downloadable from any modern HTML5-supporting browser.

Machine learning software

In this course, the practical homework assignments are designed to be solved in the IPython notebook environment.

However, when applying the topics you’ve learned in this course in practice, you don’t need to write all parts of the implementation yourself. There are many mature libraries for applying machine learning in general. Two libraries that strive to be comprehensive, and therefore have many implemented algorithms, are scikit-learn (Python) and Weka (Java).
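As a small illustration of what such a library offers, here is a sketch using scikit-learn to fit an SVM classifier on a toy dataset. The dataset and hyperparameter choices are ours for illustration, not part of the course material:

```python
# Sketch: training an SVM classifier with scikit-learn on a toy dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Load the classic iris dataset bundled with scikit-learn.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# An RBF-kernel SVM; the hyperparameters here are illustrative defaults.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

Note that the few lines above hide everything the course covers in depth (the large-margin objective, the kernel trick, the optimization), which is exactly the point made below about understanding what happens behind the scenes.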

For deep learning there are, e.g., TensorFlow (Python), Theano (Python), Torch (Lua) and DL4J (Java).

If you are working in a context that uses the Apache big data tools, e.g. the Hadoop ecosystem, there are machine learning libraries built on top of these general processing frameworks, most notably Mahout, Spark, and FlinkML.

It is important to stress that while using these libraries is fairly easy, without a proper theoretical understanding of what is happening behind the scenes you will not have the same success applying and extending these tools to your specific problem. This is why we will not use these libraries directly in this course; instead, we provide pointers so you can try them out if you are interested.