Machine Learning 2007/2008

Instructor: Tim van Erven
Teaching assistant: Rogier van het Schip

This is the main website for the 2007/2008 machine learning course at the Vrije Universiteit. The course introduces the most important problems, ideas and methods from machine learning, which aims to design computer algorithms that improve automatically with experience.

In particular, the interrelated problems of classification, regression and prediction are discussed, and we present the main machine learning approaches that address them. In analysing machine learning methods, the ideas of overfitting and the necessity of bias are of central importance; they form a recurring theme throughout the course.
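To make this recurring theme concrete, here is a small sketch in Python (my own illustration, not part of the course materials; it assumes numpy is installed and the data is invented). Fitting a degree-9 polynomial by least squares to ten noisy points drives the training error to essentially zero, while the error on fresh test data is worse than for a simple straight line:

    # Sketch: overfitting with least-squares polynomial regression.
    # Illustration only: the data below is invented.
    import numpy as np

    rng = np.random.default_rng(0)

    # Ten noisy training points and fifty fresh test points,
    # both generated from the same underlying line y = 2x.
    x_train = np.linspace(0, 1, 10)
    y_train = 2 * x_train + rng.normal(scale=0.2, size=10)
    x_test = np.linspace(0, 1, 50)
    y_test = 2 * x_test + rng.normal(scale=0.2, size=50)

    for degree in (1, 9):
        coeffs = np.polyfit(x_train, y_train, degree)  # least-squares fit
        mse_train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        mse_test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
        print(f"degree {degree}: train MSE {mse_train:.3f}, test MSE {mse_test:.3f}")

The degree-9 polynomial has enough freedom to interpolate all ten training points, so its training error is essentially zero; only the test error reveals that it has fitted the noise.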

Organisation (short version)

Dates Time Location
Sept. 6 - Oct. 18 Thursday 9.00-10.45 room 04A05 (hgb, main building)
Oct. 31 - Dec. 12 Wednesday 13.30-15.15 room KC159

Course materials: Machine Learning by Tom M. Mitchell, McGraw-Hill, 1997, supplemented by lecture notes, papers and tutorials. Don’t forget to study the slides and extra materials!

There used to be a more elaborate web page about the organisation of the course, but I did not port it when moving to a new system to manage my website.

Course Schedule

Date Topics Assignment Study
Sept. 6 Basic concepts in ML: supervised vs unsupervised learning, prediction, regression, classification (e.g. concept learning) - Ch. 1, Ch. 2 up to sect. 2.2
Sept. 13 Math: Introduction to matrices and vectors; Basic concepts in ML: data representation 1 Ch. 1, Ch. 2 up to sect. 2.2
Sept. 20 Basic concepts in ML: hypothesis spaces, necessity of bias; Methods: least-squares linear regression, list-then-eliminate algorithm - Ch. 2
Sept. 27 Math: directed graphs, trees, probability distributions; Method: decision trees 2 Ch. 2, Ch. 3
Oct. 4 Information theory: entropy, mutual information; Math: probability distributions and random variables; Methods: decision trees continued (see the entropy sketch below the schedule) - Ch. 3
Oct. 11 Basic concepts in ML: train and test sets, overfitting; Methods: least squares regression with polynomials, pruning decision trees 3 Ch. 3
Oct. 18 Methods: neural networks (perceptron) - Ch. 4 up to sect. 4.4
Oct. 25 Intermediate exam: 11.00 - 13.00 in Q105 - Ch. 1, 2, 3 with slides
Oct. 31 Methods: neural networks (perceptron); Math: Gradient descent. Note the change to Wednesday and KC159! 4 Ch. 4 up to sect. 4.4
Nov. 7 Cancelled due to illness. - -
Nov. 14 More about probability theory: Bayes' rule, independent events, random vectors; Methods: Instance-based learning (k-nearest neighbour), naive Bayes 5 Ch. 8 up to sect. 8.2, Sect. 6.9
Nov. 21 Weka demonstration and practical homework assignment description; Method: naive Bayes; Probability theory: distributions on the reals; Basic concepts in ML: models 6 Sect. 6.9
Nov. 28 Methods: maximum likelihood parameter estimation, Bayesian learning - Sect. 6.7, Ch. 6 up to sect. 6.5.0
Dec. 5 Special guest lecture about Minimum Description Length (MDL) learning by Peter Grünwald, author of a comprehensive book about MDL. - (just the slides)
Dec. 12 Course summary, opportunity for questions; Student discussion about machine learning, in particular with respect to ethics (what about privacy?). Please read TDoML before the lecture! - TDoML
Dec. 20 Final exam: 18.30 - 21.15 in KC 07 -
Feb. 14 Resit of the final exam: 18.30 - 21.15 in G513 -

Ch. = chapter from Mitchell; sect. = section
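As a companion to the decision-tree lectures (Sept. 27 - Oct. 11), the following sketch computes the entropy and information gain that ID3-style tree learners use to choose a split attribute. This is my own Python illustration, not code from Mitchell; the four-example data set (in the spirit of the EnjoySport example) is invented.

    # Sketch: entropy and information gain for choosing a decision-tree split.
    # Illustration only: the tiny data set is invented.
    import math
    from collections import Counter

    def entropy(labels):
        """Entropy, in bits, of a sequence of class labels."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(examples, attribute, target):
        """Reduction in target entropy from splitting the examples on attribute."""
        n = len(examples)
        remainder = 0.0
        for value in set(ex[attribute] for ex in examples):
            subset = [ex[target] for ex in examples if ex[attribute] == value]
            remainder += (len(subset) / n) * entropy(subset)
        return entropy([ex[target] for ex in examples]) - remainder

    examples = [
        {"Sky": "Sunny", "Wind": "Strong", "EnjoySport": "Yes"},
        {"Sky": "Sunny", "Wind": "Weak", "EnjoySport": "Yes"},
        {"Sky": "Cloudy", "Wind": "Strong", "EnjoySport": "No"},
        {"Sky": "Cloudy", "Wind": "Weak", "EnjoySport": "No"},
    ]
    for attribute in ("Sky", "Wind"):
        print(attribute, information_gain(examples, attribute, "EnjoySport"))

Here Sky predicts the label perfectly (gain 1 bit) while Wind tells us nothing (gain 0 bits), so a tree learner would split on Sky first.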

Slides

The slides should be available by the morning of each lecture, except for the slides of the guest lecture.

Slides Lecture Comments
mlslides1-updated.pdf Sept. 6 To avoid confusing you, I've removed the slides at the end that we did not cover in class. The original slides are still available, although I don't see why you would need them.
mlslides2.pdf Sept. 13 We did not cover the last example on these slides (EnjoySport) in class. I recommend that you study it on your own.
mlslides3-corrected.pdf Sept. 20 Corrected `Warm' to `Cloudy' on slides 19 and 26. The original slides are still available.
mlslides4-corrected.pdf Sept. 27 Corrected `Warm' to `Cloudy' on slide 5. The original slides are still available.
mlslides5.pdf Oct. 4 Changed `log' to `log2' everywhere for clarity. The original slides are still available.
mlslides6.pdf Oct. 11
mlslides7.pdf Oct. 18 We only covered part of the slides in this lecture, because we spent our time discussing the answers to the second set of homework exercises. Therefore the things we did not cover are repeated in mlslides8.pdf. We did not cover "Linear functions as inner products", "Implementing Boolean functions with a perceptron", "Convex functions" or "Gradient descent (part 1)".
mlslides8.pdf Oct. 31 I suggest that you study the slides about optimizing perceptron weights, which we did not have time to cover during the lecture, on your own. They will help you understand the corresponding parts of Mitchell (see also the gradient-descent sketch below this table).
mlslides9.pdf Nov. 14
mlslides10.pdf Nov. 21 In the lecture we only got to slide 16. Studying the rest is optional. Some of it is covered in the next lecture.
mlslides11.pdf Nov. 28 We got to slide 28, and along the way I already told you what is on slide 31. Studying the other slides is optional.
mlslides12-part1.pdf mlslides12-part2.pdf Dec. 5 There were originally no slides, because Peter gave his lecture on the blackboard. You may have taken notes, but to help you study I have made slides after the fact, based on Peter's notes.
mlslides13.pdf Dec. 12 Added missing curly braces on slide 8.
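As a companion to the perceptron and gradient-descent material in mlslides7.pdf and mlslides8.pdf, here is a minimal sketch of batch gradient descent on the squared error of a linear unit, in the spirit of the training rule in Mitchell ch. 4. This is my own Python illustration; the data, learning rate and number of iterations are invented.

    # Sketch: batch gradient descent on the squared error of a linear unit.
    # Illustration only: data, learning rate and iteration count are invented.
    import numpy as np

    # Each input gets a constant 1 in front so that w[0] acts as the bias weight.
    X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
    y = np.array([1.0, 3.0, 5.0, 7.0])  # targets generated by y = 1 + 2x

    w = np.zeros(2)  # weight vector, initialised to zero
    eta = 0.05       # learning rate

    for _ in range(1000):
        error = y - X @ w       # residuals under the current weights
        w += eta * X.T @ error  # step along the negative gradient of the squared error
    print(w)                    # converges towards [1. 2.], the least-squares weights

This gradient-descent update on the unthresholded linear unit is the delta rule; the perceptron training rule instead uses the thresholded output in the update.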

Homework Assignments

You have to submit assignments using Blackboard: under Course Documents, click on the name of the assignment and you will get a form that allows you to upload your file.

Assignment Deadline Answers Comments
Ex. 1 [corrected version, alternative set] Sept. 18 corrected version, alternative set You may choose between (a) the original set of exercises (with corrections and additional hints) and (b) an alternative set of exercises that is easier, but more work. Please indicate your choice at the top of your answer submission.
Ex. 2 (corrected version) Oct. 4 Ex.2 answers In the first exercise, I've corrected `Warm' to `Cloudy' in the enumeration of the hypothesis space. I've sent you an e-mail about this with an explanation.
Ex. 3 Oct. 18 Ex.3 answers
Ex. 4 Nov. 8 Ex.4 answers
Ex. 5 Nov. 22 Ex.5 answers
Ex. 6 (practical) Dec. 13 N.B. This will be a lot of work. You are allowed to work together in groups of at most three students. You should also download the data sets: splice-test.arff, splice-train.arff, splice-validation.arff (see the sketch below this table for a quick way to inspect them). Note that the deadline has been extended by one week.
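The practical assignment itself is done in Weka, but if you want a quick look at the .arff data sets from Python, the following sketch may help (my own illustration; it assumes scipy is installed and that its arff reader handles the attributes in these files):

    # Sketch: inspecting one of the .arff data sets outside of Weka.
    from scipy.io import arff

    data, meta = arff.loadarff("splice-train.arff")
    print(meta)       # attribute names and types
    print(len(data))  # number of instances
    print(data[0])    # the first instance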

Exams

Intermediate Exam

The small group of students who took the intermediate exam at 15.15 got a different version: version B. The other students got the normal version.

Version Date Answers
normal version Oct. 25 answers normal version
version B Oct. 25 answers version B

Final Exam

A few exchange students took the final exam one week earlier. They got a different version.

Version Date Answers
normal version and its Dutch translation Dec. 20 answers normal version
resit-of-intermediate-exam version and its Dutch translation Dec. 20 answers resit-of-intermediate-exam version
early group Dec. 14 answers early group version

Resit Exam

Version Date Answers
resit exam and its Dutch translation Feb. 14 answers

Further Reading

If you want to read more on the topics from class, here are some pointers. None of this is mandatory.

Reference Description
Leo Dorst, Vectors for Beginners This is a slightly more extensive introduction to vectors than we had time for in class. Note that it uses different notation. In class we introduced vectors at the symbolic and computational level, as Leo calls them. To gain further intuition it is necessary to also give them a geometric interpretation. This (still very brief) introduction discusses all three interpretations (see also the small sketch below this table). If you have not seen vectors before, reading this will require serious effort.
Chapter 1 from the MDL book Chapters 1 and 17 of Peter Grünwald's book are available for free as sample chapters. If you want to learn more about MDL, I recommend that you read chapter 1, although it may require serious effort. Peter also has a tutorial about MDL available if you really want to get into it.
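To illustrate the difference between the computational and geometric levels that Leo Dorst distinguishes, here is a tiny sketch (my own addition; the vectors are invented) that computes the same dot product component-wise and via lengths and the enclosed angle:

    # Sketch: the computational vs. the geometric view of the dot product.
    import numpy as np

    a = np.array([3.0, 0.0])
    b = np.array([2.0, 2.0])

    # Computational level: multiply matching components and add them up.
    print(np.dot(a, b))  # 3*2 + 0*2 = 6

    # Geometric level: |a| * |b| * cos(angle between a and b).
    angle = np.pi / 4    # the angle between a and b is 45 degrees
    print(np.linalg.norm(a) * np.linalg.norm(b) * np.cos(angle))  # also 6 (up to rounding)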