Mathematics of Deep Learning

Summer Semester 2019
Dr. Antoine Venant
Jonas Groschwitz

Seminar, BSc and MSc

Place and Time:

Info on formalities here.


In this seminar we will examine the mathematical principles behind deep learning, and work to understand why and how neural networks behave the way they do. All of this from the perspective of applications in computational linguistics. We will look at gradient descent methods, feed-forward and recurrent neural networks, attention, and more.
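To give a first taste of the kind of math the seminar examines, here is a minimal sketch of gradient descent on a toy one-dimensional function (illustrative only; the function and values are not taken from the seminar materials). The parameter is repeatedly nudged in the direction of the negative derivative until it settles at the minimum.

```python
# Gradient descent on f(w) = (w - 3)^2, whose minimum lies at w = 3.
# Toy example for illustration, not part of the seminar materials.

def grad(w):
    # Derivative of (w - 3)^2 with respect to w.
    return 2 * (w - 3)

w = 0.0           # starting point
lr = 0.1          # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)

print(round(w, 4))  # converges toward 3.0
```

Each update shrinks the distance to the minimum by a constant factor here; understanding when and how fast such iterations converge is exactly the sort of question the seminar digs into.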

Grades will depend on your talk, your seminar paper (if you decide to write one), and your participation in the discussions in class.

The seminar is open to BSc and MSc students. The seminar presupposes knowledge of the basic ideas of derivatives, linear algebra, neural networks and machine learning (some of which can be picked up along the way; if you’re not sure, drop by and find out!). Essential for enjoyment of the seminar is a willingness to look at the math, and a desire to get to the bottom of things. The seminar is designed to be complementary to the class Neural Networks: Implementation and Application.

Here is the literature list.


You’ll find more information on term papers here.