“Deep Learning” systems, typified by deep neural networks, are increasingly taking over all AI tasks, ranging from language understanding, speech and image recognition, and machine translation to planning, and even game playing and autonomous driving. As a result, expertise in deep learning is fast changing from an esoteric, desirable skill to a mandatory prerequisite in many advanced academic settings, and a large advantage in the industrial job market.
In this course we will learn about the basics of deep neural networks, and their applications to various AI tasks. By the end of the course, it is expected that students will have significant familiarity with the subject, and be able to apply Deep Learning to a variety of tasks. They will also be positioned to understand much of the current literature on the topic and extend their knowledge through further study.
The course is well rounded in its coverage of concepts and builds a solid understanding of the fundamentals of Deep Learning. It starts gradually with MLPs and progresses to more complicated concepts such as attention and sequence-to-sequence models. You get thorough hands-on experience with PyTorch, which is essential for implementing Deep Learning models, and you will learn the tools required for building them. The homeworks usually have two components, Autolab and Kaggle. The Kaggle component lets you explore multiple architectures and learn how to fine-tune and continuously improve your models. The tasks for the homeworks were similar, and it was interesting to see how the same task can be solved using multiple Deep Learning approaches. Overall, by the end of this course you will be confident in building and tuning Deep Learning models.
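To give a flavor of that hands-on work, here is a minimal sketch of an MLP training step in PyTorch. The layer sizes, random stand-in data, and hyperparameters are illustrative placeholders, not values specified by the course.

```python
import torch
import torch.nn as nn

# A small multilayer perceptron (MLP): two linear layers with a ReLU in between.
class SimpleMLP(nn.Module):
    def __init__(self, in_dim=784, hidden_dim=256, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, x):
        return self.net(x)  # unnormalized class scores (logits)

model = SimpleMLP()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on random stand-in data.
x = torch.randn(32, 784)          # a batch of 32 "inputs"
y = torch.randint(0, 10, (32,))   # 32 integer class labels
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The same Module/loss/optimizer pattern carries over to the convolutional and recurrent architectures covered later in the course.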
Instructor: Bhiksha Raj
TAs:
Lecture: Monday and Wednesday, 9:00–10:20 am
Location: Gates-Hillman Complex GHC 4102
Recitation: Friday, 9:00–10:20 am
Office hours: Schedule
This course is worth 12 units.
Grading will be based on weekly quizzes, homework assignments, and a final project.
There will be five assignments in all; their release and due dates are listed in the course schedule below.
| Component | Details | Contribution to grade |
|---|---|---|
| Quizzes | 12 quizzes (bottom 2 quiz scores will be dropped) | 24% |
| Assignments | 5 assignments | 41% |
| Project | 1 project | 35% |
Late submissions are penalized by one letter grade per day. For example, if a homework is due on the 22nd and you submit it on the 23rd, the best grade you can earn on that homework is a B. The maximum attainable grade continues to drop with each additional day's delay.
The course will not follow a specific book, but will draw from a number of sources. We list relevant books at the end of this page. We will also put up links to relevant reading material for each class. Students are expected to familiarize themselves with the material before the class. The readings will sometimes be arcane and difficult to understand; if so, do not worry: we will present simpler explanations in class.
We will use Piazza for discussions. Here is the link. Please sign up.
You can also find a nice catalog of models that are current in the literature here. We expect that, by the end of the course, you will be in a position to interpret, if not fully understand, many of the architectures on the wiki and in the catalog.
Kaggle is a popular data science platform where visitors compete to produce the best model for learning or analyzing a data set.
For assignments, you will submit your evaluation results to a Kaggle leaderboard.
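For illustration, a leaderboard submission is typically a small CSV file of per-example predictions. The column names and filename below are hypothetical; each assignment's Kaggle page specifies the exact format it requires.

```python
import csv

# Hypothetical per-example predictions, e.g., one predicted class per test item.
predictions = [0, 3, 1]

# Write a submission file with hypothetical "id" and "label" columns.
with open("submission.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "label"])
    for i, pred in enumerate(predictions):
        writer.writerow([i, pred])
```

The resulting submission.csv is then uploaded on the assignment's Kaggle page, which scores it and places you on the leaderboard.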
| Lecture | Start date | Topics | Lecture notes/Slides | Additional readings, if any | Quizzes/Assignments |
|---|---|---|---|---|---|
| 1 | August 29 | | | | Quiz 1 |
| 2 | August 31 | | | | HW 1 released |
| 3 | September 5 | | | | Quiz 2 |
| 4 | September 10 | | slides | | |
| 5 | September 12 | | | | Quiz 3 |
| 6 | September 17 | | slides | | |
| 7 | September 19 | | slides | | Quiz 4 |
| 8 | September 24 | | slides | | HW 1 due; HW 2 released |
| 9 | September 26 | | slides | | Quiz 5 |
| 10 | October 1 | | | | |
| 11 | October 3 | | | | Quiz 6 |
| 12 | October 8 | | | | |
| 13 | October 10 | | | | Quiz 7 |
| 14 | October 15 | | | | HW 2 due; HW 3 released |
| 15 | October 17 | | | | Quiz 8 |
| 16 | October 22 | Variational Autoencoders (VAEs) | | | |
| 17 | October 24 | | | | Quiz 9 |
| 18 | October 29 | | | | HW 3 due; HW 4 released |
| 19 | October 31 | | | | Quiz 10 |
| 20 | November 5 | | | | |
| 21 | November 7 | | | | Quiz 11 |
| 22 | November 12 | | | | |
| 23 | November 14 | | | | Quiz 12 |
| 24 | November 19 | | | | |
| 25 | November 21 | | | | |
| 26 | November 26 | | | | HW 4 due |
| 27 | November 28 | | | | |
| 28 | December 3 | | | | |
| Recitation | Start date | Topics | Lecture notes/Slides |
|---|---|---|---|
| 1 | August 27 | Amazon Web Services (AWS) | |
| 2 | September 7 | Your first Deep Learning Code | |
| 3 | September 14 | Efficient Deep Learning/Optimization Methods | |
| 4 | September 21 | Convolutional Neural Networks | |
| 5 | September 28 | Debugging and Visualization | |
| 6 | October 5 | Basics of Recurrent Neural Networks | |
| 7 | October 12 | Recurrent networks 2: Loss functions, CTC | |
| 8 | October 19 | Attention | |
| 9 | October 26 | Research in Deep Learning | |
| 10 | November 2 | Variational autoencoders | |
| 11 | November 9 | GANs | |
| 12 | November 16 | Reinforcement Learning | |
| 13 | November 30 | Hopfield Nets, Boltzmann machines, RBMs |