Assignment | Deadline | Description | Links |
---|---|---|---|
Homework 0 | January 19 | A Python and PyTorch Primer | Handout (*.tar.gz) |
“Deep Learning” systems, typified by deep neural networks, are increasingly taking over all AI tasks, ranging from language understanding, speech and image recognition, and machine translation to planning, game playing, and even autonomous driving. As a result, expertise in deep learning is fast changing from an esoteric specialty to a mandatory prerequisite in many advanced academic settings, and a major advantage in the industrial job market.
In this course we will learn about the basics of deep neural networks, and their applications to various AI tasks. By the end of the course, it is expected that students will have significant familiarity with the subject, and be able to apply Deep Learning to a variety of tasks. They will also be positioned to understand much of the current literature on the topic and extend their knowledge through further study.
If you are only interested in the lectures, you can watch them on the YouTube channel listed below.
The course is well rounded in terms of concepts and helps you understand the fundamentals of Deep Learning. It starts off gradually with MLPs and progresses to more complicated concepts such as attention and sequence-to-sequence models. You get complete hands-on experience with PyTorch, which is essential for implementing Deep Learning models, and you will learn the tools required for building them. The homeworks usually have two components: Autolab and Kaggle. The Kaggle components let you explore multiple architectures and understand how to fine-tune and continuously improve models. The tasks across the homeworks were similar, and it was interesting to learn how the same task can be solved using multiple Deep Learning approaches. Overall, by the end of this course you will be confident enough to build and tune Deep Learning models.
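To give a concrete flavor of that hands-on PyTorch work, here is a minimal sketch of the kind of MLP built early in the course; the layer sizes, random data, and single training step below are illustrative placeholders, not taken from any actual assignment:

```python
import torch
import torch.nn as nn

# A small multilayer perceptron of the kind built in the early homeworks.
# Input/output sizes are illustrative placeholders.
model = nn.Sequential(
    nn.Linear(40, 256),   # e.g., 40-dimensional input features
    nn.ReLU(),
    nn.Linear(256, 256),
    nn.ReLU(),
    nn.Linear(256, 10),   # e.g., 10 output classes
)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

# One training step on a random batch (a stand-in for real data).
x = torch.randn(32, 40)
y = torch.randint(0, 10, (32,))
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```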
Instructor:
TAs:
Lecture: Monday and Wednesday, 9:00 a.m. - 10:20 a.m. @ BH A51
Recitation: Friday, 9:00 a.m. - 10:20 a.m. @ BH A51
Office hours:

Day | Time | Location | TA |
---|---|---|---|
Monday | 10:30 - 11:30 am | GHC 5417 | Bhuvan Agrawal |
| 1:00 - 2:00 pm | GHC 6404 | Advait Gadhikar |
| 3:00 - 5:00 pm | GHC 6708 | Zhefan Xu |
| 3:30 - 4:30 pm | GHC 6708 | Yang Xia |
Tuesday | 12:00 - 1:00 pm | GHC 6708 | Amala Sanjay Deshmukh, Soumya Vadlamannati |
| 1:00 - 2:00 pm | GHC 5417 | Hao Liang |
| 3:00 - 4:00 pm | GHC 5417 | Jianfeng Xia, Yuying Zhu |
Wednesday | 10:30 - 11:30 am | GHC 5417 | Bhuvan Agrawal |
| 2:00 - 3:00 pm | LTI Commons | Jianfeng Xia |
| 3:00 - 4:00 pm | LTI Commons | David Park |
| 6:00 - 9:00 pm | GHC 5417 | Christopher George |
Thursday | 12:00 - 1:00 pm | GHC 6404 | Amala Sanjay Deshmukh, Soumya Vadlamannati, Rohit Prakash Barnwal |
| 1:00 - 2:00 pm | GHC 6404 | Hao Liang |
Friday | 10:30 - 11:30 am | LTI Commons | Yang Xia |
| 11:00 am - 12:00 pm | LTI Commons | Rohit Prakash Barnwal |
| 1:30 - 2:30 pm | GHC 5417 | Yuying Zhu |
| 3:00 - 4:00 pm | GHC 6708 | David Park |
Saturday | 2:00 - 4:00 pm | GHC 5417 | Advait Gadhikar, Vedant Sanil, Yash Belhe |
Sunday | 3:00 - 5:00 pm | GHC 5417 | Vedant Sanil |
Lecture: Monday and Wednesday, 3:00 p.m. - 4:20 p.m. @ CMR C421

11-785 is a graduate course worth 12 units. 11-485 is an undergraduate course worth 9 units.
Grading will be based on weekly quizzes (24%), homeworks (51%) and a course project (25%).
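For concreteness, the final score is a simple weighted sum of the three components; the component scores below are hypothetical, purely to illustrate the weighting:

```python
# Hypothetical component scores (out of 100), for illustration only.
quiz_avg, hw_avg, project = 90.0, 85.0, 88.0

# Weights from the grading policy: quizzes 24%, homeworks 51%, project 25%.
overall = 0.24 * quiz_avg + 0.51 * hw_avg + 0.25 * project
print(f"Overall: {overall:.2f} / 100")  # 21.6 + 43.35 + 22.0 = 86.95
```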
Policy | Description |
---|---|
Quizzes | There will be weekly quizzes. |
Assignments | There will be five assignments in all. Assignments will include Autolab components, where you must complete designated tasks, and a Kaggle component, where you compete with your colleagues. |
Project | All students are required to do a course project. The project is worth 25% of your grade. |
Final grade | The end-of-term grade is curved. Your overall grade will depend on your performance relative to your classmates. |
Pass/Fail | Students registered for pass/fail must complete all quizzes, homeworks, and the project. A grade equivalent to B- is required to pass the course. |
Auditing | Auditors are not required to complete the course project, but must complete all quizzes and homeworks. We encourage doing a course project regardless. |
Piazza is what we use for discussions. You should be automatically signed up if you're enrolled at the start of the semester. If not, please sign up. Also, please follow the Piazza Etiquette when you use Piazza.
AutoLab is what we use to test your understanding of low-level concepts, such as engineering your own libraries, implementing important algorithms, and developing optimization methods from scratch.
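As a taste of that from-scratch style, here is a minimal sketch of a linear layer with hand-written forward and backward passes and a plain gradient-descent update; the class name and interface are illustrative, not the actual assignment API:

```python
import numpy as np

class Linear:
    """A fully connected layer implemented from scratch, NumPy only.
    The interface is illustrative; the real assignments define their own."""

    def __init__(self, in_features, out_features):
        # Small random weights and zero biases.
        self.W = 0.01 * np.random.randn(in_features, out_features)
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Gradients w.r.t. parameters and input, via the chain rule.
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T      # gradient passed to the previous layer

    def step(self, lr=0.01):
        # Vanilla gradient-descent update, the simplest optimizer.
        self.W -= lr * self.dW
        self.b -= lr * self.db
```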
Kaggle is where we test your understanding of, and ability to extend, the neural network architectures discussed in lecture. Like AutoLab, Kaggle shows scores, so don't feel intimidated -- we're here to help. We work on hot AI topics, like speech recognition, face recognition, and neural machine translation.
YouTube is where all lecture and recitation recordings will be uploaded. Links to individual lectures and recitations will also be posted below as they are uploaded. Videos marked “Old” are not current, so please check the video title.
CMU students can also access the videos Live from Media Services or Recorded from Media Services.
The course will not follow a specific book, but will draw from a number of sources. We list relevant books at the end of this page. We will also put up links to relevant reading material for each class. Students are expected to familiarize themselves with the material before the class. The readings will sometimes be arcane and difficult to understand; if so, do not worry, we will present simpler explanations in class.
You can also find a nice catalog of models that are current in the literature here. We expect that you will be in a position to interpret, if not fully understand, many of the architectures on the wiki and in the catalog by the end of the course.
Lecture | Date | Topics | Lecture Slides and Video | Additional Readings (if any) | Homework & Assignments |
---|---|---|---|---|---|
0 | — | | Slides (*.pdf) Video (url) | | |
1 | January 15 | | Slides (*.pdf) Video (url) | | |
2 | January 17 | | Slides (*.pdf) Video (url) | Hornik et al. (1989), Shannon (1949) | |
— | January 20 | | | | |
3 | January 22 | | Slides (*.pdf) Video (url) | Widrow and Lehr (1992), Convergence of the perceptron algorithm | |
4 | January 27 | | | | |
5 | January 29 | | | | |
6 | February 3 | | | | |
7 | February 5 | | | | |
8 | February 10 | | | | |
9 | February 12 | | | | |
10 | February 17 | | | | |
11 | February 19 | | | | |
12 | February 24 | | | | |
13 | February 26 | | | | |
14 | March 2 | | | | |
15 | March 4 | | | | |
— | March 9 | | | | |
— | March 11 | | | | |
16 | March 16 | | | | |
17 | March 18 | | | | |
18 | March 23 | | | | |
19 | March 25 | | | | |
20 | March 30 | | | | |
21 | April 1 | | | | |
22 | April 6 | | | | |
23 | April 8 | | | | |
24 | April 13 | | | | |
25 | April 15 | | | | |
26 | April 20 | | | | |
27 | April 22 | | | | |
28 | April 27 | | | | |
29 | April 29 | | | | |
Recitation | Date | Topics | Notebook | Videos | Instructor |
---|---|---|---|---|---|
0 - Part A | January 5 | Fundamentals of Python | Notebook (*.tar.gz) | YouTube (url) | Joseph Konan |
0 - Part B | January 5 | Fundamentals of NumPy | Notebook (*.tar.gz) | YouTube (url) | Joseph Konan |
0 - Part C | January 5 | Fundamentals of Jupyter Notebook | Notebook (*.tar.gz) | YouTube (url) | Joseph Konan |
0 - Part D | January 5 | AWS. Includes a tutorial, with Google Doc polling to check student status | Doc (url) | YouTube (url) | Christopher George |
1 | January 13 | Your First Deep Learning Code | Notebook (*.zip) | YouTube (url) | Bhuvan, Soumya |
2 | January 24 | How to compute a derivative | | | Amala, Yang |
3 | January 31 | Optimizing the network | | | Advait, Yuying |
4 | February 7 | TensorBoard, t-SNE, visualizing network parameters and outputs at every layer | | | Soumya, Yash |
5 | February 14 | CNN: Basics | | | Hao, Zhefan |
6 | February 21 | CNN: Losses, transfer learning | | | Rohit, Bhuvan |
7 | February 28 | RNN: Basics | | | Advait, Chris |
8 | March 6 | CTC | | | Chris, Soumya |
9 | March 20 | Attention | | | Yang, Yuying |
10 | March 27 | VAEs | | | Yash, Hao |
11 | April 3 | Listen, Attend and Spell | | | Rohit, Amala |
12 | April 10 | Generative Adversarial Networks (GANs) | | | Hao, Yash |
13 | April 17 | Reinforcement Learning | | | Zhefan, Bhuvan |
14 | April 24 | Hopfield nets / Boltzmann machines | | | Rohit, Yang |
Number | Part | Topics | Release Date | Early-submission Deadline | On-time Deadline | Links |
---|---|---|---|---|---|---|
HW0 | — | A Python and PyTorch Primer | January 5 | | January 19 | Handout (*.tar.gz) |
HW1 | P1 | MLP in PyTorch | January 19 | | February 8 | Handout (*.tar) Writeup (*.pdf) |
| P1-bonus | Dropout, Adam in PyTorch | January 19 | | | |
| P2 | MLP, phoneme recognition | January 19 | January 29 | February 8 | Writeup (*.pdf) |
HW2 | P1 | CNN as scanning MLP, backprop | February 9 | | | |
| P1-bonus | CNN: conv1d/pooling/forward/backward | February 9 | | | |
| P2 | Face Recognition: Classification and Verification | February 9 | | | |
HW3 | P1 | RNN: forward/backward/CTC beam search | March 8 | | | |
| P1-bonus | Full BPTT, Full BPTT with forward-backward | March 8 | | | |
| P2 | Connectionist Temporal Classification | March 8 | | | |
HW4 | P1 | Word-Level Neural Language Models | April 5 | | | |
| P1-bonus | TBD | April 5 | | | |
| P2 | Attention Mechanisms and Memory Networks | April 5 | | | |
Assignment | Deadline | Description | Links |
---|---|---|---|
Team Formation | TBD | Teams will be formed in groups of four. If you do not have a team by this point, you will be grouped randomly. | |
Project Proposal | TBD | Project Description Guidelines | |
Midterm Report | TBD | A report template is provided to detail your initial experiments. | |
Poster Presentation | 8:00 pm, May 5th and 6th | A final poster session for the different groups across all three campuses. | |
Final Project Report | TBD | This should be the final document for the course project. | |