
Deep Learning Part 1 (IITM)

By Prof. Sudarshan Iyengar & Prof. Mitesh M. Khapra   |   IIT Madras
Learners enrolled: 6001
Deep Learning has received a lot of attention over the past few years and has been employed successfully by companies like Google, Microsoft, IBM, Facebook and Twitter to solve a wide range of problems in Computer Vision and Natural Language Processing. In this course we will learn about the building blocks used in these Deep Learning-based solutions. Specifically, we will learn about feedforward neural networks, convolutional neural networks, recurrent neural networks and attention mechanisms. We will also look at various optimization algorithms such as Gradient Descent, Nesterov Accelerated Gradient Descent, Adam, AdaGrad and RMSProp, which are used for training such deep neural networks. At the end of this course, students will have knowledge of the deep architectures used for solving various Vision and NLP tasks.

INTENDED AUDIENCE: Any Interested Learners
PREREQUISITES: Working knowledge of Linear Algebra and Probability Theory. It would be beneficial if participants have done a course on Machine Learning.

Summary
Course Status : Completed
Course Type : Elective
Duration : 12 weeks
Category :
  • Computer Science and Engineering
Credit Points : 3
Level : Undergraduate
Start Date : 29 Jul 2019
End Date : 18 Oct 2019
Exam Date : 16 Nov 2019 IST

Note: This exam date is subject to change based on seat availability. You can check the final exam date on your hall ticket.



Course layout

Week 1:  (Partial) History of Deep Learning, Deep Learning Success Stories, McCulloch Pitts Neuron, Thresholding Logic, Perceptrons, Perceptron Learning Algorithm
Week 2:  Multilayer Perceptrons (MLPs), Representation Power of MLPs, Sigmoid Neurons, Gradient Descent, Feedforward Neural Networks, Representation Power of Feedforward Neural Networks
Week 3:  Feedforward Neural Networks, Backpropagation
Week 4:  Gradient Descent (GD), Momentum-Based GD, Nesterov Accelerated GD, Stochastic GD, AdaGrad, RMSProp, Adam, Eigenvalues and Eigenvectors, Eigenvalue Decomposition, Basis
Week 5:  Principal Component Analysis and its interpretations, Singular Value Decomposition
Week 6:  Autoencoders and their relation to PCA, Regularization in autoencoders, Denoising autoencoders, Sparse autoencoders, Contractive autoencoders
Week 7:  Regularization: Bias-Variance Tradeoff, L2 regularization, Early stopping, Dataset augmentation, Parameter sharing and tying, Injecting noise at input, Ensemble methods, Dropout
Week 8:  Greedy Layerwise Pre-training, Better activation functions, Better weight initialization methods, Batch Normalization
Week 9:  Learning Vectorial Representations of Words
Week 10: Convolutional Neural Networks, LeNet, AlexNet, ZF-Net, VGGNet, GoogLeNet, ResNet, Visualizing Convolutional Neural Networks, Guided Backpropagation, Deep Dream, Deep Art, Fooling Convolutional Neural Networks
Week 11: Recurrent Neural Networks, Backpropagation Through Time (BPTT), Vanishing and Exploding Gradients, Truncated BPTT, GRU, LSTMs
Week 12: Encoder-Decoder Models, Attention Mechanism, Attention over images

Books and references

Deep Learning, Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press. http://www.deeplearningbook.org

Instructor bio


Prof. Sudarshan Iyengar has a Ph.D. from the Indian Institute of Science and is currently working as an assistant professor at IIT Ropar, where he has been teaching this course for the past 5 years. Apart from this course, he has offered several other courses at IIT Ropar, such as Discrete Mathematics, Theory of Computation, Cryptography, and Probability and Computing. His research interests include social networks, crowdsourced knowledge building and computational social sciences. His current research projects are "Predicting a Viral Meme" (Yayati Gupta), "Understanding Crowdsourced Knowledge Building" (Anamika Chhabra - Scientist), "Secure Computation" (Varsha Bhat) and "Network Sampling" (Akrati Saxena).




Prof. Mitesh M. Khapra is an Assistant Professor in the Department of Computer Science and Engineering at IIT Madras. At IIT Madras he pursues his interests in the areas of Deep Learning, Multimodal Multilingual Processing, Dialog Systems and Question Answering. Prior to that he worked as a Researcher at IBM Research India. During the four and a half years that he spent at IBM, he worked on several interesting problems in the areas of Statistical Machine Translation, Cross-Language Learning, Multimodal Learning, Argument Mining and Deep Learning. This work led to publications in top conferences in the areas of Computational Linguistics and Machine Learning. Prior to IBM, he completed his PhD and M.Tech from IIT Bombay in January 2012 and July 2008, respectively. His PhD thesis dealt with the important problem of reusing resources for multilingual computation. During his PhD he was a recipient of the IBM PhD Fellowship (2011) and the Microsoft Rising Star Award (2011). He is also a recipient of the Google Faculty Research Award (2017).

Course certificate

  • The course is free to enroll and learn from. But if you want a certificate, you have to register and write the proctored exam conducted by us in person at any of the designated exam centres.
  • The exam is optional for a fee of Rs 1000/- (Rupees one thousand only).
  • Date and Time of Exams: 16 November 2019, Morning session 9am to 12 noon; Afternoon Session 2pm to 5pm.
  • Registration URL: Announcements will be made when the registration form opens for registration.
  • The online registration form has to be filled and the certification exam fee needs to be paid. More details will be made available when the exam registration form is published. If there are any changes, it will be mentioned then.
  • Please check the form for more details on the cities where the exams will be held, the conditions you agree to when you fill the form etc.

CRITERIA TO GET A CERTIFICATE
  • Average assignment score = 25% of average of best 8 assignments out of the total 12 assignments given in the course. 
  • Exam score = 75% of the proctored certification exam score out of 100
  • Final score = Average assignment score + Exam score (a worked example is sketched below)

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF AVERAGE ASSIGNMENT SCORE >=10/25 AND EXAM SCORE >= 30/75
  • If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.
  • Certificate will have your name, photograph and the score in the final exam with the breakup. It will have the logos of NPTEL and IIT Madras. It will be e-verifiable at nptel.ac.in/noc.
  • Only the e-certificate will be made available. Hard copies have been discontinued from the July 2019 semester and will not be dispatched.
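As a rough illustration of the scoring rule above (a minimal sketch, not an official NPTEL script), the following Python snippet computes the final score and eligibility, assuming each assignment and the proctored exam are scored out of 100; the function name certificate_result and the example numbers are hypothetical.

    # Hypothetical sketch of the certification scoring rule described above.
    # Assumptions: each of the 12 assignments and the exam are scored out of 100.

    def certificate_result(assignment_scores, exam_score):
        """Return (final_score, eligible) under the stated criteria."""
        best8 = sorted(assignment_scores, reverse=True)[:8]       # best 8 of 12
        assignment_component = 0.25 * (sum(best8) / len(best8))   # out of 25
        exam_component = 0.75 * exam_score                        # out of 75
        final = assignment_component + exam_component             # out of 100
        eligible = assignment_component >= 10 and exam_component >= 30
        return final, eligible

    # Example: good assignments but a weak exam fails the 30/75 exam cut-off,
    # even though the final score exceeds 40/100.
    assignments = [90, 85, 80, 95, 70, 88, 92, 75, 60, 50, 40, 30]
    print(certificate_result(assignments, 35))   # (~47.3, False)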

