Swayam Central

Deep Learning - Part 1

By Prof. Sudarshan Iyengar, Prof. Padmavati   |   IIT Ropar
Deep Learning has received a lot of attention over the past few years and has been employed successfully by companies such as Google, Microsoft, IBM, Facebook and Twitter to solve a wide range of problems in Computer Vision and Natural Language Processing. In this course we will learn about the building blocks used in these Deep Learning based solutions. Specifically, we will learn about feedforward neural networks, convolutional neural networks, recurrent neural networks and attention mechanisms. We will also look at various optimization algorithms such as Gradient Descent, Nesterov Accelerated Gradient Descent, Adam, AdaGrad and RMSProp, which are used for training such deep neural networks. By the end of this course, students will have knowledge of the deep architectures used for solving various Vision and NLP tasks.
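As a small taste of the optimization algorithms named above, here is a minimal, hypothetical sketch of plain gradient descent on a one-dimensional function (an illustration only, not taken from the course materials):

```python
# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# Gradient descent repeatedly steps opposite the gradient.
def gradient(w):
    return 2.0 * (w - 3.0)

w = 0.0    # initial weight
lr = 0.1   # learning rate (step size)
for _ in range(100):
    w -= lr * gradient(w)

print(round(w, 4))  # prints 3.0 -- converges to the minimum at w = 3
```

The same update rule, applied to the weights of a network with gradients computed by backpropagation, is the core of the training methods covered in Weeks 2-4.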

INTENDED AUDIENCE: Any Interested Learners
PREREQUISITES: Working knowledge of Linear Algebra and Probability Theory. It would be beneficial if the participants have done a course on Machine Learning.

Learners enrolled: 6556


Course Status : Ongoing
Course Type : Elective
Duration : 12 weeks
Start Date : 27 Jan 2020
End Date : 17 Apr 2020
Exam Date : 26 Apr 2020
Enrollment Ends : 03 Feb 2020
Category : Computer Science and Engineering
Level : Undergraduate/Postgraduate
This is an AICTE approved FDP course


    Week 1 :  (Partial) History of Deep Learning, Deep Learning Success Stories, McCulloch Pitts Neuron, Thresholding Logic, Perceptrons, Perceptron Learning Algorithm
    Week 2 :  Multilayer Perceptrons (MLPs), Representation Power of MLPs, Sigmoid Neurons, Gradient Descent, Feedforward Neural Networks, Representation Power of Feedforward Neural Networks
    Week 3 :  Feedforward Neural Networks, Backpropagation
    Week 4 :  Gradient Descent (GD), Momentum Based GD, Nesterov Accelerated GD, Stochastic GD, AdaGrad, RMSProp, Adam, Eigenvalues and Eigenvectors, Eigenvalue Decomposition, Basis
    Week 5 :  Principal Component Analysis and its Interpretations, Singular Value Decomposition
    Week 6 :  Autoencoders and Relation to PCA, Regularization in Autoencoders, Denoising Autoencoders, Sparse Autoencoders, Contractive Autoencoders
    Week 7 :  Regularization: Bias-Variance Tradeoff, L2 Regularization, Early Stopping, Dataset Augmentation, Parameter Sharing and Tying, Injecting Noise at Input, Ensemble Methods, Dropout
    Week 8 :  Greedy Layerwise Pre-training, Better Activation Functions, Better Weight Initialization Methods, Batch Normalization
    Week 9 :  Learning Vectorial Representations of Words
    Week 10 : Convolutional Neural Networks, LeNet, AlexNet, ZF-Net, VGGNet, GoogLeNet, ResNet, Visualizing Convolutional Neural Networks, Guided Backpropagation, Deep Dream, Deep Art, Fooling Convolutional Neural Networks
    Week 11 : Recurrent Neural Networks, Backpropagation Through Time (BPTT), Vanishing and Exploding Gradients, Truncated BPTT, GRU, LSTMs
    Week 12 : Encoder-Decoder Models, Attention Mechanism, Attention over Images


    Deep Learning, Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press. http://www.deeplearningbook.org


    Prof. Sudarshan Iyengar

    IIT Ropar
    Sudarshan Iyengar holds a PhD from the Indian Institute of Science and is currently working as an Assistant Professor at IIT Ropar. He has been teaching this course for the past 4 years.

    Prof. Padmavati

    Dr. Padmavati received her B.E. degree in Computer Science & Engineering with distinction from PDACE, Gulbarga (V.T.U, Belgaum, Karnataka), and her M.E. degree in Computer Science & Engineering and Ph.D. from Punjab Engineering College (Deemed to be University), Chandigarh. Currently, she is working as an Assistant Professor at Punjab Engineering College (Deemed to be University), Chandigarh, India. She has been teaching this course to M.Tech students for two years. She has also offered courses such as Data Structures, Analysis and Design of Algorithms, Object-Oriented Programming, Research Methodology, and Wireless Sensor Networks. Her research interests are in the areas of wireless sensor networks, IoT, machine learning and deep learning. Her current research projects include “Classification of Parkinson’s disease using machine learning algorithms”, “Major depressive disorder using EEG signal”, and “Detection of neurological disorder - Epilepsy using EEG signals”.


    • The course is free to enroll and learn from. But if you want a certificate, you have to register and write the proctored exam conducted by us in person at any of the designated exam centres.
    • The exam is optional for a fee of Rs 1000/- (Rupees one thousand only).
    • Date and Time of Exams: 26 April 2020. Morning session: 9 am to 12 noon; afternoon session: 2 pm to 5 pm.
    • Registration URL: Announcements will be made when the registration form opens.
    • The online registration form has to be filled in and the certification exam fee paid. More details will be made available when the exam registration form is published. Any changes will be announced then.
    • Please check the form for details on the cities where the exams will be held, the conditions you agree to when you fill in the form, etc.

    • Average assignment score = 25% of the average of the best 8 of the 12 assignments given in the course.
    • Exam score = 75% of the proctored certification exam score out of 100.
    • Final score = Average assignment score + Exam score
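As a worked illustration of the scoring formula above (the best-8-of-12 rule and the weightings are taken from this page; the learner's scores below are hypothetical):

```python
def final_score(assignment_scores, exam_score):
    """Compute the final score from the rules above.

    assignment_scores: the 12 assignment scores, each out of 100.
    exam_score: proctored exam score out of 100.
    """
    # Average assignment score = 25% of the average of the best 8 of 12 assignments.
    best_eight = sorted(assignment_scores, reverse=True)[:8]
    assignment_component = 0.25 * (sum(best_eight) / 8)
    # Exam score = 75% of the proctored exam score out of 100.
    exam_component = 0.75 * exam_score
    return assignment_component + exam_component

# Hypothetical learner: best eight assignments average 80, exam score 60.
scores = [80] * 8 + [0] * 4
print(final_score(scores, 60))  # 0.25 * 80 + 0.75 * 60 = 65.0
```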

    • If either of the two criteria is not met, you will not get the certificate even if the Final score is >= 40/100.
    • The certificate will have your name, photograph and the score in the final exam with the breakup. It will have the logos of NPTEL and IIT Madras. It will be e-verifiable at nptel.ac.in/noc.
    • Only the e-certificate will be made available. Hard copies will not be dispatched.