Course Description: This course reviews modern methods in deep learning and neural network design from a practical perspective. It covers both a broad set of techniques used in state-of-the-art neural network architectures and the network styles prevalent in specific sub-domains such as computer vision, natural language processing, and social network analysis. Students learn how to use these techniques in modern frameworks and how to apply them to new problems.
Prerequisites: DATA 602 – Introduction to Data Analysis and Machine Learning or CMSC 478/678 – Introduction to Machine Learning
Course Academic Objectives
• To introduce students to the basic concepts and techniques of modern neural networks.
• To develop skills in using recent deep learning methods and tools for solving practical problems.
• To teach how to design neural networks for domain-specific applications.
• To inform students about open issues in deep learning.
Course Requirements: The course will use Python 3 with the following libraries: numpy, sklearn, pandas, matplotlib, Jupyter, and PyTorch. It is the student’s responsibility to have a working environment. If you would like the environment installed locally, Anaconda is a Python distribution that includes all required libraries. The recommended option is Google’s Colab, which is available through your UMBC account. Both options are a search away.
Course Outline
Week 1: Introduction to PyTorch & Foundations
Key Concepts: Automatic Differentiation, PyTorch mechanics
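To preview the core idea of Week 1, here is a minimal sketch of reverse-mode automatic differentiation, the mechanism behind PyTorch's autograd. The `Value` class and its method names are illustrative only, not PyTorch's actual API.

```python
# A tiny reverse-mode autodiff sketch (illustrative, not PyTorch's API).
class Value:
    def __init__(self, data, parents=()):
        self.data = data                 # forward value
        self.grad = 0.0                  # accumulated dL/d(this)
        self.parents = parents           # nodes this value was built from
        self._backward = lambda: None    # pushes grad to parent nodes

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():                 # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():                 # d(a*b)/da = b, d(a*b)/db = a
            self.grad += out.grad * other.data
            other.grad += out.grad * self.data
        out._backward = _backward
        return out

    def backward(self):
        # visit nodes in reverse topological order, as autograd does
        topo, seen = [], set()
        def build(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v.parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x      # dy/dx = 2x + 1 = 7 at x = 3
y.backward()
```

In PyTorch, the same bookkeeping happens automatically for tensors created with `requires_grad=True`.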
Week 2: Neural Network Review
Key Concepts: Loss Functions, Activation Functions, Regularization
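As a refresher for Week 2, the following sketch implements a standard activation (ReLU) and the softmax/cross-entropy loss in numpy; the function names are our own, chosen for clarity.

```python
import numpy as np

def relu(z):                       # max(0, z), applied elementwise
    return np.maximum(0.0, z)

def softmax(z):                    # numerically stable softmax
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, label):   # negative log-likelihood of the true class
    return -np.log(probs[label])

logits = np.array([2.0, 1.0, -1.0])
p = softmax(logits)                # probabilities summing to 1
loss = cross_entropy(p, 0)         # small when class 0 gets high probability
```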
Week 3: Convolutional Neural Networks (CNNs)
Key Concepts: Pooling, Weight Sharing, Stride, Dilation
Application: Image Classification
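Stride, padding, and dilation interact through a single output-size formula (the one PyTorch's convolution layers document). A small helper makes the effect of each concrete:

```python
def conv_out_len(n, k, stride=1, pad=0, dilation=1):
    """Output length along one spatial dimension of a convolution:
    floor((n + 2*pad - dilation*(k - 1) - 1) / stride) + 1."""
    return (n + 2 * pad - dilation * (k - 1) - 1) // stride + 1

same = conv_out_len(32, 3, pad=1)                 # 3x3 "same" conv: 32
halved = conv_out_len(32, 3, stride=2, pad=1)     # stride 2 halves: 16
dilated = conv_out_len(32, 3, pad=2, dilation=2)  # dilation widens the
                                                  # receptive field, size kept
```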
Week 4: Recurrent Neural Networks (RNNs)
Key Concepts: Sequence Prediction Problems, Gradient Clipping
Application: Sentiment Classification & Language Models
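Gradient clipping keeps RNN training stable by rescaling gradients whose global norm is too large. A numpy sketch of the idea behind `torch.nn.utils.clip_grad_norm_` (the helper name here is our own):

```python
import numpy as np

def clip_grad_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their combined L2 norm
    is at most max_norm; returns the clipped grads and the original norm."""
    total = np.sqrt(sum(float((g ** 2).sum()) for g in grads))
    scale = min(1.0, max_norm / (total + 1e-12))  # no-op if already small
    return [g * scale for g in grads], total

grads = [np.array([3.0, 4.0])]                    # global norm = 5
clipped, norm = clip_grad_norm(grads, max_norm=1.0)
```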
Week 5: Network Design: Common Design Building-Blocks
Key Concepts: Batch-Normalization, Skip-Connections, 1×1 Convolution
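Of these building blocks, a skip-connection is simply `y = f(x) + x`, and batch normalization standardizes each feature over the batch before a learnable scale and shift. A training-mode sketch in numpy (running statistics for inference are omitted):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature (column) over the batch dimension,
    then apply the learnable scale gamma and shift beta."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
out = batch_norm(x, gamma=1.0, beta=0.0)   # each column ~ N(0, 1)
```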
Week 6: Modern Training Techniques
Key Concepts: Momentum, Hyperparameter Optimization, dropout, learning-rate annealing
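Momentum and learning-rate annealing combine naturally in the SGD update loop. The sketch below minimizes the toy objective f(w) = w², halving the learning rate on a fixed schedule; all hyperparameters are illustrative.

```python
# SGD with momentum plus step-decay learning-rate annealing on f(w) = w^2.
w, v = 5.0, 0.0            # parameter and velocity
lr, momentum = 0.1, 0.9    # illustrative hyperparameters
for step in range(100):
    if step > 0 and step % 30 == 0:
        lr *= 0.5          # anneal: halve the learning rate every 30 steps
    grad = 2 * w           # gradient of w^2
    v = momentum * v - lr * grad   # velocity accumulates past gradients
    w = w + v
```

In PyTorch these two pieces correspond to the `momentum` argument of `torch.optim.SGD` and a learning-rate scheduler such as `StepLR`.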
Week 7: Network Design: Embedding & Auto-Encoding
Key Concepts: PCA, Self-Supervision
Application: Image Denoising & Information Retrieval
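PCA is the linear baseline that auto-encoders generalize: it projects data onto the directions of greatest variance. A compact sketch via the SVD (random data, fixed seed, for illustration):

```python
import numpy as np

def pca(x, k):
    """Project centered data onto its top-k principal components."""
    xc = x - x.mean(axis=0)                         # center the data
    _, _, vt = np.linalg.svd(xc, full_matrices=False)
    return xc @ vt[:k].T                            # top-k scores

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 5))
z = pca(x, 2)   # 5-D data compressed to its 2 highest-variance directions
```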
Week 8: Region Proposal Networks
Key Concepts: Mask R-CNN
Application: Object Detection
Week 9: Generative Adversarial Networks (GANs)
Key Concepts: Min-Max games, Generative Models, Ethics
Applications: Photo Manipulation, Super-Resolution
Week 10: Network Design: Attention Mechanisms
Key Concepts: Attention, encoding priors
Application: Machine Translation
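The workhorse of Week 10 is scaled dot-product attention, softmax(QKᵀ/√d)V: each query forms a weighted average of the values, with weights given by query–key similarity. A numpy sketch:

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    Returns the attended output and the attention weights."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)             # each row sums to 1
    return w @ v, w

q = np.eye(2)                                      # toy queries/keys/values
k = np.eye(2)
v = np.array([[1.0, 0.0], [0.0, 1.0]])
out, weights = attention(q, k, v)   # each query attends most to its own key
```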
Week 11: Network Design: Alternatives to RNNs
Key Concepts: Temporal Pooling, Transformers
Week 12: Transfer Learning
Key Concepts: Pre-training, Fine-tuning & Weight Freezing, Domain Adaptation
Application: Object Detection, NLP
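Weight freezing during fine-tuning means updating only the new task head while leaving pre-trained backbone weights untouched. The dict-based sketch below is purely illustrative; in PyTorch the same effect comes from setting `requires_grad=False` on the frozen parameters.

```python
import numpy as np

# Illustrative fine-tuning step: only unfrozen parameters are updated.
params = {"backbone": np.ones(4),       # pre-trained weights (frozen)
          "head": np.zeros(2)}          # new task head (trainable)
frozen = {"backbone"}                   # names we will not update

grads = {"backbone": np.full(4, 0.5),   # stand-in gradients
         "head": np.full(2, 0.5)}
lr = 0.1
for name, p in params.items():
    if name not in frozen:              # skip frozen parameters
        params[name] = p - lr * grads[name]
```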
Week 13: Network Design 2: Advanced Design Building Blocks & Training Techniques
Key Concepts: Scale-Adaptation, Better Pooling, mixup
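Of the Week 13 techniques, mixup is simple enough to state directly: train on convex combinations of example pairs and of their one-hot labels, with the mixing weight drawn from a Beta distribution. A numpy sketch (the fixed seed is only for reproducibility):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Blend two examples and their one-hot labels with weight
    lam ~ Beta(alpha, alpha), as in mixup training."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

x1, y1 = np.zeros(3), np.array([1.0, 0.0])   # toy example pair
x2, y2 = np.ones(3), np.array([0.0, 1.0])
xm, ym = mixup(x1, y1, x2, y2, rng=np.random.default_rng(0))
```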
Week 14: Review
Week 15: Project Demonstrations
Required Text
• Edward Raff, “Inside Deep Learning: Math, Algorithms, Models.” Publisher: Manning Inc., Dec., 2020. ISBN 9781617298639
References
• Eli Stevens, Luca Antiga, and Thomas Viehmann, “Deep Learning with PyTorch.” Publisher: Manning Inc., 2020.
• Vishnu Subramaniam, “Deep Learning with PyTorch: A practical approach to building neural network models using PyTorch.” Publisher: Packt Publishing, Feb., 2018. ISBN-10: 1788624335
• Ian Pointer, “Programming PyTorch for Deep Learning: Creating and Deploying Deep Learning Applications.” Publisher: O’Reilly, Oct. 2020. ISBN-10: 1492045357