Welcome to the "General Deep Learning" course!
- Dates: October (to be confirmed)
- Time: Wednesdays, 14:00–16:00
- Location: Virtual (Zoom link provided upon registration)
- Prerequisites: Basic Python, statistics, and linear algebra knowledge recommended
This course covers core Deep Learning concepts and practical techniques to build robust models.
- Learning Objective and Loss Functions
- Optimization Problem Setup
- Gradients as Learning Signals
- Computational Graphs
- Automatic Differentiation (Backprop intuition)
- Basic Gradient Descent
- Integration: First Neural Network
- Scaling the Network
- Optimization in Practice
- Failure Modes in Deep Networks (vanishing/exploding gradients, saturating activations)
- Activation Functions
- Initialization
- Normalization
- Regularization and Generalization
- Diagnosing Training Problems
- Integration: A Working Deep MLP
- How to make a network see?
- Inductive biases, Convolution, and pooling
- Translation (shift) invariance
- Padding, strides, and dilation
- Interpretability: analyzing SVM weights, detecting biases, and SHAP/LIME explanations
- Attention
- Transformers
- Foundation models
- AGI
- Fairness
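As a taste of the first two modules (loss functions, computational graphs, backprop, gradient descent, and a first neural network), here is a minimal sketch of a tiny MLP trained from scratch with NumPy. The XOR data, layer sizes, learning rate, and step count are illustrative choices for this sketch, not course material:

```python
import numpy as np

# Toy data: the XOR problem (an illustrative choice, not from the syllabus)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
# Small random initialization (see the "Initialization" session)
W1 = rng.normal(0.0, 1.0, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # Forward pass: a computational graph evaluated node by node
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Learning objective: binary cross-entropy loss
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    # Backward pass: gradients as learning signals, via the chain rule
    dz2 = (p - y) / len(X)      # gradient of BCE w.r.t. the sigmoid's input
    dW2 = h.T @ dz2
    db2 = dz2.sum(axis=0)
    dh = dz2 @ W2.T
    dz1 = dh * (1 - h ** 2)     # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)
    # Basic gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")
```

The hand-written backward pass is exactly what automatic differentiation frameworks generate for you; writing it once by hand builds the intuition the course targets.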
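For the "How to make a network see?" module, a naive 2D convolution makes padding, strides, and output shapes concrete. This is an illustrative sketch (the image, filter, and helper name `conv2d` are assumptions, not course code):

```python
import numpy as np

def conv2d(x, k, stride=1, pad=0):
    """Naive 2D cross-correlation (what DL libraries call 'convolution')."""
    if pad:
        x = np.pad(x, pad)  # zero-pad all borders by `pad` pixels
    kh, kw = k.shape
    oh = (x.shape[0] - kh) // stride + 1  # output height
    ow = (x.shape[1] - kw) // stride + 1  # output width
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = x[i * stride:i * stride + kh, j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * k)  # same filter at every location
    return out

img = np.arange(16, dtype=float).reshape(4, 4)
edge = np.array([[1.0, -1.0]])  # horizontal difference filter

print(conv2d(img, edge).shape)                 # (4, 3): 'valid', stride 1
print(conv2d(img, edge, stride=2, pad=1).shape)  # (3, 3): padded, strided
```

Sliding one small filter over every location is the inductive bias of convolution: the same weights detect the same pattern wherever it appears in the image.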
See you in class! Happy learning!