πŸš€ Basics of Deep Learning

Welcome to the "Basics of Deep Learning" course! 🎉

πŸ“… Course Details

  • Dates: October (to be confirmed)
  • Time: Wednesdays, 14:00 – 16:00
  • Location: Virtual (Zoom link provided upon registration)
  • Prerequisites: Basic knowledge of Python, statistics, and linear algebra recommended

🧠 What You'll Learn

This course covers core Deep Learning concepts and practical techniques to build robust models.


πŸ“š Detailed Content

Basics of Neural Networks (NNs)

  • Learning Objective and Loss Functions
  • Optimization Problem Setup
  • Gradients as Learning Signals
  • Computational Graphs
  • Automatic Differentiation (Backprop intuition)
  • Basic Gradient Descent
  • Integration: First Neural Network
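The first block ends with training a first neural network. As an illustrative sketch (not part of the official course materials), the core loop of loss, gradients, and gradient descent can be shown on a single linear neuron, with the gradients written out by hand — exactly what automatic differentiation later computes for us:

```python
import numpy as np

# Toy regression data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * x + 1.0 + 0.05 * rng.normal(size=100)

w, b = 0.0, 0.0   # parameters of a single linear neuron
lr = 0.1          # learning rate

for step in range(200):
    y_hat = w * x + b             # forward pass
    err = y_hat - y
    loss = np.mean(err ** 2)      # mean squared error loss
    # Gradients of the loss, derived by hand (backprop does this automatically)
    grad_w = 2.0 * np.mean(err * x)
    grad_b = 2.0 * np.mean(err)
    w -= lr * grad_w              # gradient descent update
    b -= lr * grad_b

print(round(w, 2), round(b, 2))   # recovers roughly w ≈ 2, b ≈ 1
```

The same structure (forward pass, loss, gradients, parameter update) carries over unchanged to deep networks; only the gradient computation is delegated to autodiff.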

Training Deep NNs

  • Scaling the Network
  • Optimization in Practice
  • Failure Modes in Deep Networks (vanishing/exploding gradients, saturating activations)
  • Activation Functions
  • Initialization
  • Normalization
  • Regularization and Generalization
  • Diagnosing Training Problems
  • Integration: A Working Deep MLP

Convolutional NN and Computer Vision

  • How do we make a network see?
  • Inductive biases, Convolution, and pooling
  • Spatial invariance
  • Padding, strides, and dilation
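A minimal NumPy sketch (an assumption of how the topic might be illustrated, not course code) makes padding and strides concrete — deep learning "convolution" is really cross-correlation of a kernel slid over the image:

```python
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """Minimal 2D cross-correlation (what DL frameworks call 'convolution')."""
    if padding:
        image = np.pad(image, padding)          # zero-pad the borders
    kh, kw = kernel.shape
    out_h = (image.shape[0] - kh) // stride + 1  # output size formula
    out_w = (image.shape[1] - kw) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i * stride:i * stride + kh,
                          j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * kernel)   # same kernel at every location
    return out

img = np.arange(16.0).reshape(4, 4)
edge = np.array([[-1.0, 1.0]])                   # horizontal difference kernel
print(conv2d(img, edge).shape)                   # (4, 3) without padding
print(conv2d(img, edge, padding=1).shape)        # (6, 5) with padding=1
```

Reusing one small kernel everywhere is the inductive bias that gives CNNs their spatial invariance.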

Representation Learning

  • Analyzing SVM weights, detecting biases, and SHAP/LIME explanations
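For weight-based model inspection, a self-contained sketch (an illustrative assumption, not course code) is a tiny linear SVM trained by subgradient descent on the hinge loss, whose learned weights reveal which input feature actually drives the decision:

```python
import numpy as np

# Toy data: only feature 0 is informative; feature 1 is pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] > 0, 1.0, -1.0)

# Linear SVM via subgradient descent on the regularized hinge loss
w = np.zeros(2)
lr, lam = 0.1, 0.01
for _ in range(300):
    margins = y * (X @ w)
    mask = margins < 1               # samples violating the margin
    if mask.any():
        grad = lam * w - (y[mask, None] * X[mask]).mean(axis=0)
    else:
        grad = lam * w
    w -= lr * grad

# Weight magnitudes act as a crude feature-importance / bias probe
print(np.abs(w))   # weight on feature 0 dominates
```

SHAP and LIME generalize this idea to models whose decisions cannot be read off a single weight vector.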

Advanced Deep Learning Topics

  • Attention
  • Transformers
  • Foundation models
  • AGI
  • Fairness

πŸ”— Resources

πŸ”— Full Program: [Google Doc] πŸ“‚ Slides & Materials: [Sciebo]

πŸ‘¨β€πŸ’»πŸ‘©β€πŸ’» See you in class! Happy learning! πŸ‘¨β€πŸ’»πŸ‘©β€πŸ’»

About

Repository with resources and examples for the Basics of Deep Learning course at HHU.
