Welcome to my Deep Learning repository!
This is a collection of notebooks, experiments, and mini-projects I created while learning and practicing Deep Learning concepts — from the fundamentals to more advanced topics like CNNs, RNNs, autoencoders, GANs, Transformers, and transfer learning.
This repository documents my learning path in Deep Learning.
It includes experiments with different architectures, frameworks, and datasets — all focused on understanding how neural networks learn, generalize, and perform on real-world data.
```
Deep-learning/
│
├── 📂 models/      # Custom-built model architectures
├── 📂 notebooks/   # Jupyter notebooks covering deep learning topics
├── 📂 utils/       # Helper functions (data loaders, visualization, metrics)
└── README.md
```
- Python 3
- TensorFlow / Keras
- PyTorch
- NumPy, Pandas, Matplotlib, Seaborn
- Jupyter Notebook
Through this repository, my main objectives were to:
- Master PyTorch by building and training models from scratch
- Understand the core theory and intuition behind neural networks
- Gain hands-on experience with different architectures (CNNs, RNNs, Transformers)
- Learn how to debug, tune, and visualize training progress
- Apply transfer learning and model optimization in practice
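The "building and training models from scratch" objective above boils down to the canonical PyTorch training loop. Here is a minimal sketch on hypothetical random data (the tiny MLP, sizes, and hyperparameters are illustrative, not from any notebook in this repo):

```python
import torch
from torch import nn

# Illustrative toy setup: a tiny MLP trained on random data,
# just to show the structure of a PyTorch training loop.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

X = torch.randn(32, 4)           # 32 samples, 4 features
y = torch.randint(0, 3, (32,))   # 3 classes

for epoch in range(5):
    optimizer.zero_grad()        # clear gradients from the previous step
    logits = model(X)            # forward pass
    loss = loss_fn(logits, y)    # compute loss
    loss.backward()              # backpropagate
    optimizer.step()             # update weights
```

Every experiment in the notebooks is some variation on these five steps: zero the gradients, run the forward pass, compute a loss, backpropagate, and step the optimizer.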
- 🖼️ Image Classification using CNNs (MNIST, CIFAR-10)
- 🕒 Sequence Prediction using RNN/LSTM
- 🔍 Transfer Learning with pre-trained models (VGG16, ResNet)
- 📊 Visualization of Training Metrics
During my learning, I followed materials from:
- Deep Learning Specialization by Andrew Ng (Coursera)
- A Deep Understanding of Deep Learning (Udemy)
- DataCamp courses
- Deep Learning: Foundations and Concepts by Christopher M. Bishop
This repository reflects my personal learning process, but if you spot mistakes, have suggestions, or know better approaches, feel free to open an issue or pull request!
If you find this repository helpful or inspiring,
please ⭐️ star this repo — it helps others discover it too!
Made with ❤️ and curiosity by Mohamed Diaa Zellagui
"The best way to learn is by building and experimenting."