This repository contains from-scratch Python implementations of deep learning algorithms, along with the mathematics required for deep learning and many end-to-end projects. It also includes practical tutorials on Generative AI.
✅ What is Deep Learning? Deep Learning Vs Machine Learning
✅ Types of Neural Networks
✅ What is a Perceptron? | Perceptron Geometric Intuition
✅ How to train a Perceptron?
✅ Perceptron Loss Function | Sigmoid Function
✅ Problem with Perceptron
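The perceptron entries above cover the update rule and its limitations. Below is a minimal from-scratch sketch of perceptron training — illustrative only, using a toy AND-gate dataset and a step activation; the repository's own notebooks may differ.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Train a perceptron with the classic update rule on labels in {0, 1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            y_hat = 1 if np.dot(w, xi) + b >= 0 else 0   # step activation
            error = yi - y_hat                            # 0 if correct, ±1 if wrong
            w += lr * error * xi                          # nudge the boundary toward the mistake
            b += lr * error
    return w, b

# Tiny linearly separable example (AND gate)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print(w, b)
```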
MLP Notation
Multi Layer Perceptron | MLP Intuition
Forward Propagation | How does a neural network predict output?
Customer Churn Prediction using ANN | Deep Learning Classification
Handwritten Digit Classification using ANN
Graduate Admission Prediction using ANN
Loss Functions in Deep Learning
Backpropagation in Deep Learning
MLP Memoization
Gradient Descent in Neural Networks
Vanishing Gradient Problem in ANN
How to Improve the Performance of a Neural Network
Early Stopping In Neural Networks
Data Scaling in Neural Networks
Dropout Layer in Deep Learning
Dropout Layers in ANN
Regularization in Deep Learning
Activation Functions in Deep Learning
ReLU Variants Explained
Weight Initialization Techniques
Xavier/Glorot and He Weight Initialization in Deep Learning
Batch Normalization in Deep Learning
Optimizers in Deep Learning
Exponentially Weighted Moving Average (EWMA)
SGD with Momentum
Nesterov Accelerated Gradient (NAG)
AdaGrad
RMSProp
Adam Optimizer
Keras Tuner | Hyperparameter Tuning a Neural Network
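Many of the training topics above (activation functions, dropout, batch normalization, He initialization, Adam, early stopping) come together in a small Keras model. A minimal sketch assuming TensorFlow/Keras and synthetic data — layer sizes and hyperparameters are placeholders, not the repository's exact settings.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic binary-classification data (placeholder for a real dataset)
X = np.random.rand(1000, 20)
y = (X.sum(axis=1) > 10).astype("float32")

# MLP with ReLU activations, He initialization, BatchNorm and Dropout
model = keras.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64, activation="relu", kernel_initializer="he_normal"),
    layers.BatchNormalization(),
    layers.Dropout(0.3),
    layers.Dense(32, activation="relu", kernel_initializer="he_normal"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Early stopping monitors validation loss and restores the best weights
early_stop = keras.callbacks.EarlyStopping(monitor="val_loss", patience=5,
                                           restore_best_weights=True)
model.fit(X, y, validation_split=0.2, epochs=100, callbacks=[early_stop], verbose=0)
```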
What is Convolutional Neural Network (CNN)
CNN Vs Visual Cortex | History of CNN
CNN
Padding & Strides in CNN
Pooling Layer in CNN
CNN Architecture
Comparing CNN Vs ANN
Backpropagation in CNN
Cat Vs Dog Image Classification Project
Data Augmentation in Deep Learning
Pretrained models in CNN
What does a CNN see?
What is Transfer Learning?
Keras Functional Model
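A minimal Keras sketch of the CNN building blocks listed above (convolution with padding, strides, pooling, then a classifier head). The input shape and layer widths are illustrative assumptions, not the architectures used in the projects.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Small CNN for 32x32 RGB images
model = keras.Sequential([
    layers.Input(shape=(32, 32, 3)),
    layers.Conv2D(32, kernel_size=3, padding="same", activation="relu"),  # "same" padding keeps spatial size
    layers.MaxPooling2D(pool_size=2),                                     # downsample by 2
    layers.Conv2D(64, kernel_size=3, strides=2, padding="same", activation="relu"),  # strided convolution
    layers.GlobalAveragePooling2D(),
    layers.Dense(10, activation="softmax"),
])
model.summary()
```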
Why RNNs are needed
Recurrent Neural Network
RNN Sentiment Analysis
Types of RNN
How Backpropagation works in RNN
Problems with RNN
LSTM
LSTM Architecture
Gated Recurrent Unit
Deep RNNs
Bidirectional RNN
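For the RNN topics above, a minimal Keras sentiment-style sketch: Embedding into a Bidirectional LSTM with a sigmoid output. It assumes text already integer-encoded and padded to a fixed length; the vocabulary size and layer widths are assumptions.

```python
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, max_len = 10_000, 200   # assumed preprocessing settings

model = keras.Sequential([
    layers.Input(shape=(max_len,)),
    layers.Embedding(vocab_size, 64),
    layers.Bidirectional(layers.LSTM(64)),   # swap LSTM for GRU or SimpleRNN to compare variants
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```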
The Epic History of Large Language Models (LLMs)
Encoder Decoder
Attention Mechanism
Bahdanau Attention Vs Luong Attention
Introduction to Transformers
What is Self Attention
Self Attention in Transformers
Scaled Dot Product Attention
Self Attention Geometric Intuition
Why is Self Attention called "Self"?
What is Multi-head Attention in Transformers
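The attention entries above reduce to one formula: Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. A small NumPy sketch of scaled dot-product self-attention, where Q, K and V all come from the same token embeddings; the shapes and random seed are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # similarity of each query with every key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    return weights @ V                                      # weighted sum of value vectors

# Self-attention: queries, keys and values are the same sequence
np.random.seed(0)
tokens = np.random.rand(4, 8)      # 4 tokens, embedding size 8
print(scaled_dot_product_attention(tokens, tokens, tokens).shape)  # (4, 8)
```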
Where does Generative AI fit in the AI landscape?
Difference Between Discriminative and Generative Models
What is an LLM?
LLM Architecture Types and Applications
Introduction to OpenAI and Complete Walkthrough
OpenAI API Setup with Python
Complete Discussion on Vector Databases - ChromaDB, Pinecone & Weaviate
Open Source LLM Models - Meta Llama 2
How to Use the Google PaLM 2 LLM
How to Use the Falcon Open Source LLM & Fine-Tuning LLMs
End-to-end Generative AI Project with LangChain, LLMs, VectorDB & Streamlit
How to Build a Generative AI Application using LlamaIndex - an Alternative Framework to LangChain
How to Build LLM Apps Super Fast with Chainlit
End-to-end Generative AI Project with LlamaIndex
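As a minimal example of the OpenAI setup covered above: a single chat-completion call. It assumes the `openai` Python package (v1+) is installed, an `OPENAI_API_KEY` environment variable is set, and the model name (`gpt-3.5-turbo` here) is a placeholder for whichever model you have access to.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain self-attention in one sentence."},
    ],
)
print(response.choices[0].message.content)
```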