Machine Learning from Scratch

This repository contains examples of popular machine learning algorithms implemented in Python, with the mathematics behind them explained. Each algorithm has an interactive Jupyter Notebook demo that lets you play with the training data and algorithm configuration and immediately see the results, charts, and predictions right in your browser.

The purpose of this repository is not to implement machine learning algorithms using 3rd-party library one-liners, but rather to practice implementing these algorithms from scratch and gain a better understanding of the mathematics behind each one.

Table of Contents

Machine Learning

Supervised Learning

In supervised learning we have a set of training data as input and a set of labels (correct answers) for each training example as output. We then train our model (the machine learning algorithm's parameters) to map the input to the output correctly, i.e., to make correct predictions. The ultimate goal is to find model parameters that keep the input→output mapping (prediction) correct even for new, unseen input examples.

Regression

In regression problems we predict real values. Essentially, we try to fit a line/plane/hyperplane through the training examples. In regression we deal with continuous as well as discrete data.
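
As a minimal illustration of the idea (a sketch on synthetic data, not code from any notebook in this repo), here is a straight line fitted by gradient descent on the mean squared error:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
y = 2.0 * X + 1.0 + rng.normal(0, 1, size=100)

# Fit y = w*x + b by minimizing mean squared error with gradient descent
w, b, lr = 0.0, 0.0, 0.01
for _ in range(2000):
    error = (w * X + b) - y
    w -= lr * 2 * np.mean(error * X)  # dMSE/dw
    b -= lr * 2 * np.mean(error)      # dMSE/db

print(f"w = {w:.2f}, b = {b:.2f}")  # should approach 2 and 1
```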

🤖 Lasso/Ridge Regression

🤖 Support Vector Machines

SVMs construct a hyperplane in a high-dimensional space that can be used for classification, regression, or outlier detection.
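
To illustrate the idea, here is a minimal linear SVM trained by subgradient descent on the hinge loss with L2 regularization (a sketch on toy Gaussian clusters with labels in {-1, +1}, not the notebook implementation):

```python
import numpy as np

# Linearly separable toy data with labels in {-1, +1}
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(+2, 1, (50, 2))])
y = np.array([-1] * 50 + [+1] * 50)

# Minimize  lam * ||w||^2 + mean(max(0, 1 - y*(Xw + b)))  by subgradient descent
w, b = np.zeros(2), 0.0
lr, lam = 0.01, 0.01
for _ in range(1000):
    margins = y * (X @ w + b)
    mask = margins < 1  # only margin violators contribute to the hinge subgradient
    w -= lr * (2 * lam * w - (y[mask, None] * X[mask]).sum(axis=0) / len(X))
    b -= lr * (-y[mask].sum() / len(X))

print(f"training accuracy: {np.mean(np.sign(X @ w + b) == y):.2f}")
```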

🤖 K-Nearest Neighbors

  • KNN (K-Nearest Neighbors) is a supervised learning algorithm used in data mining and machine learning. It is a classifier where learning is based on how similar a data point (a vector) is to the other points.
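
A bare-bones version of the idea, assuming Euclidean distance and a majority vote among neighbors (a sketch, not the notebook code):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Predict the label of x by majority vote among its k nearest neighbors."""
    distances = np.linalg.norm(X_train - x, axis=1)  # Euclidean distance to every training point
    nearest = np.argsort(distances)[:k]              # indices of the k closest points
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy example: two clusters labeled 0 and 1
X_train = np.array([[1, 1], [1, 2], [2, 1], [8, 8], [8, 9], [9, 8]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([2, 2])))  # -> 0
print(knn_predict(X_train, y_train, np.array([8, 7])))  # -> 1
```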

Classification

In classification problems we assign input examples to discrete classes based on certain characteristics (a minimal logistic-regression sketch follows the list below).

Usage examples: benign/malignant tumor data, wine quality, MNIST handwritten digits.

🤖 Logistic Regression

🤖 Naive Bayes

🤖 Decision Tree
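
As promised above, here is a minimal logistic-regression classifier trained with gradient descent on the cross-entropy loss (a sketch on made-up toy clusters, not the notebook code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary-classification data, labels in {0, 1}
rng = np.random.default_rng(5)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(1000):
    p = sigmoid(X @ w + b)             # predicted probability of class 1
    w -= lr * X.T @ (p - y) / len(X)   # gradient of mean cross-entropy w.r.t. w
    b -= lr * np.mean(p - y)           # gradient w.r.t. b

print(f"training accuracy: {np.mean((sigmoid(X @ w + b) > 0.5) == y):.2f}")
```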

Ensemble Method

Ensemble learning is a machine learning paradigm where multiple models (often called “weak learners”) are trained to solve the same problem and combined to get better results. The main hypothesis is that when weak models are correctly combined we can obtain more accurate and/or robust models.
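
For example, the simplest way to combine weak classifiers is a majority vote. A small sketch with hypothetical model predictions (not from any notebook here):

```python
import numpy as np

# Three hypothetical "weak learners" predicting class labels for 5 samples
preds = np.array([
    [0, 1, 1, 0, 1],  # model A
    [0, 1, 0, 0, 1],  # model B
    [1, 1, 1, 0, 0],  # model C
])

# Majority vote per sample: the most frequent label across models
combined = np.array([np.bincount(col).argmax() for col in preds.T])
print(combined)  # [0 1 1 0 1]
```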

Unsupervised Learning

Unsupervised learning is a branch of machine learning that learns from data that has not been labeled, classified, or categorized. Instead of responding to feedback, unsupervised learning identifies commonalities in the data and reacts based on the presence or absence of such commonalities in each new piece of data.

Dimensionality Reduction

In dimensionality reduction we reduce the given 'n' features to 'k' features (or derived components) by using various techniques; a minimal PCA sketch follows the list below.

🤖 Principal Component Analysis

🤖 Non Negative Matrix Factorization

🤖 Singular Value Decomposition
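
The PCA sketch promised above, computed via SVD of the centered data (a minimal illustration on random data, not the notebook implementation):

```python
import numpy as np

def pca(X, k):
    """Project X (n_samples, n_features) onto its top-k principal components."""
    X_centered = X - X.mean(axis=0)  # PCA requires centered data
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:k]              # top-k directions of maximal variance
    return X_centered @ components.T # (n_samples, k) projection

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
print(pca(X, k=2).shape)  # (100, 2)
```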

Clustering

Clustering is the task of dividing the data points into groups such that points in the same group are more similar to each other than to points in other groups (a minimal K-Means sketch follows the list below).

🤖 K-Means

🤖 DBSCAN
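
The K-Means sketch promised above: Lloyd's algorithm alternating assignment and centroid-update steps (a minimal version on toy data that does not handle empty clusters):

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Lloyd's algorithm: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # k random points as initial centroids
    for _ in range(n_iters):
        # Assign each point to its nearest centroid
        labels = np.linalg.norm(X[:, None] - centroids[None], axis=2).argmin(axis=1)
        # Move each centroid to the mean of its assigned points (empty clusters not handled)
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.5, (30, 2)), rng.normal(5, 0.5, (30, 2))])
labels, centroids = kmeans(X, k=2)
print(centroids)  # approximately [0, 0] and [5, 5], in some order
```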

Deep Learning

🤖 Perceptron

The perceptron is similar to the SVM: it also constructs a separating hyperplane in a high-dimensional space, provided the data is linearly separable.
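
A minimal sketch of Rosenblatt's learning rule (labels assumed to be in {-1, +1}; toy data, not the notebook code):

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=100):
    """Perceptron update rule; converges only if the data is linearly separable."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):        # y must be in {-1, +1}
            if yi * (xi @ w + b) <= 0:  # misclassified point
                w += lr * yi * xi       # nudge the hyperplane toward it
                b += lr * yi
    return w, b

X = np.array([[2, 2], [1, 3], [-1, -2], [-2, -1]], dtype=float)
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
print(np.sign(X @ w + b))  # [ 1.  1. -1. -1.]
```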

🤖 Artificial Neural Network

🤖 Convolutional Neural Network

🤖 Recurrent Neural Network

A recurrent neural network (RNN) is a class of artificial neural networks where connections between nodes form a directed graph along a temporal sequence. This allows it to exhibit temporal dynamic behavior.

  • 📗 Theory | RNN - theory and explanation

  • ▶️ Demo | RNN - Vanilla RNN for Single-Batch from scratch

  • ▶️ Demo | RNN - Vanilla RNN for Multi-Batch from scratch

  • 📗 Theory | RNN - Derivation of Backpropagation Through Time (BPTT).
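
To make the temporal recurrence concrete, here is a minimal forward pass of a vanilla RNN with randomly initialized weights and toy dimensions (a sketch, not the demo code):

```python
import numpy as np

# Vanilla RNN forward pass: h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + bh)
rng = np.random.default_rng(4)
input_size, hidden_size, seq_len = 3, 5, 7

Wxh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input-to-hidden weights
Whh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden-to-hidden (recurrent) weights
bh = np.zeros(hidden_size)

xs = rng.normal(size=(seq_len, input_size))  # a toy input sequence
h = np.zeros(hidden_size)                    # initial hidden state
for x_t in xs:
    h = np.tanh(Wxh @ x_t + Whh @ h + bh)    # the same weights are reused at every time step
print(h.shape)  # (5,)
```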

Optimization Algorithms

🤖 Gradient Descent

🤖 Gradient Descent Check

🤖 Gradient Descent with Mini-Batch

🤖 Gradient Descent with Adam Optimization

🤖 Gradient Descent with Momentum Optimization

🤖 Gradient Descent with RMSProp Optimization

🤖 Newton-Raphson Method
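
For reference, the update rules covered by the notebooks listed above can be sketched on a toy quadratic objective (a minimal illustration with conventional hyperparameter values, not the notebook code):

```python
import numpy as np

def grad(w):
    """Gradient of the toy objective f(w) = ||w||^2 / 2."""
    return w

w = np.array([5.0, -3.0])
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8
m = np.zeros_like(w)  # first-moment estimate (momentum / Adam)
v = np.zeros_like(w)  # second-moment estimate (RMSProp / Adam)

for t in range(1, 201):
    g = grad(w)
    # vanilla gradient descent:  w -= lr * g
    # momentum:                  m = beta1*m + g;               w -= lr * m
    # RMSProp:                   v = beta2*v + (1-beta2)*g**2;  w -= lr * g / (sqrt(v) + eps)
    # Adam (used here) combines both, with bias correction:
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)

print(w)  # approaches the minimizer [0, 0]
```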

Paper Implementations

Deep Learning with TensorFlow

Regression

Classification

Complex Modelling using Functional API

Tensorboard

Hyperparameter Fine Tuning

Tensor and Operations

Custom Model Building

Loading and Preprocessing Large Data

CNN with TensorFlow

Sequential Modelling

Character Level Modelling

Stateless RNN

Stateful RNN

Word Level Modelling

Sentiment Analysis

Encoder-Decoder

BiDirectional Layer

Beam Search

Attention

Transformers Multi-Head Attention

NLP with HuggingFace and Transformers

UnderComplete Linear AutoEncoder

Stacked AutoEncoder

Convolutional AutoEncoder

Recurrent AutoEncoder

Denoising AutoEncoder

Sparse AutoEncoder

Variational AutoEncoder

Generative Adversarial Networks

Deep Convolutional GAN

Hashing using Binary AutoEncoder

Denoising AutoEncoder 3 Channel Image

Model Deployment

Topic Modelling

Prerequisites

Installing Python

Make sure that you have Python installed on your machine.

You might want to use the venv standard Python module to create virtual environments, so that Python, pip, and all dependent packages are installed in and served from the local project directory, avoiding conflicts with system-wide packages and their versions.
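
For example, a typical setup might look like this (exact commands vary by platform):

```bash
python -m venv venv
source venv/bin/activate   # on Windows: venv\Scripts\activate
```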

Installing Dependencies

Install all dependencies that are required for the project by running:
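
Assuming the repository follows the usual convention of a requirements.txt at its root:

```bash
pip install -r requirements.txt
```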

Datasets

The list of datasets used in the Jupyter Notebook demos can be found in the DataSet folder.

Clone

```bash
git clone https://github.com/Girrajjangid/Machine-Learning-from-Scratch.git
```