Links to frameworks, libraries, datasets, white papers, articles, books, algorithms, tutorials, diagrams, models, code examples, videos, glossaries, and other resources related to projects in Deep Learning, Machine Learning, and AI.
source (https://www.prowesscorp.com/wp-content/uploads/2017/06/inBlog.png)
Simple step-by-step walkthroughs to solve common machine learning problems using best practices
With new neural network architectures popping up every now and then, it’s hard to keep track of them all. Knowing all the abbreviations being thrown around (DCIGN, BiLSTM, DCGAN, anyone?) can be a bit overwhelming at first. Composing a complete list is practically impossible, as new architectures are invented all the time.
source (https://github.com/trekhleb/homemade-machine-learning)
source (http://www.asimovinstitute.org/neural-network-zoo/)
- Kohonen networks (KN, also self-organising (feature) map, SOM, SOFM) - Kohonen, Teuvo. “Self-organized formation of topologically correct feature maps.” Biological cybernetics 43.1 (1982).
- Support vector machines (SVM) - Cortes, Corinna, and Vladimir Vapnik. “Support-vector networks.” Machine learning 20.3 (1995): 273-297.
- Liquid state machines (LSM) - Maass, Wolfgang, Thomas Natschläger, and Henry Markram. “Real-time computing without stable states: A new framework for neural computation based on perturbations.” Neural computation 14.11 (2002): 2531-2560.
- Extreme learning machines (ELM) - Cambria, Erik, et al. “Extreme learning machines [trends & controversies].” IEEE Intelligent Systems 28.6 (2013): 30-59.
- Echo state networks (ESN) - Jaeger, Herbert, and Harald Haas. “Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication.” science 304.5667 (2004): 78-80.
- Deep residual networks (DRN) - He, Kaiming, et al. “Deep residual learning for image recognition.” arXiv preprint arXiv:1512.03385 (2015).
- Bidirectional recurrent neural networks, bidirectional long & short term memory networks and bidirectional gated recurrent units (BiRNN, BiLSTM and BiGRU respectively) - Schuster, Mike, and Kuldip K. Paliwal. “Bidirectional recurrent neural networks.” IEEE Transactions on Signal Processing 45.11 (1997): 2673-2681.
- Neural Turing machines (NTM) - Graves, Alex, Greg Wayne, and Ivo Danihelka. “Neural turing machines.” arXiv preprint arXiv:1410.5401 (2014).
- Gated recurrent units (GRU) - Chung, Junyoung, et al. “Empirical evaluation of gated recurrent neural networks on sequence modeling.” arXiv preprint arXiv:1412.3555 (2014).
- Long/short term memory (LSTM) networks - Hochreiter, Sepp, and Jürgen Schmidhuber. “Long short-term memory.” Neural computation 9.8 (1997): 1735-1780.
- Recurrent neural networks (RNN) - Elman, Jeffrey L. “Finding structure in time.” Cognitive science 14.2 (1990): 179-211.
- Generative adversarial networks (GAN) - Goodfellow, Ian, et al. “Generative adversarial nets.” Advances in Neural Information Processing Systems. 2014.
- Deep convolutional inverse graphics networks (DCIGN) - Kulkarni, Tejas D., et al. “Deep convolutional inverse graphics network.” Advances in Neural Information Processing Systems. 2015.
- Deconvolutional networks (DN) - Zeiler, Matthew D., et al. “Deconvolutional networks.” Computer Vision and Pattern Recognition (CVPR), 2010 IEEE Conference on. IEEE, 2010.
- Convolutional neural networks (CNN or deep convolutional neural networks, DCNN) - LeCun, Yann, et al. “Gradient-based learning applied to document recognition.” Proceedings of the IEEE 86.11 (1998): 2278-2324.
- Deep belief networks (DBN) - Bengio, Yoshua, et al. “Greedy layer-wise training of deep networks.” Advances in neural information processing systems 19 (2007): 153.
- Denoising autoencoders (DAE) - Vincent, Pascal, et al. “Extracting and composing robust features with denoising autoencoders.” Proceedings of the 25th international conference on Machine learning. ACM, 2008.
- Variational autoencoders (VAE) - Kingma, Diederik P., and Max Welling. “Auto-encoding variational bayes.” arXiv preprint arXiv:1312.6114 (2013).
- Sparse autoencoders (SAE) - Marc’Aurelio Ranzato, Christopher Poultney, Sumit Chopra, and Yann LeCun. “Efficient learning of sparse representations with an energy-based model.” Proceedings of NIPS. 2007.
- Autoencoders (AE) - Bourlard, Hervé, and Yves Kamp. “Auto-association by multilayer perceptrons and singular value decomposition.” Biological cybernetics 59.4-5 (1988): 291-294.
- Restricted Boltzmann machines (RBM) - Smolensky, Paul. Information processing in dynamical systems: Foundations of harmony theory. No. CU-CS-321-86. COLORADO UNIV AT BOULDER DEPT OF COMPUTER SCIENCE, 1986.
- Boltzmann machines (BM) - Hinton, Geoffrey E., and Terrence J. Sejnowski. “Learning and relearning in Boltzmann machines.” Parallel distributed processing: Explorations in the microstructure of cognition 1 (1986): 282-317.
- Markov chains (MC or discrete time Markov Chain, DTMC) - Hayes, Brian. “First links in the Markov chain.” American Scientist 101.2 (2013): 252.
- Hopfield network (HN) - Hopfield, John J. “Neural networks and physical systems with emergent collective computational abilities.” Proceedings of the national academy of sciences 79.8 (1982): 2554-2558.
- Radial basis function (RBF) - Broomhead, David S., and David Lowe. Radial basis functions, multi-variable functional interpolation and adaptive networks. No. RSRE-MEMO-4148. ROYAL SIGNALS AND RADAR ESTABLISHMENT MALVERN (UNITED KINGDOM), 1988.
- Feed forward neural networks (FF or FFNN) and perceptrons (P) - Rosenblatt, Frank. “The perceptron: a probabilistic model for information storage and organization in the brain.” Psychological review 65.6 (1958): 386.
Repository of Technical Papers Published about AI, Machine Learning and Deep Neural Networks - Cornell University
- Adaptive Resonance Theory
- Artificial Intelligence
- Artificial Neural Network
- Associate Memory Network
- Autoencoders
- Backpropagation Neural Networks
- Bag of Words (Classification Method)
- Bi-directional Neural Network
- Bigram
- Binary Associative Memory
- Biological Neural Network
- Cascade correlation
- Clustering
- Competitive Learning
- Compositional pattern-producing network
- Convolutional Neural Networks
- Decoders
- Deep Feedforward and Recurrent Neural Networks
- Deep Neural Networks
- Delta Rule
- Elman Nets
- Ensemble Learning
- Facial Emotion Recognition Systems
- Feedback Network
- Feedforward - Autoencoders
- Feedforward - Probabilistic
- Feedforward - Time Delay
- Feedforward Neural Networks
- Fully recurrent neural network
- Generative Adversarial Networks
- Generative Perceptrons
- Genetic Algorithm
- Gradient Descent Technique
- Gradient Based Methods
- Hamming Network
- Hebbian learning
- Hierarchical Neural Network
- Jordan Networks
- K-means Clustering Algorithm
- Kohonen Self-Organizing Maps
- Learning Vector Quantization
- Long Short Term Memory
- Machine Learning
- Modular Neural Network
- Multi-layer feed-forward
- Multilayer Perceptrons
- Neighboring topology
- Neocognitron
- Neuro-fuzzy
- Optical neural network
- Papers about Classification
- Radial basis function Neural Network
- Radial basis function
- Recurrent Neural Networks
- Recurrent Convolutional Neural Networks
- Regulatory feedback
- Reinforcement Learning
- Simulated Annealing
- Single-layer feed-forward
- Spiking Neural Network
- Stochastic Neural Network
- Supervised Learning
- Support Vector Machines
- Temporal Convolutional Networks
- Time delay neural network
- Training Algorithms
- Tree Kernels
- Unfolding recurrent neural networks
- Unsupervised learning
Papers About Areas of Application of Artificial Intelligence and Machine Learning - Cornell University
- Automatic Coloration
- Anomaly Detection
- Automatic Feature Extraction
- Automatic Machine Translation
- Automatic Text Generation
- Bayesian Learning
- Character Recognition
- Clustering Problems
- Computer Vision
- Contextual Learning
- Coreference resolution
- Denoising
- Density Estimation and Clustering
- Dimensionality reduction
- Encrypting Data
- Fraud Detection
- Generating Videos with Scene Dynamics
- Grammar Induction
- High Resolution Image Synthesis
- Human Face Recognition
- Image Classification and Localization
- Image Captioning
- Image Generation
- Image Manipulation
- Image Processing
- Image Recognition
- Image Restoration
- Image Segmentation
- Image to Image Translation
- Increase Image Resolution
- Inductive Learning
- Integrated Recognition
- Intelligent Agents
- Intelligent Assistants
- Iris Classification Problem
- Labeling Images
- Language Generation
- Language Understanding
- Machine Reading
- Machine Translation
- Mobile Vision
- Multi-document Summarization
- Named Entity Recognition
- Natural Language Processing
- Object Detection
- Object Detection from Video
- Paraphrase Detection
- Pattern Recognition
- Question Answering
- Real time object recognition
- Recommendation Systems
- Scene Classification
- Scene Parsing
- Sequence Labeling
- Self Driving Cars
- Semantic Folding
- Semantic Parsing
- Semantic Segmentation
- Sentiment Analysis
- Sentiment Analysis for Short Texts
- Sequence to Sequence Learning
- Signature Verification Application
- Speech Processing
- Speech Recognition
- Speech Tagging
- Spell Checking
- Structured Prediction
- Super Resolution
- Text Categorization
- Text Classification
- Textual Entailment
- Text Recognition
- Thought Vector
- Transfer Learning
- Urban Computing
- Video-to-Video Synthesis
- Visual Recognition
- Voice Search
This glossary defines general machine learning terms as well as terms specific to TensorFlow.
- TensorFlow Website
- TensorFlow Guide
- TensorFlow Tutorials
- TensorFlow Model Zoo
- TensorFlow Twitter
- TensorFlow Blog
- TensorFlow Course at Stanford
- TensorFlow Roadmap
- TensorFlow White Papers
- TensorFlow YouTube Channel
- TensorFlow Visualization Toolkit
- Get Started with TensorFlow (Google's Tutorial)
- TensorFlow Debugger
- Get started with TensorFlow's High-Level APIs
- Eager execution
- Fast, flexible, and easy-to-use input pipelines
TensorBoard is a suite of web applications for inspecting and understanding your TensorFlow runs and graphs.
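As a concrete illustration, here is a minimal, hypothetical sketch that trains a toy tf.keras model and writes TensorBoard event files via the built-in TensorBoard callback; the model, the random data, and the `./logs` directory are placeholders, not part of any linked tutorial.

```python
# Minimal sketch: log a toy tf.keras training run for TensorBoard.
# Assumes TensorFlow (with the bundled tf.keras API) is installed;
# the data, model, and ./logs path are hypothetical placeholders.
import numpy as np
import tensorflow as tf

x = np.random.rand(256, 8).astype("float32")   # toy inputs
y = np.random.randint(0, 2, size=(256, 1))     # toy binary labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# The callback writes event files that the TensorBoard web UI can read.
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="./logs")
model.fit(x, y, epochs=5, batch_size=32, callbacks=[tensorboard_cb])
```

After the run, `tensorboard --logdir ./logs` starts the local web interface for browsing the logged scalars and the graph.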
A JavaScript library for training and deploying ML models inside the browser and on Node.js
- Main Page of TensorFlow.js
- Getting Started
- Setup TensorFlow.js - Browser Setup
- Tutorials & Guides
- API Reference
- Frequently Asked Questions
- Core Concepts in TensorFlow.js
TensorFlow Lite is the official solution for running machine learning models on mobile and embedded devices. It enables on-device machine learning inference with low latency and a small binary size on Android, iOS, and other operating systems.
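As an illustration of the typical workflow, here is a minimal sketch that converts a trained model to the .tflite flatbuffer format for on-device inference; it assumes a TensorFlow SavedModel already exists at the hypothetical path `./saved_model`.

```python
# Minimal sketch: convert a SavedModel to a TensorFlow Lite flatbuffer.
# Assumes TensorFlow is installed and that ./saved_model (a hypothetical
# path) contains an exported SavedModel.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("./saved_model")
tflite_model = converter.convert()

# Write the flatbuffer so it can be bundled with a mobile or embedded app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```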
Google's fast-paced, practical introduction to machine learning
Open source neural network library written in Python. It is capable of running on top of TensorFlow, Microsoft Cognitive Toolkit, Theano, or PlaidML. Designed to enable fast experimentation with deep neural networks, it focuses on being user-friendly, modular, and extensible.
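As a quick taste of that user-friendly, modular style, here is a minimal sketch using the Keras functional API; it assumes the standalone keras package with one of the backends above installed, and the layer sizes are arbitrary placeholders.

```python
# Minimal sketch: a small binary classifier built with the Keras functional API.
# Assumes the standalone keras package and a backend such as TensorFlow;
# layer sizes and the 20-feature input shape are arbitrary placeholders.
from keras.layers import Input, Dense
from keras.models import Model

inputs = Input(shape=(20,))                       # 20 input features
hidden = Dense(64, activation="relu")(inputs)     # one hidden layer
outputs = Dense(1, activation="sigmoid")(hidden)  # binary output

model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()  # prints the layer-by-layer architecture
```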
- Keras Documentation
- Keras Google group
- Keras Slack channel
- Keras functional API
- Keras (Wikipedia)
- Keras on GitHub - Deep Learning for humans
- Keras - TensorFlow Guide
- Keras - Learn and use machine learning - TensorFlow Tutorial
Python is a dynamically typed programming language designed by Guido van Rossum. Much like the programming language Ruby, Python was designed to be easily read by programmers. Because of its large following and many libraries, Python can be used for anything from web pages to scientific research.
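A toy illustration of the dynamic typing mentioned above: names are bound to objects at runtime and can be rebound to values of a different type without declarations.

```python
# Toy example of dynamic typing: the same name is rebound to different types.
value = 42
print(type(value))      # <class 'int'>

value = "forty-two"     # no type declaration needed to rebind
print(type(value))      # <class 'str'>
```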
- Python.org
- Get Started
- Python 3.7 source code and installers
- Documentation for Python's standard library
- The Python Wiki
- Python Forums
- Python Conferences and Workshops
- Python Developer’s Guide
- Audio/Video Talks about Python
- Python Frequently Asked Questions
- Python Enhancement Proposals (PEPs)
- Python Books
- Python Documentation
- Python Brochure
- Python Source Releases
- Python Setup and Usage
- Python HOWTOs
- Installing Python Modules
- Extending and Embedding
- Python/C API
- What's new in Python 3.7?
- Python Glossary
Describes the standard library that is distributed with Python. It also describes some of the optional components that are commonly included in Python distributions
- Python Standard Library 3.0
- Python Global Module Index
- Python General Index - all functions, classes, terms
- Matplotlib 3.0.2 documentation
- Matplotlib 3.0.2 User's Guide
A reference manual describing the syntax and core semantics of the language.
PyPI is a repository of software for the Python programming language. It helps you find and install software developed and shared by the Python community, and package authors use it to distribute their software. Learn about installing packages and how to package your Python code for PyPI.
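As an illustration of the packaging side, here is a minimal, hypothetical setup.py sketch using setuptools; the package name, version, and dependencies are placeholders, not a real project.

```python
# Minimal, hypothetical setup.py for packaging a project for PyPI.
# All metadata below is placeholder text, not a real package.
from setuptools import setup, find_packages

setup(
    name="example_pkg",                # hypothetical package name
    version="0.1.0",
    description="A small example package",
    packages=find_packages(),          # include all Python packages in the tree
    install_requires=["numpy"],        # runtime dependencies, if any
)
```

Distributions are then built with `python setup.py sdist bdist_wheel` and uploaded with a tool such as twine.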
The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more. It is an interactive computing environment built on a set of open standards for interactive computing.
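As a small illustration of the notebook document format referenced below, here is a hedged sketch that builds an .ipynb file programmatically with the nbformat package; the cell contents and filename are hypothetical.

```python
# Minimal sketch: create a notebook file programmatically with nbformat.
# Assumes the nbformat package is installed; cell contents and the output
# filename are hypothetical placeholders.
import nbformat
from nbformat.v4 import new_notebook, new_code_cell, new_markdown_cell

nb = new_notebook()
nb.cells.append(new_markdown_cell("# A generated notebook"))
nb.cells.append(new_code_cell("print('hello from a code cell')"))

with open("example.ipynb", "w") as f:
    nbformat.write(nb, f)
# The file can then be opened with: jupyter notebook example.ipynb
```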
- Jupyter Notebook Homepage
- https://jupyter.org/documentation
- The Jupyter Notebook Stable Version
- Installing Jupyter Notebook
- The Jupyter Notebook Documentation
- Notebook Document Format
- Jupyter Notebook Viewer
- Notebook Widgets
- Jupyter widgets User Guide
- nbconvert - Convert Notebooks to other formats
- Making kernels for Jupyter
- Notebook Basics
- Markdown Cells
- Jupyter Configuration Overview
- Distributing Jupyter Extensions as Python Packages
- GitHub Reference implementation of the Jupyter Notebook format
- Notebook Examples
- Simple Widget Introduction
- What to do when things go wrong
- Making a Notebook release
- (GitHub) IPython Interactive Computing and Visualization Cookbook, Second Edition
- Mining the Social Web, 2nd Edition
- Python for Signal Processing - Featuring IPython Notebook
- (GitHub) IPython in-depth tutorial
- Integrating Machine Learning in Jupyter Notebooks
- Notebooks for Mining the Social Web Book
- Nbviewer for Python for Signal Processing Book
- Notebooks for "Python for Signal Processing" book
- (GitHub) Official repository for IPython itself
A collection of awesome markdown goodies (libraries, services, editors, tools, cheatsheets, etc.)