Real-time air hockey robot controlled by a convolutional neural network.
- Air Hockey Game Simulator is implemented using the Pygame library and generates images that are as close as possible to the frames captured by an Android phone during an actual game. The simulator models real-life physics and serves two main purposes:
  - Produce frames labeled with the action of a programmed AI in order to pretrain the convolutional layers of the neural network (see the pretraining sketch after this list).
- Act as an environment for a Reinforcement Learning Agent.
- gym-air-hockey is an OpenAI Gym environment wrapper around the Air Hockey Game Simulator. It determines the rewards and processes actions and observations (see the environment sketch after this list).
- Perception is an Android application that controls the robot during the game. It captures and processes the frames, runs CNN inference on them, and sends the predicted action to the Arduino via Bluetooth LE.
- This repository contains scripts used to generate labeled frames, pretrain the CNN using supervised learning, further train the CNN using reinforcement learning, convert the Keras model to Caffe2 using the keras-to-caffe2 converter, and visualize CNN filters and layer activations using unveiler.
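The supervised pretraining step described above could look roughly like the sketch below: a small Keras CNN trained on simulator frames labeled with the programmed AI's actions. The frame shape, layer sizes, and nine-way action space are illustrative assumptions, not the repository's actual architecture.

```python
# Illustrative sketch of pretraining a CNN on labeled simulator frames.
# Shapes, layer sizes, and the action count are assumptions.
import numpy as np
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense

n_actions = 9                                   # assumed discrete action count
frames = np.random.rand(1000, 128, 128, 3)      # stand-in for simulator frames
labels = np.random.randint(0, n_actions, 1000)  # stand-in for AI action labels

model = Sequential([
    Conv2D(16, 5, strides=2, activation='relu', input_shape=(128, 128, 3)),
    Conv2D(32, 3, strides=2, activation='relu'),
    Flatten(),
    Dense(128, activation='relu'),
    Dense(n_actions, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(frames, labels, epochs=5, batch_size=32)
```

For the reinforcement learning stage, the gym-air-hockey wrapper exposes the simulator through the standard Gym interface. The snippet below steps through the environment with random actions; the environment id `AirHockey-v0` and the module name `gym_air_hockey` are assumptions based on the repository layout, so check the package's registration code for the actual names.

```python
# Minimal sketch of driving the simulator through the Gym wrapper.
# The environment id and module name are assumptions.
import gym
import gym_air_hockey  # noqa: F401  (assumed to register the environment)

env = gym.make('AirHockey-v0')
observation = env.reset()

for _ in range(1000):
    action = env.action_space.sample()                  # random paddle action
    observation, reward, done, info = env.step(action)  # classic Gym API
    if done:
        observation = env.reset()

env.close()
```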
Screenshot of Frames Generated Using Air Hockey Game Simulator
Screenshot of Frames Captured Using Perception
- Python 3
- Android Studio
- keras-to-caffe2
```bash
git clone --recursive https://github.com/arakhmat/deep-learning-air-hockey-robot
cd deep-learning-air-hockey-robot
cd air-hockey; pip install -e .; cd ..;
cd gym-air-hockey; pip install -e .; cd ..;
```
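After installation, a quick import check along these lines can confirm that both editable installs are on the Python path; the module names are assumed to follow the directory names, so adjust them if they differ.

```python
# Sanity check for the two editable installs.
# Module names are assumptions based on the directory names.
import air_hockey
import gym_air_hockey

print('air-hockey and gym-air-hockey imported successfully')
```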