Perception

Android application for controlling an air hockey robot in real time using a Convolutional Neural Network (a submodule of Deep Learning Air Hockey Robot)

How it works

The application infers the action that the robot should take by looking at the last 3 frames captured by the camera. The inferred action is then sent to the Arduino over Bluetooth LE.
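
To make the control loop concrete, here is a minimal sketch of how such a loop could look, assuming a rolling buffer of the last 3 frames and a single-byte action code written to a BLE characteristic. All names (ActionController, predictAction, and the action encoding) are illustrative assumptions rather than the app's actual code; predictAction stands in for the Caffe2 forward pass, which would run in native code behind a JNI call.

    import android.bluetooth.BluetoothGatt;
    import android.bluetooth.BluetoothGattCharacteristic;

    import java.util.ArrayDeque;

    public class ActionController {
        private static final int FRAME_HISTORY = 3; // the network looks at the last 3 frames

        private final ArrayDeque<float[]> frames = new ArrayDeque<>(FRAME_HISTORY);
        private final BluetoothGatt gatt;
        private final BluetoothGattCharacteristic actionCharacteristic;

        public ActionController(BluetoothGatt gatt,
                                BluetoothGattCharacteristic actionCharacteristic) {
            this.gatt = gatt;
            this.actionCharacteristic = actionCharacteristic;
        }

        // Called once per preprocessed camera frame.
        public void onFrame(float[] rgbFrame) {
            if (frames.size() == FRAME_HISTORY) {
                frames.removeFirst(); // drop the oldest frame
            }
            frames.addLast(rgbFrame);
            if (frames.size() < FRAME_HISTORY) {
                return; // wait until 3 frames have been captured
            }

            // Stack the 3 frames into a single network input.
            float[] input = new float[FRAME_HISTORY * rgbFrame.length];
            int offset = 0;
            for (float[] frame : frames) {
                System.arraycopy(frame, 0, input, offset, frame.length);
                offset += frame.length;
            }

            // Index of the highest-scoring action; the forward pass itself
            // would run in native (Caffe2) code behind this JNI call.
            int action = predictAction(input);
            sendAction(action);
        }

        private native int predictAction(float[] stackedFrames);

        // Write the action as a single byte to the Arduino over Bluetooth LE.
        private void sendAction(int action) {
            actionCharacteristic.setValue(new byte[]{(byte) action});
            gatt.writeCharacteristic(actionCharacteristic);
        }
    }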

Predictions are made by a convolutional neural network. The network is pretrained on labeled frames generated with the Air Hockey Game Simulator, then trained via DDQN using gym-air-hockey as the environment. Finally, the trained model is converted from Keras to Caffe2 using the keras-to-caffe2 converter.

Screenshots

Launched application

Menu used to connect to a BLE device

Ready to start

During the game (Top prediction is to go southeast (bottom right) ☺)

Prerequisites

Android Studio

Download

git clone https://github.com/arakhmat/perception 

Build

Import the project into Android Studio and it will build automatically.

Limitations

  • The application was tested only on the Sony Xperia m4 Aqua. On other devices it will most likely not convert raw YUV data to an RGB image correctly, because the YUV data on the Xperia m4 Aqua has an unusual format (see the sketch below for a generic conversion).
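
For reference, the sketch below shows a generic camera2 YUV_420_888-to-RGB conversion that honours the row and pixel strides reported for each plane. It is an illustration of the standard layout, not the app's actual conversion code; a device with a non-standard plane layout, as described above, would still need device-specific handling.

    import android.media.Image;

    import java.nio.ByteBuffer;

    public final class YuvToRgb {

        // Converts one camera2 YUV_420_888 frame to packed ARGB pixels,
        // assuming the standard plane layout described by getRowStride()
        // and getPixelStride().
        public static int[] convert(Image image) {
            int width = image.getWidth();
            int height = image.getHeight();

            Image.Plane yPlane = image.getPlanes()[0];
            Image.Plane uPlane = image.getPlanes()[1];
            Image.Plane vPlane = image.getPlanes()[2];

            ByteBuffer yBuf = yPlane.getBuffer();
            ByteBuffer uBuf = uPlane.getBuffer();
            ByteBuffer vBuf = vPlane.getBuffer();

            int[] argb = new int[width * height];

            for (int row = 0; row < height; row++) {
                for (int col = 0; col < width; col++) {
                    int y = yBuf.get(row * yPlane.getRowStride()
                            + col * yPlane.getPixelStride()) & 0xFF;

                    // Chroma planes are subsampled 2x2.
                    int uvRow = row / 2;
                    int uvCol = col / 2;
                    int u = (uBuf.get(uvRow * uPlane.getRowStride()
                            + uvCol * uPlane.getPixelStride()) & 0xFF) - 128;
                    int v = (vBuf.get(uvRow * vPlane.getRowStride()
                            + uvCol * vPlane.getPixelStride()) & 0xFF) - 128;

                    // BT.601 YUV -> RGB.
                    int r = clamp((int) (y + 1.402f * v));
                    int g = clamp((int) (y - 0.344f * u - 0.714f * v));
                    int b = clamp((int) (y + 1.772f * u));

                    argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
                }
            }
            return argb;
        }

        private static int clamp(int value) {
            return Math.max(0, Math.min(255, value));
        }
    }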

Acknowledgments
