Android application for controlling an air hockey robot in real time using a convolutional neural network (submodule of the Deep Learning Air Hockey Robot project)
The application infers the action that the robot should take by looking at the last 3 frames captured by the camera. The inferred action is then sent to the Arduino via Bluetooth LE.
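The sketch below illustrates this loop under stated assumptions: it keeps the 3 most recent frames, asks a placeholder predictor for an action, and writes a single action byte to an HM-10-style BLE characteristic. The `FramePredictor` interface and the UUIDs are assumptions (HM-10 modules commonly expose the FFE0/FFE1 pair, but verify against your module); none of this is the project's actual code.

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;
import android.bluetooth.BluetoothGattService;

import java.util.ArrayDeque;
import java.util.UUID;

public class ActionSender {

    // HM-10 modules commonly expose this service/characteristic pair (assumed here).
    private static final UUID SERVICE_UUID =
            UUID.fromString("0000FFE0-0000-1000-8000-00805F9B34FB");
    private static final UUID CHARACTERISTIC_UUID =
            UUID.fromString("0000FFE1-0000-1000-8000-00805F9B34FB");

    private final ArrayDeque<float[]> lastFrames = new ArrayDeque<>(3);

    /** Placeholder for whatever runs the convolutional network (hypothetical). */
    public interface FramePredictor {
        byte predict(float[][] stackedFrames);
    }

    public void onFrame(float[] rgbFrame, FramePredictor predictor, BluetoothGatt gatt) {
        // Keep only the 3 most recent frames.
        if (lastFrames.size() == 3) {
            lastFrames.removeFirst();
        }
        lastFrames.addLast(rgbFrame);
        if (lastFrames.size() < 3) {
            return; // Not enough history yet.
        }

        // Infer the action from the stacked frames (placeholder call).
        byte action = predictor.predict(lastFrames.toArray(new float[0][]));

        // Write the single-byte action to the BLE characteristic
        // (pre-API-33 BluetoothGatt calls).
        BluetoothGattService service = gatt.getService(SERVICE_UUID);
        if (service == null) {
            return;
        }
        BluetoothGattCharacteristic characteristic =
                service.getCharacteristic(CHARACTERISTIC_UUID);
        characteristic.setValue(new byte[] { action });
        gatt.writeCharacteristic(characteristic);
    }
}
```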
Predictions are made using a convolutional neural network. The network is pretrained on labeled frames generated by the Air Hockey Game Simulator, then trained via DDQN with gym-air-hockey as the environment. Finally, the model is converted from Keras to Caffe2 using the keras-to-caffe2 converter.
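On the Android side, the converted Caffe2 model is typically loaded through a JNI bridge, following the pattern of the AI Camera demo listed below. The sketch here only illustrates that pattern; the native library name, the native method names, and the asset file names (`init_net.pb`, `predict_net.pb`) are assumptions and must be matched to the actual native code in this project.

```java
import android.content.res.AssetManager;

public class Caffe2Predictor {

    static {
        // Name of the native library bundled with the app (assumed).
        System.loadLibrary("native-lib");
    }

    // Native functions implemented in C++ that load init_net.pb / predict_net.pb
    // from assets and run the network on a stack of 3 frames (assumed signatures).
    private native void initCaffe2(AssetManager assetManager);
    private native int predictFromCaffe2(float[] stackedFrames, int height, int width);

    public void init(AssetManager assetManager) {
        initCaffe2(assetManager);
    }

    public int predict(float[] stackedFrames, int height, int width) {
        return predictFromCaffe2(stackedFrames, height, width);
    }
}
```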
```
git clone https://github.com/arakhmat/perception
```
Import the project into Android Studio and it will build automatically.
- The application has been tested only on a Sony Xperia M4 Aqua. On other devices it will most likely not convert the raw YUV data to an RGB image correctly, because the Xperia M4 Aqua delivers YUV data in an unusual format; a rough sketch of the standard conversion is shown below for comparison.
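The following is a minimal sketch of the standard NV21-to-RGB conversion, assuming the camera delivers NV21 frames; other devices may instead produce YV12 or `YUV_420_888` buffers with row/pixel strides, which is exactly why device-specific handling matters here. The class and method names are illustrative, not the project's code.

```java
public final class YuvConverter {

    /** Converts an NV21 byte array to packed ARGB_8888 pixels. */
    public static int[] nv21ToArgb(byte[] nv21, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;

        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int y = nv21[row * width + col] & 0xFF;
                // Chroma is subsampled 2x2; in NV21 the V byte precedes the U byte.
                int chromaIndex = frameSize + (row / 2) * width + (col & ~1);
                int v = (nv21[chromaIndex] & 0xFF) - 128;
                int u = (nv21[chromaIndex + 1] & 0xFF) - 128;

                int r = clamp((int) (y + 1.370705f * v));
                int g = clamp((int) (y - 0.698001f * v - 0.337633f * u));
                int b = clamp((int) (y + 1.732446f * u));

                argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int value) {
        return Math.max(0, Math.min(255, value));
    }
}
```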
- AI Camera - Demonstration of using Caffe2 inside an Android application
- Android Bluetooth Low Energy (BLE) Example
- How to Communicate with a Custom BLE using an Android App
- HM-10 Bluetooth 4 BLE Modules