Through SignSense, we aim to make it possible for disabled people to communicate effectively by detecting and translating sign language based on the American Sign Language (ASL) system.
## Table of contents
- [General info](#general-info)
- [Technologies](#technologies)
- [Usage](#usage)
- [Project Status](#project-status)
- [Contributors](#contributors)
## General info
This project was created as part of the BitBox Hackathon.

**Problem statement:** Suppose you have a friend named John who has aphasia, a disorder that affects his communication and makes it difficult for him to speak verbally.

**How we plan to solve it:** Through SignSense, other people will be able to understand the sign language John uses, and at the same time John will have an application that translates his sign language into plain text.
## Technologies
The project is created with:
- Python
- OpenCV
- MediaPipe
- TensorFlow
- Pygame
- NumPy
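
The README does not include an install step, but assuming the usual PyPI package names for the libraries above, a typical environment setup would look like:

```
pip install opencv-python mediapipe tensorflow pygame numpy
```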
## Usage
The interface displays translated text for each hand gesture detected by the webcam installed on your device. We also provide text-to-speech conversion if the user wants their message relayed in audible form.
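
Below is a minimal sketch of what such a detection loop could look like, assuming the pipeline pairs MediaPipe's hand-landmark tracker with a TensorFlow classifier. The model file `gesture_model.h5` and the `LABELS` list are hypothetical placeholders, not artifacts from this repository:

```python
import cv2
import mediapipe as mp
import numpy as np
import tensorflow as tf

# Hypothetical trained classifier and label set -- adjust to the actual model.
model = tf.keras.models.load_model("gesture_model.h5")
LABELS = ["hello", "thanks", "yes", "no"]  # placeholder label set

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            hand = results.multi_hand_landmarks[0]
            mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
            # Flatten the 21 (x, y, z) landmarks into one feature vector.
            features = np.array(
                [[lm.x, lm.y, lm.z] for lm in hand.landmark]
            ).flatten()[np.newaxis, :]
            probs = model.predict(features, verbose=0)[0]
            text = LABELS[int(np.argmax(probs))]
            cv2.putText(frame, text, (10, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
        cv2.imshow("SignSense", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

For the text-to-speech path, one approach consistent with the listed Pygame dependency is to synthesize audio with gTTS and play it through Pygame's mixer; gTTS is an assumption here, since the README does not name a synthesis library:

```python
import time

import pygame
from gtts import gTTS  # assumed synthesis library, not listed in the README


def speak(text: str, outfile: str = "speech.mp3") -> None:
    """Convert recognised text to speech and play it aloud."""
    gTTS(text=text, lang="en").save(outfile)
    pygame.mixer.init()
    pygame.mixer.music.load(outfile)
    pygame.mixer.music.play()
    while pygame.mixer.music.get_busy():  # block until playback finishes
        time.sleep(0.1)


speak("hello")
```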
## Project Status
The project is still under development, as various hand gestures have yet to be trained and modelled. We are also working on updates and improvements for faster and more effective communication.
We aim to improve our project by implementing the following features:
- Build an app interface to make this technology more accessible.
- Translate hand gestures into more than one language to help a wider range of people.
- Recognise gestures that are heavily motion-based and hard to distinguish.

## Contributors
- Soham Kukreti
- Yuvraj Rathi
- Satyam Rathi
- Sanvi Sharma