The Multimodal Android Quiz Application is an interactive quiz app that lets users select answers with hand gestures. It combines Figma for UI/UX design, React Native for the front end, MongoDB for data storage, Express and Node.js for the backend, and Python for the machine learning models, which are trained as Convolutional Neural Networks (CNNs) and exposed to the app through APIs.
The objectives of the project are as follows:
- Develop a user-friendly and visually appealing UI/UX design using Figma.
- Implement the front-end of the application using React Native.
- Manage the database using MongoDB for efficient data storage and retrieval.
- Build the backend using Express and Node.js to handle application logic and API integration.
- Utilize Python for training machine learning models for hand gesture recognition.
- Integrate the machine learning models with APIs to enable real-time quiz answer selection.
- Train the models as CNNs to achieve high gesture-classification accuracy.
- Implement hand gesture recognition to allow users to select quiz answers using hand gestures.
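The core operation a CNN layer performs when extracting hand-shape features from a camera frame can be sketched in plain NumPy. This is an illustrative toy example, not the project's actual model: the frame, kernel, and sizes are made up for demonstration.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: the basic operation a CNN layer
    applies to an input frame to extract local features (e.g. edges
    of a hand silhouette)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Slide the kernel over the image and take the weighted sum
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 6x6 "frame" and a 3x3 vertical-edge kernel (illustrative values)
frame = np.arange(36, dtype=float).reshape(6, 6)
edge_kernel = np.array([[1.0, 0.0, -1.0]] * 3)
features = conv2d(frame, edge_kernel)
print(features.shape)  # (4, 4)
```

In the real pipeline, many such learned kernels are stacked into layers and trained end to end; frameworks such as TensorFlow or PyTorch implement this operation efficiently on batches of frames.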
Demo Video Link: https://drive.google.com/drive/folders/1KCmJev2TkMfPkxZ5iwlgKbUSJdSevLQR?usp=sharing
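The API-integration objective above amounts to mapping the CNN's predicted gesture label to a quiz option on the backend. A minimal sketch of that contract follows; the label names and the mapping are assumptions for illustration, not the project's actual schema.

```python
# Hypothetical gesture-to-option mapping; the real label set would come
# from the trained CNN's output classes.
GESTURE_TO_OPTION = {
    "one_finger": "A",
    "two_fingers": "B",
    "three_fingers": "C",
    "four_fingers": "D",
}

def select_answer(gesture_label: str) -> str:
    """Map a recognized hand gesture to the quiz answer it selects.

    Raises ValueError for labels outside the supported set, so the
    caller can prompt the user to repeat the gesture.
    """
    try:
        return GESTURE_TO_OPTION[gesture_label]
    except KeyError:
        raise ValueError(f"Unrecognized gesture: {gesture_label}")

print(select_answer("two_fingers"))  # B
```

In the deployed app, a function like this would sit behind an Express (or Python) API endpoint that the React Native client calls with the model's prediction for each frame.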