HORUS (Holistic Operational Reality for Unified Systems) is an innovative Mixed Reality (MR) application developed for the Meta Quest 3 headset. It provides a comprehensive solution for managing teams of heterogeneous robots in various environments, particularly disaster scenarios.
This initial release focuses on teleoperation capabilities for wheeled robots (ROSbots). Future updates will support heterogeneous robot teams, including legged (Spot) and aerial robots (Airvolute and DJI Tello).
- 🥽 Mixed reality interface for robot control and team management.
- 🤖 Multi-robot task allocation and management.
- 📡 Real-time sensor data visualization.
- ✋ Gesture-based controls for intuitive robot interaction.
- 🚗 Teleoperation Modes:
- Minimap (Ground Station) Mode
- Semi-Immersive Mode
- Full Immersion Mode
- 🎥 Flexible camera visualization.
- 🧑‍💻 Optimized for Meta Quest 3 headset.
- Download the latest APK from the Releases section.
- Install the APK on your Meta Quest 3 headset using SideQuest or your preferred method for sideloading apps.
> [!TIP]
> Need help with sideloading? Check out the Meta Support Documentation.
- Install the HORUS Bridge application on your laptop, where the ROS master will be launched. Visit the HORUS Bridge GitHub repository for installation instructions.
- Ensure that both your laptop running HORUS Bridge and the Meta Quest 3 headset are on the same network.
- Launch the HORUS Bridge on your laptop following the instructions in the HORUS Bridge README.
- Set up and connect all robots for the interface. Instructions are provided in the HORUS Bridge repository.
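Once the bridge is running, it can help to confirm the headset's laptop is actually reachable before putting the headset on. A minimal sketch in Python, with the caveat that the port is an assumption (9090 is rosbridge's default; check the HORUS Bridge README for the actual port it listens on):

```python
import socket

# Assumed port — rosbridge's default. Verify against the HORUS Bridge README.
BRIDGE_PORT = 9090

def is_reachable(host: str, port: int = BRIDGE_PORT, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the address shown in the HORUS Bridge log
# print(is_reachable("192.168.1.42"))
```

If this returns `False` from another machine on the network, the usual suspects are a firewall on the laptop or the two devices sitting on different subnets (e.g. guest Wi-Fi).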
- Put on your Meta Quest 3 headset.
- Navigate to your installed apps and launch HORUS.
- Enter the IP address displayed in the HORUS Bridge log (on your laptop) into the HORUS application login page.
- Start Managing Robots:
- Draw a workspace in the MR environment to create a minimap.
- Use the minimap to:
- View robot status.
- Visualize sensor data.
- Allocate tasks.
- Initiate teleoperation.
- Teleoperation Modes:
- 🗺 Minimap Mode: Navigate robots using a 2D overhead view.
- 🎥 Semi-Immersive Mode: View robot camera feeds on a virtual large screen.
- 🔍 Full Immersion Mode: Experience a direct video feed from the robot's front camera.
- ✅ Develop core HORUS interface in Unity.
- ✅ Implement teleoperation modes for wheeled robots.
- ⬜ Complete multi-robot management for wheeled robots.
- ⬜ Conduct initial user testing and refine the interface.
- ⬜ Extend support for legged (Spot) and aerial (Airvolute, DJI Tello) robots.
- ⬜ Implement advanced trajectory planning and debugging tools.
- ⬜ Develop multi-operator functionality.
- ⬜ Integrate an AI copilot system for operator assistance.
- ⬜ Develop and implement collaborative strategies for heterogeneous robot teams.
- ⬜ Conduct extensive experimental validation.
- ⬜ Optimize the system based on experimental results.
- ⬜ Open-source the HORUS application and release an SDK.
We welcome contributions to the HORUS project! Please read our Contributing Guidelines for more information on how to get started.
> [!NOTE]
> Contributions can include new features, bug fixes, and documentation improvements.
This project is licensed under the Apache License 2.0. See the LICENSE file for details.
For questions or support, please contact:
This project is part of PhD research at the University of Genoa, conducted under the supervision of:
RICE Lab at the University of Genoa