
HORUS: Holistic Operational Reality for Unified Systems 🚀


📚 Table of Contents

  • 🔍 Overview
  • 🌟 Features
  • 🛠 Installation
  • 🎮 Usage
  • 🚧 Project Roadmap
  • 🤝 Contributing
  • 📝 License
  • 📬 Contact
  • 💡 Acknowledgments

🔍 Overview

HORUS (Holistic Operational Reality for Unified Systems) is an innovative Mixed Reality (MR) application developed for the Meta Quest 3 headset. It provides a comprehensive solution for managing teams of heterogeneous robots in various environments, particularly disaster scenarios.

Current Version: 0.0.1

This initial release focuses on teleoperation of wheeled robots (ROSbots). Future updates will add support for heterogeneous robot teams, including legged (Spot) and aerial (Airvolute and DJI Tello) robots.


🌟 Features

  • 🥽 Mixed reality interface for robot control and team management.
  • 🤖 Multi-robot task allocation and management.
  • 📡 Real-time sensor data visualization.
  • 👋 Gesture-based controls for intuitive robot interaction.
  • 🚗 Teleoperation Modes:
    • Minimap (Ground Station) Mode
    • Semi-Immersive Mode
    • Full Immersion Mode
  • 🎥 Flexible camera visualization.
  • 🧑‍💻 Optimized for the Meta Quest 3 headset.

🛠 Installation

  1. Download the latest APK from the Releases section.
  2. Install the APK on your Meta Quest 3 headset using SideQuest or your preferred method for sideloading apps.

Tip

Need help with sideloading? Check out the Meta Support Documentation.
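
If you prefer the command line to SideQuest, an APK can also be sideloaded with adb. The sketch below is a minimal example, not an official HORUS script: it assumes adb is on your PATH and that developer mode is enabled on the headset, and it uses a hypothetical APK filename; substitute the file you downloaded from Releases.

```python
# Minimal sideloading sketch: installs an APK on a connected Quest 3 via adb.
# Assumes adb is installed, developer mode is enabled on the headset, and
# the headset is plugged in over USB. The APK filename below is hypothetical.
import subprocess

APK = "horus-0.0.1.apk"  # hypothetical name; use the file from Releases

# Confirm the headset is visible to adb before installing.
devices = subprocess.run(
    ["adb", "devices"], capture_output=True, text=True, check=True
)
print(devices.stdout)

# "-r" reinstalls over an existing copy, keeping app data.
subprocess.run(["adb", "install", "-r", APK], check=True)
```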


🎮 Usage

Prerequisites

  1. Install the HORUS Bridge application on your laptop, where the ROS master will be launched. Visit the HORUS Bridge GitHub repository for installation instructions.
  2. Ensure that both your laptop running HORUS Bridge and the Meta Quest 3 headset are on the same network.
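
Before launching anything on the headset, it can help to confirm the two devices can actually reach each other. The snippet below is a rough connectivity probe, not part of HORUS: the port 9090 is an assumption (the rosbridge default); check the HORUS Bridge repository for the port it actually uses.

```python
# Rough connectivity probe: can this machine reach the HORUS Bridge host?
# Port 9090 is an assumption (the rosbridge default); check the HORUS
# Bridge README for the port it actually uses.
import socket

BRIDGE_IP = "192.168.1.42"  # replace with the IP from the HORUS Bridge log
BRIDGE_PORT = 9090          # assumed rosbridge default, not confirmed

try:
    with socket.create_connection((BRIDGE_IP, BRIDGE_PORT), timeout=3):
        print(f"Bridge reachable at {BRIDGE_IP}:{BRIDGE_PORT}")
except OSError as err:
    print(f"Cannot reach {BRIDGE_IP}:{BRIDGE_PORT}: {err}")
```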

Setup

  1. Launch the HORUS Bridge on your laptop following the instructions in the HORUS Bridge README.
  2. Set up and connect all robots for the interface. Instructions are provided in the HORUS Bridge repository.

Using the HORUS Application

  1. Put on your Meta Quest 3 headset.
  2. Navigate to your installed apps and launch HORUS.
  3. Enter the IP address displayed in the HORUS Bridge log (on your laptop) into the HORUS application login page.
  4. Start Managing Robots:
    • Draw a workspace in the MR environment to create a minimap.
    • Use the minimap to:
      • View robot status.
      • Visualize sensor data.
      • Allocate tasks.
      • Initiate teleoperation.
  5. Teleoperation Modes:
    • 🗺 Minimap Mode: Navigate robots using a 2D overhead view.
    • 🎥 Semi-Immersive Mode: View robot camera feeds on a large virtual screen.
    • 🔍 Full Immersion Mode: Experience a direct video feed from the robot's front camera.
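
For context on what these modes drive under the hood: teleoperating a wheeled ROS robot generally comes down to publishing velocity commands. The sketch below is illustrative only, not HORUS source code; the /cmd_vel topic name and the velocity values are assumptions (common ROSbot defaults).

```python
#!/usr/bin/env python
# Illustrative teleoperation sketch (not HORUS source code): drives a
# wheeled ROS robot by publishing geometry_msgs/Twist velocity commands.
# The "/cmd_vel" topic is an assumption, a common ROSbot default.
import rospy
from geometry_msgs.msg import Twist


def drive_forward(speed=0.2, duration=2.0):
    """Publish a constant forward velocity for `duration` seconds, then stop."""
    rospy.init_node("horus_teleop_sketch")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rospy.sleep(1.0)       # give subscribers time to connect before publishing
    rate = rospy.Rate(10)  # 10 Hz, a typical command rate

    cmd = Twist()
    cmd.linear.x = speed  # m/s forward; setting angular.z would turn the robot

    end = rospy.Time.now() + rospy.Duration(duration)
    while rospy.Time.now() < end and not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

    pub.publish(Twist())  # an all-zero Twist stops the robot


if __name__ == "__main__":
    drive_forward()
```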

🚧 Project Roadmap

Year 1 (Current)

  • ✅ Develop core HORUS interface in Unity.
  • ✅ Implement teleoperation modes for wheeled robots.
  • ⬜ Complete multi-robot management for wheeled robots.
  • ⬜ Conduct initial user testing and refine the interface.

Year 2

  • ⬜ Extend support for legged (Spot) and aerial (Airvolute, DJI Tello) robots.
  • ⬜ Implement advanced trajectory planning and debugging tools.
  • ⬜ Develop multi-operator functionality.
  • ⬜ Integrate an AI copilot system for operator assistance.

Year 3

  • ⬜ Develop and implement collaborative strategies for heterogeneous robot teams.
  • ⬜ Conduct extensive experimental validation.
  • ⬜ Optimize the system based on experimental results.
  • ⬜ Open-source the HORUS application and release an SDK.

🤝 Contributing

We welcome contributions to the HORUS project! Please read our Contributing Guidelines for more information on how to get started.

Note

Contributions can include new features, bug fixes, and documentation improvements.


📝 License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.


📬 Contact

For questions or support, please contact:


💡 Acknowledgments

This project is part of a PhD research project at the University of Genoa, under the supervision of:


Developed by

RICE Lab at the University of Genoa