- 6D Rotation Matrix Representation (see the sketch after this list)
- High Performance
- Easy Integration
- Customizability
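The 6D representation follows the continuous rotation parameterization of Zhou et al. (CVPR 2019): the network regresses two 3-vectors that are orthonormalized into a full rotation matrix. A minimal sketch of that mapping (the function name is illustrative, not the repository's API):

```python
import torch
import torch.nn.functional as F

def rotation_matrix_from_6d(x6d: torch.Tensor) -> torch.Tensor:
    """Map a (..., 6) tensor to (..., 3, 3) rotation matrices via
    Gram-Schmidt orthogonalization of the two predicted 3-vectors."""
    a1, a2 = x6d[..., :3], x6d[..., 3:]
    b1 = F.normalize(a1, dim=-1)
    # Subtract the component of a2 along b1, then normalize.
    b2 = F.normalize(a2 - (b1 * a2).sum(-1, keepdim=True) * b1, dim=-1)
    b3 = torch.cross(b1, b2, dim=-1)
    # Stack as columns so that R @ v rotates a vector v.
    return torch.stack((b1, b2, b3), dim=-1)
```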
**The project is structured into four main files:**
- main.py: The entry point of the project, orchestrating the training, evaluation, and prediction processes.
- nets.py: Contains the neural network model definitions and architectures.
- datasets.py: Manages dataset handling, including loading, preprocessing, and augmentation.
- util.py: Provides utility functions for data manipulation, visualization, logging, and other support tasks.
The models achieve the following Mean Absolute Error (MAE), in degrees, for each pose angle (a sketch of the metric computation follows the table):
| Backbone | Epochs | Pitch | Yaw | Roll | Params (M) | FLOPS (B) | Pretrained weights |
|---|---|---|---|---|---|---|---|
| RepVGG-A0 | 90 | 5.4 | 4.3 | 3.8 | 9.1 | 1.5 | model |
| RepVGG-A1 | 90 | 5.2 | 3.9 | 3.7 | 14 | 2.6 | model |
| RepVGG-A2 | 90 | 4.8 | 3.7 | 3.4 | 28.2 | 5.7 | model |
| RepVGG-B0 | 90 | 5.0 | 3.9 | 3.5 | 15.8 | 3.4 | model |
| RepVGG-B1 | 90 | 5.0 | 3.9 | 3.5 | 57.4 | 13.1 | model |
| RepVGG-B1G2 | 90 | 4.9 | 3.6 | 3.4 | 45.7 | 9.8 | model |
| RepVGG-B1G4 | 90 | 4.8 | 3.6 | 3.4 | 39.9 | 8.1 | model |
| RepVGG-B2 | 90 | 4.78 | 3.4 | 3.3 | 89 | 20.4 | model |
| RepVGG-B2G4 | 90 | 4.8 | 3.6 | 3.4 | 61.7 | 12.6 | model |
| RepVGG-B3 | 90 | 4.9 | 3.6 | 3.4 | 123 | 29.2 | model |
| RepVGG-B3G4 | 90 | 4.8 | 3.7 | 3.4 | 83.8 | 17.9 | model |
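The per-angle MAE above is obtained by converting predicted and ground-truth rotation matrices back to Euler angles and averaging the absolute differences. A hedged sketch of such a metric (the helper name and the 'xyz' axis order are assumptions; the project's own conversion in util.py may use a different convention):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def euler_mae_deg(R_pred: np.ndarray, R_true: np.ndarray) -> np.ndarray:
    """MAE in degrees per angle, given batches of predicted and
    ground-truth rotation matrices of shape (N, 3, 3)."""
    e_pred = Rotation.from_matrix(R_pred).as_euler('xyz', degrees=True)
    e_true = Rotation.from_matrix(R_true).as_euler('xyz', degrees=True)
    err = np.abs(e_pred - e_true)
    # Wrap differences larger than 180 degrees back into [0, 180].
    err = np.minimum(err, 360.0 - err)
    return err.mean(axis=0)  # one MAE value per angle
```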
- Clone the repository
- Create a Conda environment using the environment.yml file:
conda env create -f environment.yml
- Activate the Conda environment:
conda activate HPE
- Download the 300W-LP and AFLW2000 datasets:
  - Download each dataset from its official project page.
  - Place the downloaded datasets into a directory named 'Datasets' (see the annotation-reading sketch after this list).
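Both 300W-LP and AFLW2000 ship one .mat annotation file per image, with the ground-truth Euler angles stored in the 'Pose_Para' field (pitch, yaw, roll in radians, followed by other parameters). A small reading sketch (the helper name and example path are illustrative only):

```python
import numpy as np
from scipy.io import loadmat

def read_pose_from_mat(mat_path: str) -> np.ndarray:
    """Return (pitch, yaw, roll) in radians from a 300W-LP / AFLW2000
    annotation file, taken from the first three 'Pose_Para' entries."""
    mat = loadmat(mat_path)
    pose = mat['Pose_Para'][0][:3]
    return np.asarray(pose, dtype=np.float32)

# Example (path is illustrative):
# pose = read_pose_from_mat('Datasets/AFLW2000/image00002.mat')
```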
To initiate the training process, follow the steps below (a sketch of a typical rotation loss follows the commands):
- Set your dataset path in main.py.
- Set the model name (default: A2) in main.py.
- Run the command below for single-GPU training:
python main.py --train
- Run the command below for multi-GPU training, where $ is the number of GPUs:
bash main.sh $ --train
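Rotation-matrix predictions such as the 6D outputs above are commonly supervised with a geodesic loss, i.e. the angle of the relative rotation between prediction and ground truth. A minimal sketch of that loss, not taken from the repository:

```python
import torch

def geodesic_loss(R_pred: torch.Tensor, R_true: torch.Tensor) -> torch.Tensor:
    """Mean geodesic distance between two batches of rotation matrices
    (N, 3, 3): the rotation angle of R_pred^T @ R_true."""
    m = torch.bmm(R_pred.transpose(1, 2), R_true)
    trace = m[:, 0, 0] + m[:, 1, 1] + m[:, 2, 2]
    # Clamp for numerical stability before acos.
    cos = torch.clamp((trace - 1.0) / 2.0, -1.0 + 1e-7, 1.0 - 1e-7)
    return torch.acos(cos).mean()
```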
- Configure your dataset path in main.py for testing.
- Run the command below, where $ is the number of GPUs:
bash main.sh $ --test
- Configure your video path in main.py for the demo visualization (a drawing sketch follows the command below).
- Run the command below:
python main.py --demo
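The demo overlays the predicted head orientation on each frame. A rough sketch of drawing the pose axes with OpenCV (function name, orthographic projection, and colors are illustrative; the repository's own visualization in util.py may differ):

```python
import cv2
import numpy as np

def draw_pose_axes(frame: np.ndarray, R: np.ndarray,
                   center: tuple, size: int = 80) -> np.ndarray:
    """Draw the head's x/y/z axes (columns of rotation matrix R) from
    `center`, using a simple orthographic projection onto the image."""
    cx, cy = center
    colors = ((0, 0, 255), (0, 255, 0), (255, 0, 0))  # BGR for x, y, z
    for axis, color in zip(R.T, colors):
        tip = (int(cx + size * axis[0]), int(cy + size * axis[1]))
        cv2.line(frame, (cx, cy), tip, color, 2)
    return frame
```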