HandDGP: Camera-Space Hand Mesh Prediction with Differentiable Global Positioning (ECCV 2024)

This is the reference PyTorch implementation for the method described in

HandDGP: Camera-Space Hand Mesh Prediction with Differentiable Global Positioning

Eugene Valassakis, Guillermo Garcia-Hernando

Project Page, Video

🗺️ Overview

We provide the model implementation of HandDGP, model weights trained on the FreiHAND dataset, and the code to reproduce our main paper results.

⚙️ Setup

We are going to create a new Mamba environment called handdgp. If you don't have Mamba, you can install it with:

make install-mamba

Then setup the environment with:

make mamba-env
mamba activate handdgp

In the code directory, install the repo as a pip package:

pip install -e .
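
To verify the environment, you can run a generic PyTorch check (this one-liner is just an illustration, not part of the repository's own scripts):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"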

Accept the licences of MobRecon and MANO, and then add the HandMesh repository to the third_party folder:

mkdir third_party
cd third_party
git clone https://github.com/SeanChenxy/HandMesh.git

📦 Trained Models

We provide the model trained on the FreiHAND dataset here. Download the model and place it in the weights directory. Alternatively, you can run the following bash script: scripts/download_weights.sh.
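
For example, from the repository root (assuming the script downloads the weights into the weights directory described above):

bash scripts/download_weights.sh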

💾 Data

Accept the licence and download the FreiHAND eval data from here, then extract it to data/freihand/FreiHAND_pub_v2_eval. Alternatively, you can run the following script: scripts/download_data.sh.
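
For example, from the repository root (assuming the script downloads and extracts the data into data/freihand as described above):

bash scripts/download_data.sh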

🌳 Folder structure

Please make sure to re-create this folder structure:

├── configs
├── data
│   ├── freihand
│   │   ├── FreiHAND_pub_v2_eval
├── scripts
├── outputs
│   ├── <experiment output folders>
├── src
│   ├── <source files>
├── weights
│   ├── <HandDGP weight files>
├── third_party
│   ├── HandMesh
├── LICENSE
├── Makefile
├── pyproject.toml
├── environment.yml
├── README.md
├── setup.py

🚀 Running HandDGP

To run HandDGP, please run the following command from the root folder:

python -m src.run --config_file configs/test_freihand.gin

This will generate an output file in the outputs directory with the test results on the FreiHAND dataset in JSON format, which you can use directly with the FreiHAND evaluation code.
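
As a quick sanity check before passing the file to the FreiHAND evaluation code, you can verify that it parses as JSON (the file name below is a placeholder; substitute the file actually written to outputs/):

python -m json.tool outputs/<results_file>.json > /dev/null && echo "valid JSON"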

🙏 Acknowledgements

We would like to thank the authors of the repositories and datasets referenced above (MobRecon / HandMesh, MANO, and FreiHAND) for their code, models, and datasets.

📜 Citation

If you find our work useful in your research, please consider citing our paper:

@inproceedings{handdgp2024,
  title={{HandDGP}: Camera-Space Hand Mesh Prediction with Differentiable Global Positioning},
  author={Valassakis, Eugene and Garcia-Hernando, Guillermo},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  year={2024},
}

👩‍⚖️ License

Copyright © Niantic, Inc. 2024. Patent Pending. All rights reserved. Please see the license file for terms.
