
SOAR

Project page · arXiv

This is the official implementation for SOAR.


Getting started

SOAR requires Python 3.10 or newer.

  1. Clone the repository.

    git clone https://github.com/hangg7/soar.git --recursive
  2. Install general dependencies.

    cd soar
    pip install .

    It usually takes around 15 minutes to install all the dependencies. After installation, soar/threestudio-soar should be linked into submodules/threestudio/custom, and submodules/threestudio/outputs should be linked to outputs. If either link is missing, go to the root folder of the SOAR repo and run:

    ln -sf $(pwd)/soar/threestudio-soar $(pwd)/submodules/threestudio/custom/ 
    ln -sf $(pwd)/submodules/threestudio/outputs $(pwd)/outputs
  3. Register required models at ICON's website

    • SMPL: SMPL Model (Male, Female)
    • SMPL-X: SMPL-X Model, used for training
    • SMPLIFY: SMPL Model (Neutral)

    ⚠️ Click Register now on all dependencies; you can then download them all with ONE account.

  4. Download and unzip the required models.

    bash fetch_data.sh

    This will download and unzip the following files:

    data/
    ├── ckpt/
    │   ├── normal.ckpt
    │   └── sam_vit_h_4b8939.pth
    ├── smpl_related/
    │   ├── models/
    │   │   ├── smpl/
    │   │   │   ├── SMPL_{FEMALE,MALE,NEUTRAL}.pkl
    │   │   │   ├── smpl_kid_template.npy
    │   │   └── smplx/
    │   │       ├── SMPLX_{FEMALE,MALE,NEUTRAL}.npz
    │   │       ├── SMPLX_{FEMALE,MALE,NEUTRAL}.pkl
    │   │       ├── smplx_kid_template.npy
    │   │       └── version.txt
    │   └── smpl_data/
    │       ├── smpl_verts.npy
    │       ├── smplx_cmap.npy
    │       ├── smplx_faces.npy
    │       └── smplx_verts.npy
    └── tedra_data/
        ├── faces.txt
        ├── tetrahedrons.txt
        ├── tetgen_{male,female,neutral}_{adult,kid}_structure.npy
        ├── tetgen_{male,female,neutral}_{adult,kid}_vertices.npy
        ├── tetra_{male,female,neutral}_{adult,kid}_smpl.npz
        ├── tetrahedrons_{male,female,neutral}_{adult,kid}.txt
        └── vertices.txt
    
  5. (Optional) Download preprocessed demo data. You can quickly try out SOAR with preprocessed demo sequences, available on Google Drive; they are originally video clips from Pexels. Put the downloaded archives under the data/custom folder and unzip them.
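As a sanity check after fetch_data.sh, you can verify that the key files from the tree in step 4 landed where expected. The sketch below lists a few representative paths (not the full set) and demonstrates the check on a temporary tree; in practice, point `data_root` at ./data:

```python
import os
import tempfile

# A few representative files from the data/ tree above; extend as needed.
EXPECTED = [
    "ckpt/normal.ckpt",
    "ckpt/sam_vit_h_4b8939.pth",
    "smpl_related/models/smplx/SMPLX_NEUTRAL.npz",
    "smpl_related/smpl_data/smplx_faces.npy",
    "tedra_data/faces.txt",
]

def missing_files(data_root):
    """Return the expected files that are absent under data_root."""
    return [p for p in EXPECTED if not os.path.isfile(os.path.join(data_root, p))]

# Demo on a throwaway tree so the check is runnable anywhere.
with tempfile.TemporaryDirectory() as root:
    for p in EXPECTED:
        os.makedirs(os.path.join(root, os.path.dirname(p)), exist_ok=True)
        open(os.path.join(root, p), "w").close()
    print(missing_files(root))  # → []
```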

Training

bash ./scripts/run_${video}.sh

where ${video} is the name of the video you want to train on. The training script trains the model and saves checkpoints to the outputs/exp-id-${stage}-org/${video}/ckpts folder. Training typically takes about 40 minutes on an NVIDIA RTX A5000, though the duration varies with video length, resolution, and the number of epochs specified in the training script.

Testing

We provide a minimal set of code to run inference with the trained model, located at soar/threestudio-soar/test/. You can easily extend it to fit your needs.

To obtain the 360-degree rotation video, you can run the following command:

python soar/threestudio-soar/test/render_rot.py --seq_name ${video} --ckpt_path ${ckpt_path}

where ${video} is the name of the video you want to test on, and ${ckpt_path} is the path to the checkpoint to evaluate. The results will be saved in the outputs/test/${video} folder.
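Since ${ckpt_path} must point at a specific file under the ckpts folder, a small helper like the following can pick the newest checkpoint automatically. This is a workflow convenience, not part of the repo's code:

```python
import glob
import os

def latest_ckpt(ckpt_dir):
    """Return the most recently modified file in ckpt_dir, or None if empty."""
    ckpts = glob.glob(os.path.join(ckpt_dir, "*"))
    return max(ckpts, key=os.path.getmtime) if ckpts else None

# e.g. pass latest_ckpt("outputs/exp-id-.../ckpts") as --ckpt_path to render_rot.py
```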

Play on custom video

  1. Install OpenPose and SMPLify-X.

    Follow the instructions at OpenPose and SMPLify-X to install the required dependencies.

  2. Preprocess custom videos.

    You can play on your own video by running the following command:

    python preproc/preprocess_custom.py \
        --video-path ./data/custom/<VIDEO.MP4> \
        --data-root ./data/custom/ \
        --openpose-dir /path/to/openpose/ \
        --smplerx-dir /path/to/smplerx/

    Note that there is also a smoothing hyperparameter --smooth_weight, which can be used to smooth the SMPLer-X output. The default value is 10000. If this high value makes the output too smooth to be accurate, try lowering it to 100 or 0.

    Preprocessing takes around 30 minutes for 400 frames or for large 2K-4K images; for dance_0 it takes around 8 minutes.

    Preprocessed data is saved in the following structure:

    - VIDEO/
        - images/*.png
        - keypoints/*.json
        - masks/*.png
        - normal_B/*.png
        - normal_F/*.png
        - smplx/
            - params.pth
            - (SMPLify-X debugging videos)
        - video.mp4
    
  3. Set up the training script.

    You need to set up the training script by modifying ./scripts/run_CUSTOM.sh: change seq_name to the name of your video (the folder name in data/custom), and prompt to the prompt you want to use.

  4. Run training script.

    Run the following command to train the model on your custom video.

    bash ./scripts/run_CUSTOM.sh
    

    Some tunable hyperparameters are provided in the ./scripts/run_CUSTOM.sh script. You can change them for better performance on your custom video.
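The effect of the --smooth_weight flag from step 2 can be understood through a generic temporal smoother (an illustration, not the repo's actual implementation): minimize ||x - y||² + w·||Dx||², where y is the raw per-frame signal and D takes frame-to-frame differences. At w = 0 the output equals the input; a very large w collapses it toward a constant.

```python
import numpy as np

def smooth(y, weight):
    """Minimize ||x - y||^2 + weight * ||Dx||^2 via its normal equations."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)        # (n-1, n) finite-difference operator
    A = np.eye(n) + weight * D.T @ D
    return np.linalg.solve(A, y)

y = np.array([0.0, 1.0, 0.0, 1.0, 0.0])   # jittery per-frame signal
print(smooth(y, 0.0))       # weight 0 returns the input unchanged
print(smooth(y, 10000.0))   # a large weight flattens it toward the mean (0.4)
```

This is why lowering the weight from 10000 to 100 or 0 recovers more of the raw motion, at the cost of jitter.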

Status

This repository currently contains:

  • soar package, which contains the reference training and sampling implementation.
  • Setup instructions.
  • Training script.
  • Testing script. Novel pose rendering script in progress.
  • Dataset preprocessing script.

We are still working toward the cleanest and most efficient codebase for SOAR. Please stay tuned for more updates.

Acknowledgment

This implementation is built on GaussianDreamer, Gaussian Surfels, GART, ImageDream, ECON, and SMPLer-X. We thank the authors for their wonderful work.
