Commit

Merge pull request #4 from PPierzc/main

HRNet Update

PPierzc authored Jul 26, 2022
2 parents 1ea9683 + 9daab71 commit 6634db8
Showing 52 changed files with 1,328 additions and 3,283 deletions.
5 changes: 4 additions & 1 deletion Dockerfile
@@ -35,7 +35,10 @@ RUN python -m pip install --no-cache-dir nflows\
imageio-ffmpeg\
brax\
wandb\
neuralpredictors
neuralpredictors\
yacs

RUN pip install --upgrade pillow

RUN pip install git+https://github.com/sinzlab/neuralpredictors.git
RUN pip install torch-scatter -f https://data.pyg.org/whl/torch-1.9.0+cu111.html
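
A quick sanity check for the dependencies added in this commit (a sketch, not part of the Dockerfile; it assumes a container built from this image and only verifies that the packages import):

```python
# Run inside the built container to confirm the newly added dependencies resolve.
from yacs.config import CfgNode  # yacs was added in this commit (used for config files)
import torch_scatter             # installed from the torch-1.9.0+cu111 wheel index
import PIL                       # pillow was upgraded above

print("yacs OK:", isinstance(CfgNode(), CfgNode))
print("torch-scatter:", torch_scatter.__version__)
print("pillow:", PIL.__version__)
```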
39 changes: 19 additions & 20 deletions README.md
@@ -20,8 +20,22 @@ from propose.models.flows import CondGraphFlow
flow = CondGraphFlow.from_pretrained('ppierzc/cgnf/cgnf_human36m:best')
```

## Reproducing results
#### HRNet Loading
You can also load a pretrained HRNet model.
```python
from propose.models.detectors import HRNet

hrnet = HRNet.from_pretrained('ppierzc/cgnf/hrnet:v0')
```
This loads the HRNet model from the [official repository](https://github.com/leoxiaobin/deep-high-resolution-net.pytorch).
The checkpoint loaded here is `pose_hrnet_w32_256x256`, trained on the MPII dataset.
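
A rough usage sketch (the forward signature and output shape below are assumptions based on the `pose_hrnet_w32_256x256` configuration, not taken from the repository):

```python
import torch
from propose.models.detectors import HRNet

hrnet = HRNet.from_pretrained('ppierzc/cgnf/hrnet:v0')
hrnet.eval()

# Hypothetical input: one RGB crop at the 256x256 resolution the checkpoint was trained on.
crop = torch.randn(1, 3, 256, 256)

with torch.no_grad():
    heatmaps = hrnet(crop)  # assumed output: (1, 16, 64, 64), one heatmap per MPII joint

print(heatmaps.shape)
```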

### Requirements
#### Requirements for the package
The package requirements are listed in [requirements.txt](/requirements.txt).

#### Docker
Alternatively, you can use [Docker](https://www.docker.com/) to run the package.
This project requires that you have the following installed:
- `docker`
- `docker-compose`
@@ -40,15 +54,10 @@ docker pull sinzlab/pytorch:v3.9-torch1.9.0-cuda11.1-dj0.12.7
5. You can now open JupyterLab in your browser at [`http://localhost:10101`](http://localhost:10101).

#### Available Models
| Model Name | description | Artifact path |
| --- |--------------------------------------------------------------------|---------------------------------|
| cGNF Human 3.6m | Model trained on the Human 3.6M dataset with MPII input keypoints. | ```ppierzc/cgnf/cgnf_human36m:best``` |

### Run Evaluation
You can run the evaluation script with the following command:
```
docker-compose run eval --human36m --experiment=cgnf_human36m
```
| Model Name | Description | Artifact Path | Import Code |
| --- | --- | --- | --- |
| cGNF Human 3.6m | Model trained on the Human 3.6M dataset with MPII input keypoints. | ```ppierzc/cgnf/cgnf_human36m:best``` | ```from propose.models.flows import CondGraphFlow``` |
| HRNet | Instance of the [official](https://github.com/leoxiaobin/deep-high-resolution-net.pytorch) HRNet model (w32, 256x256 input) trained on the MPII dataset. | ```ppierzc/cgnf/hrnet:v0``` | ```from propose.models.detectors import HRNet``` |
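
Each model is loaded the same way: use the import from the last column and pass the artifact path to `from_pretrained`, as in the examples above.

```python
from propose.models.flows import CondGraphFlow
from propose.models.detectors import HRNet

# Artifact paths taken from the table above.
flow = CondGraphFlow.from_pretrained('ppierzc/cgnf/cgnf_human36m:best')
hrnet = HRNet.from_pretrained('ppierzc/cgnf/hrnet:v0')
```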

### Run Tests
To run the tests, from the root directory call:
@@ -61,17 +70,7 @@ docker-compose run pytest tests
## Data
### Rat7m
You can download the Rat 7M dataset from [here](https://figshare.com/collections/Rat_7M/5295370).
To preprocess the dataset, run the following command.
```
docker-compose run preprocess --rat7m
```

### Human3.6M dataset
Due to license restrictions, the dataset is not included in the repository.
You can download it from the official [website](http://vision.imar.ro/human3.6m).

Download the *D3 Positions mono* by subject and place them into the `data/human36m/raw` directory.
Then run the following command.
```
docker-compose run preprocess --human36m
```
2 changes: 0 additions & 2 deletions data/human36m/README.md

This file was deleted.

3 changes: 0 additions & 3 deletions data/human36m/raw/README.md

This file was deleted.

3 changes: 0 additions & 3 deletions data/human36m/test/README.md

This file was deleted.

46 changes: 0 additions & 46 deletions docker-compose.yml
@@ -27,52 +27,6 @@ services:
- ./scripts:/scripts
- ./data:/data

python:
&python
image: propose
entrypoint: [ "python" ]

train:
image: propose
volumes:
- .:/src/propose
- ./scripts:/scripts
- ./data:/data
- ./experiments:/experiments
env_file:
- .env
entrypoint: [ "python", "/scripts/train.py" ]

eval:
image: propose
volumes:
- .:/src/propose
- ./scripts:/scripts
- ./data:/data
- ./experiments:/experiments
env_file:
- .env
entrypoint: [ "python", "/scripts/eval.py" ]

sweep:
image: propose
volumes:
- .:/src/propose
- ./scripts:/scripts
- ./data:/data
- ./sweeps:/sweeps
env_file:
- .env
entrypoint: [ "python", "/scripts/sweep.py" ]

preprocess:
image: propose
volumes:
- .:/src/propose
- ./scripts:/scripts
- ./data:/data
entrypoint: [ "python", "/scripts/preprocess.py" ]

pytest:
<<: *common
volumes:
47 changes: 0 additions & 47 deletions experiments/human36m/mpii-dev.yaml

This file was deleted.

45 changes: 0 additions & 45 deletions experiments/human36m/mpii-prod-large.yaml

This file was deleted.

46 changes: 0 additions & 46 deletions experiments/human36m/mpii-prod-multi-sample.yaml

This file was deleted.

45 changes: 0 additions & 45 deletions experiments/human36m/mpii-prod-var.yaml

This file was deleted.

45 changes: 0 additions & 45 deletions experiments/human36m/mpii-prod-xlarge.yaml

This file was deleted.
