# ppo_pytorch

A PPO (Proximal Policy Optimization) implementation in PyTorch.

## Prepare a virtual environment using Anaconda

```bash
$ conda create -n pytorch python=3.8 anaconda
$ conda activate pytorch
$ conda install pytorch torchvision cudatoolkit=10.2 -c pytorch
$ pip install gym
```
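To confirm the environment is usable before training, a quick check along these lines can be run. This is only a sketch; it assumes the classic `gym` step/reset API and a gym version that still registers `Pendulum-v0`:

```python
# Sanity check: PyTorch, CUDA availability, and the Pendulum-v0 environment.
import torch
import gym

print(torch.__version__, torch.cuda.is_available())  # CUDA should be True on a GPU machine

env = gym.make("Pendulum-v0")
obs = env.reset()
obs, reward, done, info = env.step(env.action_space.sample())
print(obs.shape, reward, done)
env.close()
```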

## Train Pendulum-v0

```bash
$ python train_ppo.py
```
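For reference, the clipped surrogate objective at the core of PPO looks roughly like the sketch below. This is a generic illustration, not the exact loss in `train_ppo.py`, which may also include value-function and entropy terms with their own coefficients:

```python
import torch

def ppo_clip_loss(log_probs_new, log_probs_old, advantages, clip_eps=0.2):
    # Probability ratio r_t = pi_new(a|s) / pi_old(a|s), computed from log-probabilities.
    ratio = torch.exp(log_probs_new - log_probs_old)
    # Unclipped and clipped surrogate terms; PPO takes the element-wise minimum.
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # Negative sign because optimizers minimize, while PPO maximizes the surrogate.
    return -torch.min(unclipped, clipped).mean()
```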