
MuZero PyTorch

Implementation of DeepMind's Mastering Atari, Go, Chess and Shogi by Planning with a Learned Model (MuZero), applied to the CartPole-v0 environment.

MuZero + naive tree search is working.
MuZero + Monte Carlo tree search (MCTS) is now working.

Planned improvements: more tricks/hacks for more stable MCTS training.
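
For context, MuZero-style MCTS typically selects actions inside the search tree with the pUCT rule from the paper. Below is a minimal sketch of that selection step under assumed node fields (`prior`, `visit_count`, `value_sum`, `children`); these names are illustrative, not the classes actually defined in this repository.

```python
import math

# Exploration constants from the MuZero paper (illustrative defaults).
C1, C2 = 1.25, 19652.0


class Node:
    def __init__(self, prior):
        self.prior = prior          # policy prior p(a|s) from the prediction network
        self.visit_count = 0
        self.value_sum = 0.0
        self.children = {}          # action -> Node

    def value(self):
        return self.value_sum / self.visit_count if self.visit_count else 0.0


def select_child(node):
    """Return the (action, child) pair maximizing the pUCT score."""
    def puct(child):
        # Exploration bonus grows with the parent's visit count and the child's
        # prior, and shrinks as the child itself gets visited.
        pb_c = math.log((node.visit_count + C2 + 1) / C2) + C1
        pb_c *= math.sqrt(node.visit_count) / (child.visit_count + 1)
        return child.value() + pb_c * child.prior

    return max(node.children.items(), key=lambda item: puct(item[1]))
```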

MCTS results

(Figure: training_mcts, the MCTS training curve)

Naive tree search results

Fully expand the tree to depth n, then find the node with the maximum discounted value (plus the discounted rewards accumulated along its path).
Take the first action on the path from the root to that node.

(Figures: cartpole_naive_tree_search and training_naive_tree_search, the naive tree search training curves)
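
For reference, here is a minimal sketch of that naive search. It assumes hypothetical helpers `model.step(state, action) -> (next_state, reward)` and `model.value(state)` standing in for the learned dynamics and prediction networks; this repository's actual interfaces and discount factor may differ.

```python
def naive_tree_search(model, root_state, actions, depth, discount=0.99):
    """Fully expand the tree to `depth`, score each leaf by its discounted
    value plus the discounted rewards along its path, and return the first
    action on the best path. `discount` is an illustrative value."""
    best_return, best_first_action = float("-inf"), None

    def expand(state, d, cumulative_reward, first_action):
        nonlocal best_return, best_first_action
        if d == depth:
            # Leaf score: rewards collected on the way plus the bootstrapped value.
            total = cumulative_reward + (discount ** d) * model.value(state)
            if total > best_return:
                best_return, best_first_action = total, first_action
            return
        for action in actions:
            next_state, reward = model.step(state, action)
            expand(
                next_state,
                d + 1,
                cumulative_reward + (discount ** d) * reward,
                first_action if first_action is not None else action,
            )

    expand(root_state, 0, 0.0, None)
    return best_first_action
```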
