PPO-pytorch-Mujoco

An implementation of the PPO algorithm on MuJoCo environments such as Ant-v2, Humanoid-v2, Hopper-v2, and HalfCheetah-v2.

Requirements

  • Python 3.7.6
  • gym 0.17.6
  • mujoco_py 2.0.2.10
  • PyTorch
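
The dependencies can be installed with pip, for example as sketched below (this assumes a local MuJoCo binary is already installed, which mujoco_py requires; pin the versions listed above as needed):

$ pip install gym mujoco_py torch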

Usage

$ python main.py --env_name Hopper-v2
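
Other MuJoCo environments from the list above can be selected the same way, e.g. --env_name Ant-v2. The central piece of any PPO implementation is the clipped surrogate objective; below is a minimal PyTorch sketch of that loss, illustrative only and not necessarily the exact code in main.py (names such as clip_eps and value_coef are assumptions):

```python
import torch

def ppo_loss(new_log_probs, old_log_probs, advantages,
             values, returns, entropy,
             clip_eps=0.2, value_coef=0.5, entropy_coef=0.01):
    """Clipped PPO surrogate loss (illustrative sketch, not the repo's exact code)."""
    # Probability ratio between the updated policy and the behaviour policy.
    ratio = torch.exp(new_log_probs - old_log_probs)

    # Clipped surrogate objective: take the pessimistic minimum of the
    # unclipped and clipped terms, negated because we minimise.
    surr1 = ratio * advantages
    surr2 = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    policy_loss = -torch.min(surr1, surr2).mean()

    # Value-function regression towards the empirical returns.
    value_loss = (returns - values).pow(2).mean()

    # Entropy bonus encourages exploration.
    return policy_loss + value_coef * value_loss - entropy_coef * entropy.mean()
```

This loss is typically minimised with Adam over several epochs of minibatch updates on each collected batch of rollouts.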

Results

Result plots for Hopper-v2, Humanoid-v2, HalfCheetah-v2, and Ant-v2 are provided as figures in the repository.
