Repository for competitive multi-agent environments

Competitive Multi-Agent Environments

This repository contains the environments for the paper Emergent Complexity via Multi-agent Competition.

Dependencies

Use pip install -r requirements.txt to install dependencies. If you haven't used MuJoCo before, please refer to the installation guide. The code has been tested with the dependencies pinned in requirements.txt.

Installing Package

After installing all dependencies, make sure gym works with support for MuJoCo environments. Next, install the gym-compete package:

cd gym-compete
pip install -e .

Check that the install succeeded by leaving the gym-compete directory and running import gym_compete in a Python console. Some users might need to run the install with sudo.
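That check can also be scripted; a minimal sketch using only the standard library (the package name gym_compete matches the import above):

```python
import importlib.util

def package_installed(name):
    """Return True if `name` can be imported in the current environment."""
    return importlib.util.find_spec(name) is not None

if __name__ == "__main__":
    if package_installed("gym_compete"):
        print("gym_compete is importable")
    else:
        print("gym_compete not found; re-run pip install -e . in gym-compete/")
```

find_spec only probes the import machinery, so the script reports a missing package without raising an ImportError itself.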

Trying the environments

Agent policies are provided for the various environments in folder agent-zoo. To see a demo of all the environments do:

bash demo_tasks.sh all

To instead try a single environment use:

bash demo_tasks.sh <task>

where <task> is one of: run-to-goal-humans, run-to-goal-ants, you-shall-not-pass, sumo-ants, sumo-humans, and kick-and-defend.
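If you wrap demo_tasks.sh in your own scripts, the task names above can be validated before launching anything; a minimal sketch (the demo_command helper is hypothetical, not part of this repository):

```python
# Task names accepted by demo_tasks.sh, as listed above; "all" runs every demo.
VALID_TASKS = {
    "run-to-goal-humans", "run-to-goal-ants", "you-shall-not-pass",
    "sumo-ants", "sumo-humans", "kick-and-defend",
}

def demo_command(task):
    """Build the demo invocation string, rejecting unknown task names early."""
    if task != "all" and task not in VALID_TASKS:
        raise ValueError(
            f"unknown task {task!r}; expected 'all' or one of {sorted(VALID_TASKS)}"
        )
    return f"bash demo_tasks.sh {task}"
```

Failing fast on a typo here is cheaper than waiting for the shell script and MuJoCo to start up before erroring out.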
