starcraft2-ai / python-env

Python environment of StarCraft 2 for NYU HPC

Python environment for StarCraft 2 Reinforcement Learning

Instructions

  1. Download (clone) this repository.
  2. Run cd s2py3 to enter the environment directory.
  3. Run source bin/activate (on a Linux machine) to activate the environment.
  4. Be happy.
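The steps above follow the standard Python virtualenv workflow. As a self-contained sketch (using a throwaway venv in /tmp rather than the repo's pre-built s2py3 directory, which is an assumption about its layout):

```shell
# Create a demo virtualenv with the same bin/activate layout as s2py3
python3 -m venv /tmp/s2py3_demo

# Activate it; this puts the environment's python first on PATH
source /tmp/s2py3_demo/bin/activate

# Commands now resolve inside the environment
command -v python

# Leave the environment when done
deactivate
```

In the repo itself, the same activation step is simply `cd s2py3 && source bin/activate`.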

About

Python environment of StarCraft 2 for NYU HPC

License: MIT License


Languages

Python 72.1%, C++ 27.3%, C 0.6%, Fortran 0.0%, CSS 0.0%, JavaScript 0.0%, Shell 0.0%, CMake 0.0%