resuldagdanov / carla-rllib-integration

Integration of Ray RLlib into CARLA Autonomous Driving Simulator πŸš™

Original CARLA RLlib Integration Repository: rllib-integration

Installation Steps

conda create -n carla python=3.7

conda activate carla

pip install -r requirements.txt

Add the Required Environment Variables to ~/.bashrc

gedit ~/.bashrc

export DeFIX_PATH=PATH_TO_MAIN_DeFIX_REPO
export CARLA_ROOT=PATH_TO_CARLA_ROOT_SH

export SCENARIO_RUNNER_ROOT="${DeFIX_PATH}/scenario_runner"
export LEADERBOARD_ROOT="${DeFIX_PATH}/leaderboard"
export PYTHONPATH="${CARLA_ROOT}/PythonAPI/carla/":"${SCENARIO_RUNNER_ROOT}":"${LEADERBOARD_ROOT}":"${CARLA_ROOT}/PythonAPI/carla/dist/carla-0.9.11-py3.7-linux-x86_64.egg":${PYTHONPATH}

source ~/.bashrc
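Once the file is sourced, the CARLA egg and the scenario_runner/leaderboard packages become importable. For a single script, the same effect can be sketched at runtime by globbing for the egg instead of hard-coding the version tags in the path (a sketch, not the repository's code; the `/opt/carla` fallback is a placeholder):

```python
import glob
import os
import sys

# CARLA_ROOT is expected to come from the exports above;
# "/opt/carla" is only a placeholder fallback for illustration
CARLA_ROOT = os.environ.get("CARLA_ROOT", "/opt/carla")

# The CARLA distribution ships its Python API as an egg; globbing avoids
# hard-coding the 0.9.11 / py3.7 tags from the PYTHONPATH entry above
eggs = glob.glob(os.path.join(CARLA_ROOT, "PythonAPI/carla/dist/carla-*.egg"))
if eggs:
    sys.path.append(eggs[0])
```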

Provision Steps

  • Move the 'resnet50.zip' file to the directory: <DeFIX_PATH>/checkpoint/models/
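The step above can be sketched in Python; the directory layout follows the path given in the list, and DeFIX_PATH is read from the environment set up earlier (the `.` fallback is an assumption for illustration):

```python
import os
import shutil

# DeFIX_PATH comes from the ~/.bashrc exports; "." is a placeholder fallback
defix_path = os.environ.get("DeFIX_PATH", ".")
models_dir = os.path.join(defix_path, "checkpoint", "models")

# Create the target directory if it does not exist yet
os.makedirs(models_dir, exist_ok=True)

# Uncomment once resnet50.zip has been downloaded next to this script
# shutil.move("resnet50.zip", models_dir)
```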

Training

python3 dqn_train.py dqn_example/dqn_config.yaml --overwrite
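The command line above passes a positional YAML config plus an `--overwrite` flag. A minimal sketch of that interface with `argparse` (argument names are inferred from the command shown; the real `dqn_train.py` may differ):

```python
import argparse

# Sketch of the training CLI; the argument names here are assumptions
parser = argparse.ArgumentParser(
    description="Train a DQN agent in CARLA with Ray RLlib")
parser.add_argument("configuration_file",
                    help="experiment YAML, e.g. dqn_example/dqn_config.yaml")
parser.add_argument("--overwrite", action="store_true",
                    help="overwrite previous results of this experiment")

args = parser.parse_args(["dqn_example/dqn_config.yaml", "--overwrite"])
```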

About


License: MIT License


Languages

Python 98.1%, Shell 1.9%