[Help Request] Observation
UnicornHJ opened this issue · comments
High Level Description
Hello.
I am currently using SMARTS for a related project, but have encountered several problems during development:
- I would like to adjust the vehicle observation: add masks for vehicles that have not yet appeared in the scene (but will suddenly appear in the future) and for vehicles that may appear later, or modify the observation range. How should I adjust the observation part?
- How should I run the baseline benchmarks, such as PPO and SAC? The official website does not explain how to run the baselines.
Thank you for your help.
Version
1.0
Operating System
Ubuntu 20.04
Hello @UnicornHJ, sorry, I cannot reply right now; we will reply tomorrow.
Thank you very much for your reply
Looking forward to your reply, thanks
Hello @UnicornHJ, the MARL benchmark is fairly old and has been moved out of SMARTS, since it was tied to `smarts==0.4.16`. We store a version of it at https://github.com/smarts-project/smarts-project.rl/tree/master/marl_benchmark.

- If you want to run different algorithms, you will need to call into the different configurations in the `marl_benchmark/agents` directory: `python3.8 run.py scenarios/sumo/intersections/4lane -f agents/maddpg/baseline-lane-control.yaml`
- The observation configuration is generated within `marl_benchmarks/agents/__init__.py::_make_rllib_config`, which really calls down through `FrameStack.get_observation_space` and `FrameStack.get_observation_adapter`. This configuration comes through here:
```yaml
agent:
  state:
    wrapper:
      name: FrameStack
      num_stack: 3
```
`FrameStack` is dynamically loaded from `marl_benchmark.wrappers.rllib.__init__.py`, so adding a wrapper there (or injecting the wrapper into `marl_benchmark.wrappers.rllib`'s module attributes) would allow selecting it from the yaml file.
- You would want such a wrapper to inherit from `marl_benchmark.wrappers.rllib.wrapper.Wrapper`.
- For reference of a `Wrapper` implementation: the `FrameStack` wrapper uses `CalObs` from `marl_benchmark/common.py`. The feature names match the `cal_*` methods on `CalObs` (e.g. `steering` in `agents/maddpg/baseline-lane-control.yaml` matches `CalObs.cal_steering`). For feature keys that have non-boolean values, the features are only added if the keys are included:
```yaml
agent:
  state:
    features:
      goal_relative_pos: True   # enable (bool)
      distance_to_center: True  # enable (bool)
      speed: True               # enable (bool)
      steering: True            # enable (bool)
      heading_errors: [20, continuous]  # num_waypoints (int), continuous/spaced (str)
      neighbor: 8               # num_closest_neighbors (int)
      lane_its_info: False      # enable intersection info (bool)
      img_gray: 256             # resize grayscale (int)
      proximity: False          # enable (bool)
```
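As a toy illustration of that naming convention: each enabled feature key in the yaml is resolved to a `cal_<feature>` method by name. The `ToyCalObs` class and `resolve_features` helper below are stand-ins I made up for illustration; the real methods live on `CalObs` in `marl_benchmark/common.py`.

```python
# Toy sketch of the feature-name convention described above: a yaml key such
# as "steering" resolves to a `cal_steering` method. This is NOT the real
# CalObs from marl_benchmark/common.py, just a minimal stand-in.

class ToyCalObs:
    """Minimal stand-in for marl_benchmark's CalObs (hypothetical)."""

    @staticmethod
    def cal_speed(env_obs):
        # The real method would read the ego vehicle state from env_obs.
        return env_obs["speed"]

    @staticmethod
    def cal_steering(env_obs):
        return env_obs["steering"]


def resolve_features(feature_config):
    """Map enabled yaml feature keys to their cal_* callables by name."""
    funcs = {}
    for name, enabled in feature_config.items():
        if not enabled:  # a False/0 value disables the feature
            continue
        func = getattr(ToyCalObs, f"cal_{name}", None)
        if func is None:
            raise KeyError(f"No cal_{name} method for feature '{name}'")
        funcs[name] = func
    return funcs


features = resolve_features({"speed": True, "steering": True})
obs = {"speed": 3.2, "steering": 0.1}
values = {name: fn(obs) for name, fn in features.items()}
```

The same `getattr`-by-convention lookup is why a new feature needs both a `cal_*` method and a matching entry in the observation space.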
- The `CalObs` features are mapped to the `SPACE_LIB` spaces; any change to one may need a change to the other.
- The configuration for the `observation_adaptor` and `observation_space` is generated within `_make_rllib_config`, as noted above.

If you wish the MARL benchmark to be modernized, you would need to put in a request for that.
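For what it is worth, a minimal sketch of the wrapper route described above, aimed at the masking question: the class name `MaskNeighbors`, the masking rule, and the stand-in base class are all my own invention. A real wrapper would subclass `marl_benchmark.wrappers.rllib.wrapper.Wrapper` and live in (or be injected into) `marl_benchmark.wrappers.rllib` so the yaml `wrapper: name:` field can select it.

```python
import math

# Hypothetical sketch of a wrapper that masks neighbor vehicles outside a
# visibility radius. In the real benchmark you would subclass
# marl_benchmark.wrappers.rllib.wrapper.Wrapper instead of this stand-in.

class Wrapper:  # stand-in for marl_benchmark.wrappers.rllib.wrapper.Wrapper
    def __init__(self, config):
        self.config = config


class MaskNeighbors(Wrapper):
    """Zero out neighbors farther than `radius` from the ego (hypothetical)."""

    def __init__(self, config, radius=50.0):
        super().__init__(config)
        self.radius = radius

    def observation(self, obs):
        ego = obs["ego_pos"]
        masked = []
        for n in obs["neighbors"]:
            if math.dist(ego, n["pos"]) <= self.radius:
                masked.append(n)
            else:
                # Mask: keep a placeholder entry so the tensor shape is fixed.
                masked.append({"pos": (0.0, 0.0), "speed": 0.0, "masked": True})
        return {**obs, "neighbors": masked}
```

Keeping a fixed-size placeholder (rather than dropping masked entries) matters if the downstream observation space expects a constant number of neighbor slots.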
Thank you very much for your reply.
I will try it.
Also, may I ask whether your email address can receive messages? I have just emailed you a few more questions!
Or could you please provide me with another e-mail address?
Hello, @UnicornHJ, I received your email. It looks like it went to spam. I will see if I can answer your questions.
Ego-centric observations (unfortunately) were not native to SMARTS aside from perhaps lidar. The camera observations are top down. So simulating vehicle occlusion would take a bit of work.
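If it is useful, here is a rough, purely illustrative sketch of the kind of post-processing that approximating occlusion would involve: with top-down positions, a neighbor could be treated as occluded when a closer vehicle sits near the ego-to-neighbor line of sight. The function names and the geometry threshold below are my own; none of this is SMARTS API.

```python
import math

# Hypothetical post-processing sketch: approximate occlusion from top-down
# observations by testing whether a closer vehicle sits near the line of
# sight from the ego to each neighbor.

def point_segment_distance(p, a, b):
    """Distance from 2D point p to segment ab."""
    ax, ay = a
    bx, by = b
    px, py = p
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    if ab2 == 0.0:
        return math.dist(p, a)
    # Clamp the projection of p onto ab to the segment.
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    return math.dist(p, (ax + t * abx, ay + t * aby))


def visible_neighbors(ego_pos, neighbor_positions, vehicle_half_width=1.0):
    """Return indices of neighbors not blocked by a closer neighbor."""
    visible = []
    for i, target in enumerate(neighbor_positions):
        blocked = False
        for j, other in enumerate(neighbor_positions):
            if j == i:
                continue
            closer = math.dist(ego_pos, other) < math.dist(ego_pos, target)
            near_los = (
                point_segment_distance(other, ego_pos, target) < vehicle_half_width
            )
            if closer and near_los:
                blocked = True
                break
        if not blocked:
            visible.append(i)
    return visible
```

A production version would need vehicle extents and headings rather than point positions, but the structure of the check would be similar.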
I can answer in more detail later by email if you are still interested.
Thank you very much, I am very interested in this area and look forward to your reply!