Training Parking_her does not work. #392

Closed
JoshYank opened this issue Dec 5, 2022 · 7 comments

JoshYank commented Dec 5, 2022

Good afternoon. I was trying to train a policy for the parking-env to test against safety validation methods. When I ran the Colab notebook as-is, I got an error while creating the environment:

AttributeError: 'ParkingEnv' object has no attribute 'np_random'

This could be worked around by reinstalling highway-env or by installing older versions of gym and highway-env up front. After doing that, another error occurs when creating the model, before training:

TypeError: __init__() got an unexpected keyword argument 'create_eval_env'

Any insight into how to solve this problem would be much appreciated. My research focuses more on the verification side than on training or developing a controller to test, so I don't have much experience training controllers with RL.

[Screenshot: Training_Code_Error_Parking]

@HarshkumarBorad

Hello,

You can try pinning these versions of SB3:
stable-baselines3==1.6.1
sb3-contrib==1.6.1
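For example, in a Colab cell:

!pip install stable-baselines3==1.6.1 sb3-contrib==1.6.1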

Hope this helps you solve the issue.


JoshYank commented Dec 7, 2022

This fixed the error that occurred on the line that creates the model. Then the error below comes up when trying to run model.learn():

[Screenshot: Training_Code_Error_Parking_2]


JoshYank commented Dec 7, 2022

I solved the issue. You also have to pin:
!pip install gym==0.21.0
!pip install highway-env==1.5
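For reference, a minimal training sketch with those pinned versions, roughly following the highway-env parking example (SAC + HER); the hyperparameters below are illustrative, not the exact values from the notebook:

import gym
import highway_env  # noqa: F401 -- registers parking-v0
from stable_baselines3 import SAC, HerReplayBuffer

env = gym.make("parking-v0")

# Goal-conditioned SAC with hindsight experience replay (HER).
model = SAC(
    "MultiInputPolicy",
    env,
    replay_buffer_class=HerReplayBuffer,
    replay_buffer_kwargs=dict(
        n_sampled_goal=4,
        goal_selection_strategy="future",
        online_sampling=True,
        max_episode_length=100,
    ),
    verbose=1,
)
model.learn(total_timesteps=int(5e4))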

Working with the parking-env itself, how do we specify a desired goal position and an initial position/velocity for the agent?

@HarshkumarBorad

Hello,

For the goal position, you have to replace the expression lane.position(lane.length/2, 0) in parking-env with the desired goal position.
https://github.com/eleurent/highway-env/blob/793c8830ae5be9cbce70f55dfcb33d3c5ed762c3/highway_env/envs/parking_env.py#L153

For the initial position of the vehicle, you have to replace the parameter [i*20, 0] in parking-env with the desired initial position of the vehicle.
https://github.com/eleurent/highway-env/blob/793c8830ae5be9cbce70f55dfcb33d3c5ed762c3/highway_env/envs/parking_env.py#L148
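For orientation, the linked lines look roughly like this inside ParkingEnv._create_vehicles (paraphrased from that revision; follow the links above for the exact code):

# Initial vehicle position: the second argument, [i * 20, 0], is the [x, y] spawn point.
vehicle = self.action_type.vehicle_class(self.road, [i * 20, 0], 2 * np.pi * self.np_random.rand(), 0)

# Goal position: lane.position(lane.length / 2, 0) gives the [x, y] where the goal landmark is placed.
self.goal = Landmark(self.road, lane.position(lane.length / 2, 0), heading=lane.heading)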

Thanks and Regards,
Harshkumar Borad


JoshYank commented Dec 8, 2022

Is there any way to do this from the created environment rather than by changing the files? The idea is to model uncertainty in the locations within a single script and run multiple simulations.


eleurent commented Dec 8, 2022

Something you could do is manually override env.goal.position and env.vehicle.position after every call to env.reset().
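For example, a minimal sketch under the gym==0.21.0 / highway-env==1.5 setup from above (untested; attribute names follow highway-env's Vehicle and Landmark classes, speed may be called velocity in other versions, and the coordinates are illustrative):

import gym
import numpy as np
import highway_env  # noqa: F401 -- registers parking-v0

env = gym.make("parking-v0")
obs = env.reset()

# Override the randomly sampled goal and the ego start state right after reset().
env.goal.position = np.array([10.0, 4.0])    # desired goal [x, y]
env.vehicle.position = np.array([0.0, 0.0])  # desired initial [x, y]
env.vehicle.heading = 0.0                    # desired initial heading (rad)
env.vehicle.speed = 5.0                      # desired initial speed ('velocity' in some versions)

# The observation returned by reset() predates the override, so query it again.
obs = env.observation_type.observe()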


araffin commented Dec 23, 2022

@JoshYank in case you want to use the latest SB3 and highway-env versions, you can take a look at the instructions in DLR-RM/stable-baselines3#780.
