`ray.rllib.agents.callbacks` has been deprecated. Use `ray.rllib.algorithms.callbacks` instead. #2
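A minimal sketch of the change the deprecation notice asks for, assuming the only fix needed is the import path. The helper name `load_default_callbacks` and the version fallback are my own illustration, not part of the project:

```python
def load_default_callbacks():
    """Import DefaultCallbacks from its new home in Ray >= 2.x, falling
    back to the legacy path on older Ray versions (hypothetical helper)."""
    try:
        # New location, per the deprecation notice:
        from ray.rllib.algorithms.callbacks import DefaultCallbacks
    except ImportError:
        # Legacy location, removed in newer Ray releases:
        from ray.rllib.agents.callbacks import DefaultCallbacks
    return DefaultCallbacks
```

In most cases the plain one-line change `from ray.rllib.algorithms.callbacks import DefaultCallbacks` is all that is required; the fallback only matters if the code must also run on pre-2.x Ray.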
When I make the above change in the `utils.rllib.py` file, the following errors pop up the next time I try to run the pre-trained model:

```
(savera) adnan@adnan:~/ws/savera/LearnToMoveUR3$ python main.py --env-id reach --load-from pretrained_models/reach --test
```

Each of the four rollout workers then fails the same way (pids 18987, 18988, 18989 and 18990 on ip 10.7.19.227), e.g.:

```
During handling of the above exception, another exception occurred:
ray::RolloutWorker.__init__() (pid=18987, ip=10.7.19.227, repr=<ray.rllib.evaluation.rollout_worker.RolloutWorker object at 0x7fd497ec3150>)
```

For every worker, Ray's environment checker prints the same (truncated) notice:

```
The above error has been found in your environment! We've added a module for checking your custom environments. It may cause your experiment to fail if your environment is not set up correctly. You can disable this behavior via calling ...
Learn more about the most important changes here: ...
In order to fix this problem, do the following:
For your custom (single agent) gym.Env classes: ...
For your custom RLlib ...
```

The output ends with:

```
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
```
![screenshot](https://user-images.githubusercontent.com/118107900/229572253-90f571ad-cea9-4949-80e7-06e745f1f3bc.png)
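The "For your custom (single agent) gym.Env classes" hint in the checker notice refers to the Gym API change that Ray 2.x enforces: `reset()` must accept `seed`/`options` keyword arguments and return `(observation, info)`, and `step()` must return a five-tuple. A hedged sketch of those signatures (the class name and observation layout are invented for illustration; the real project's env differs):

```python
import random

class ReachEnvSketch:
    """Hypothetical stand-in for a custom env, showing the method
    signatures the new Gym API (and Ray 2.x's env checker) expects."""

    def reset(self, *, seed=None, options=None):
        # New API: reset() takes seed/options and returns (observation, info).
        if seed is not None:
            random.seed(seed)
        observation = [0.0, 0.0, 0.0]
        return observation, {}

    def step(self, action):
        # New API: step() returns a 5-tuple
        # (observation, reward, terminated, truncated, info)
        # instead of the old 4-tuple with a single `done` flag.
        observation = [random.random() for _ in range(3)]
        reward = -1.0
        terminated = False
        truncated = False
        return observation, reward, terminated, truncated, {}
```

Alternatively, the truncated notice says the checker can be switched off; in Ray 2.3 that setting is, I believe, `disable_env_checking=True` in the algorithm config, though updating the env API is the more robust route.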
Hi, I have installed all the libraries following the README file (Ray==2.3.1), but the following error appears. Can you please share which version of Ray you used, or guide me on what I am doing wrong here?