
dependency on multiworld even when not using it #60

Closed
dwiel opened this issue Jun 12, 2019 · 4 comments
dwiel (Contributor) commented Jun 12, 2019

I am using a custom environment, yet HER seems to depend on multiworld anyway. The documentation says this shouldn't be necessary. Here is the stack trace:

    from rlkit.samplers.data_collector import GoalConditionedPathCollector
  File "/root/src/sandbox/rlkit/rlkit/samplers/data_collector/__init__.py", line 6, in <module>
    from rlkit.samplers.data_collector.path_collector import (
  File "/root/src/sandbox/rlkit/rlkit/samplers/data_collector/path_collector.py", line 3, in <module>
    from rlkit.envs.vae_wrapper import VAEWrappedEnv
  File "/root/src/sandbox/rlkit/rlkit/envs/vae_wrapper.py", line 11, in <module>
    from multiworld.core.multitask_env import MultitaskEnv
ModuleNotFoundError: No module named 'multiworld.core'
vitchyr (Collaborator) commented Jun 12, 2019

Good point. Hmmm, I'm not sure I want to remove the dependence on MultitaskEnv, since that defines the interface. The env_util functions could also be copy/pasted from multiworld, though that seems a bit redundant.

If you don't want to install multiworld, you can copy/paste the code from here; otherwise, I'm open to suggestions for other ways to fix this.
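One common way to keep multiworld optional (a sketch of the general pattern, not necessarily the fix rlkit adopted; the class and module names mirror the traceback but are illustrative) is to guard the import and fail only when the wrapper is actually used:

```python
# Sketch: make an optional dependency soft by guarding the import.
# Importing this module succeeds without multiworld; only constructing
# the wrapper requires it. Names are illustrative.
try:
    from multiworld.core.multitask_env import MultitaskEnv
except ImportError:
    MultitaskEnv = None  # multiworld not installed; wrapper unavailable


class VAEWrappedEnv:
    """Wrapper that needs multiworld only when actually instantiated."""

    def __init__(self, wrapped_env):
        if MultitaskEnv is None:
            raise ImportError(
                "VAEWrappedEnv requires multiworld; install it or "
                "avoid this wrapper."
            )
        self.wrapped_env = wrapped_env
```

With this guard, `from rlkit.samplers.data_collector import GoalConditionedPathCollector` would no longer crash at import time for users who never touch the VAE wrapper.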

dwiel (Contributor, Author) commented Jun 12, 2019 via email

CarloLucibello commented:
multiworld is needed to run the DQN example on CartPole as well, which is not ideal:

$ python examples/dqn_and_double_dqn.py 
Traceback (most recent call last):
  File "examples/dqn_and_double_dqn.py", line 12, in <module>
    from rlkit.torch.dqn.dqn import DQNTrainer
  File "/home/carlo/Git/rlkit/rlkit/torch/dqn/dqn.py", line 10, in <module>
    from rlkit.torch.torch_rl_algorithm import TorchTrainer
  File "/home/carlo/Git/rlkit/rlkit/torch/torch_rl_algorithm.py", line 7, in <module>
    from rlkit.core.batch_rl_algorithm import BatchRLAlgorithm
  File "/home/carlo/Git/rlkit/rlkit/core/batch_rl_algorithm.py", line 4, in <module>
    from rlkit.core.rl_algorithm import BaseRLAlgorithm
  File "/home/carlo/Git/rlkit/rlkit/core/rl_algorithm.py", line 8, in <module>
    from rlkit.samplers.data_collector import DataCollector
  File "/home/carlo/Git/rlkit/rlkit/samplers/data_collector/__init__.py", line 6, in <module>
    from rlkit.samplers.data_collector.path_collector import (
  File "/home/carlo/Git/rlkit/rlkit/samplers/data_collector/path_collector.py", line 3, in <module>
    from rlkit.envs.vae_wrapper import VAEWrappedEnv
  File "/home/carlo/Git/rlkit/rlkit/envs/vae_wrapper.py", line 11, in <module>
    from multiworld.core.multitask_env import MultitaskEnv
ModuleNotFoundError: No module named 'multiworld'
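The traceback shows the problem is transitive: importing `DQNTrainer` eventually loads `rlkit/envs/vae_wrapper.py`, whose top-level `from multiworld.core.multitask_env import MultitaskEnv` fails. One way to break such a chain (a sketch with illustrative names, not the actual committed fix) is to move the heavy import inside the code path that needs it:

```python
# Sketch: break a transitive import chain with a local (deferred) import.
# Merely importing this module never touches multiworld; only code that
# constructs the wrapper pays the cost. Names are illustrative.
class VAEWrappedEnv:
    def __init__(self, wrapped_env):
        # Deferred import: raises only if this wrapper is actually used
        # without multiworld installed.
        from multiworld.core.multitask_env import MultitaskEnv

        if not isinstance(wrapped_env, MultitaskEnv):
            raise TypeError("wrapped_env must be a MultitaskEnv")
        self.wrapped_env = wrapped_env
```

Either approach lets `examples/dqn_and_double_dqn.py` run without multiworld installed.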

vitchyr (Collaborator) commented Jun 25, 2019

Addressed with 60d1ab8. Thanks for the feedback!

vitchyr closed this as completed Jun 25, 2019