Categorization + Parallelization + Code cleaning #5
Conversation
avv-va commented on Jun 14, 2024 (edited)
- Introduce the concepts of TeamType and AgentPerformance for training/evaluating the primary agent alongside teammates with a given level of performance
- Enable parallelization when generating the FCP agent population
- Clean up the code
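Since the self-play runs that make up an FCP population are independent of one another, they are a natural fit for process-level parallelism. The sketch below illustrates the idea with Python's standard `concurrent.futures`; `train_sp_agent` is a hypothetical stand-in for this repo's actual trainer, and the checkpoint path layout is assumed, not taken from the code.

```python
from concurrent.futures import ProcessPoolExecutor

def train_sp_agent(seed: int) -> str:
    """Hypothetical stand-in for training one self-play agent.

    The real trainer in this repo presumably takes a seed (and more
    config), runs training, and saves a checkpoint; here we just
    return the assumed checkpoint path.
    """
    return f"agent_models/fcp_pop/sp_seed_{seed}"

def build_fcp_population(seeds, max_workers=4):
    """Train one self-play agent per seed, in parallel processes.

    Each run is independent, so a process pool keeps the CPUs busy
    without any coordination between runs.
    """
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(train_sp_agent, seeds))
```

For CPU-bound RL training, processes (rather than threads) are the right unit of parallelism in CPython because of the GIL; the actual scripts may instead launch separate training jobs per seed, which achieves the same effect.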
scripts/run_overcooked_game.py
Outdated
- tm1 = load_agent(Path('/home/ava/Research/Codes/MHRI/multiHRI/agent_models/sp_det'), args)
- tm2 = load_agent(Path('/home/ava/Research/Codes/MHRI/multiHRI/agent_models/sp_det'), args)
+ tm1 = load_agent(Path('/home/ava/Desktop/sp_det'), args)
+ tm2 = load_agent(Path('/home/ava/Desktop/sp_det'), args)
Could we store our trained agents in a specific folder under multiHRI?
(It would make it easier for us to work across different computers.)
@Ttopiac Yeah, but this is the code for loading the agents, not saving them. The code for saving the agents already writes them to multiHRI/agent_models.
If you want to load your agents from the default path, you can change the path here to agent_models/specific_file_path. Here I was testing some things that I did not want to put in the agent_models folder, which is why it uses an absolute path.
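One way to keep such paths portable across machines is to resolve them relative to the repository root rather than hard-coding `/home/ava/...`. The sketch below assumes the script lives one directory below the repo root (as `scripts/run_overcooked_game.py` does) and that checkpoints sit under `agent_models`; both are assumptions about the layout, not taken from the code.

```python
from pathlib import Path

# Resolve the repo root from this file's own location, so the same
# code works on any machine regardless of where the repo is cloned.
# Assumes this script sits one level below the root (e.g. scripts/).
REPO_ROOT = Path(__file__).resolve().parent.parent
DEFAULT_AGENT_DIR = REPO_ROOT / "agent_models"

def agent_path(name: str) -> Path:
    """Build a checkpoint path under the repo's agent_models folder.

    e.g. agent_path("sp_det") -> <repo_root>/agent_models/sp_det
    """
    return DEFAULT_AGENT_DIR / name
```

With a helper like this, the load calls become `load_agent(agent_path('sp_det'), args)`, and one-off local experiments can still pass an absolute path explicitly.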