
About running experiments #4

Closed
pprobst opened this issue Apr 20, 2021 · 1 comment

Comments


pprobst commented Apr 20, 2021

Hello.

I recently read your paper "Neural Network Heuristics for Classical Planning: A Study of Hyperparameter Space" (ECAI 2020) and tried to replicate some of your results with the code provided with the paper. However, a colleague and I ran into trouble building that version of Neural Fast Downward, and after some frustration, we ended up trying the most recent version of Neural Fast Downward available in this repository. Thankfully, this version of NFD built perfectly, and I was also able to run the PyTorch example successfully.

However, we're a bit lost, and it's still unclear to me how to replicate some of your results, at least with the current version of NFD. If it's not too much trouble, could you elaborate on how to replicate a small example, e.g., Blocks with varying hidden layers, as in Figure 3 of your paper?

I'm very new to Fast Downward and planning in general, so please excuse me if any of my questions seem obvious.

Author

pprobst commented May 3, 2021

After some effort and reading the new docs, we were able to do what we wanted, so I'm closing this issue.
Thanks for the updated documentation.

@pprobst pprobst closed this as completed May 3, 2021