
AMRtotxt Inference on own data #15

Closed
LIU-FAYANG opened this issue Dec 8, 2022 · 2 comments

Comments

@LIU-FAYANG

Hi, I'm trying to use the inference script to run inference on my own AMR graphs, but it keeps producing the same predictions as for the data in the example folder. After I deleted the cache file inside the example folder, it seemed to work. Could you check whether the cache file is affecting inference?
Also, I'm confused about why the inference script needs train and validation datasets; could you explain this in more detail?
Thank you for your kind help!

@goodbai-nlp (Owner)

Hi @LIU-FAYANG,

Thanks for your comments. Here are point-by-point responses to your questions:

  1. The cache files exist to speed things up when the model is run more than once (e.g., training multiple times, or training plus evaluation). They serve no purpose when you run inference on your own AMR graphs a single time.
  2. If you want to overwrite the cache, pass the --overwrite_cache True flag to your inference script.
  3. Regarding the train/val datasets required by the inference script: you can point them at the files in the example directory (they will not be used). We will fix this bug within two weeks.
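The behavior described above is the usual dataset-caching pattern: if a cache file already exists, it is loaded as-is, so predictions reflect whatever data was cached, even after the input file changes. A minimal sketch of that pattern (not the repository's actual code; the function name and pickle-based cache are illustrative assumptions):

```python
import os
import pickle


def load_dataset(data_file, cache_file, overwrite_cache=False):
    """Load examples from data_file, caching the result in cache_file."""
    # If a cache exists and we are not overwriting it, the cached examples
    # are returned unchanged -- even if data_file has since been replaced.
    # This is why inference on new AMR graphs can silently reuse old data.
    if os.path.exists(cache_file) and not overwrite_cache:
        with open(cache_file, "rb") as f:
            return pickle.load(f)
    # Otherwise (re)build the examples from the raw data file and cache them.
    with open(data_file) as f:
        examples = [line.strip() for line in f if line.strip()]
    with open(cache_file, "wb") as f:
        pickle.dump(examples, f)
    return examples
```

Deleting the cache file, or passing a flag like --overwrite_cache True that sets overwrite_cache, forces the raw data to be re-read.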

@goodbai-nlp (Owner)

Fixed. The train/val datasets are no longer used by the inference script.
