
Inference #1

Closed
Rasipuram opened this issue Mar 11, 2020 · 7 comments

Comments

@Rasipuram

Dear Authors,

Thank you for providing the code.

I want to use the trained model to generate responses for new text. Is there a way to run inference on new text?

manzar96 commented Apr 2, 2020

Did you finally find a way?

zlinao commented Apr 3, 2020

Hi, sorry for the late reply. I added a simple interact script. Try the following command:
python3 interact.py --model experts --label_smoothing --noam --emb_dim 300 --hidden_dim 300 --hop 1 --heads 2 --topk 5 --cuda --pretrain_emb --softmax --basic_learner --save_path save/checkpoint
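
As a quick sanity check before running that command, you can confirm that the file you pass via --save_path actually exists. This is just a minimal sketch, not part of the repo; "save/checkpoint" is simply the path used in the command above.

import os

# Path passed to --save_path in the command above; adjust to your own setup.
ckpt_path = "save/checkpoint"

if os.path.isfile(ckpt_path):
    print("found checkpoint:", ckpt_path, "-", os.path.getsize(ckpt_path), "bytes")
else:
    print("no file at", ckpt_path, "- train the model first or fix --save_path")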

zlinao closed this as completed Apr 3, 2020
@Rasipuram

Thank you for sharing the code, but I got this error while running interact.py:

TypeError: 'Transformer_experts' object is not subscriptable

Requirements are installed.

Thanks and regards

zlinao commented Apr 6, 2020

Hi, can you share the full error message?

@Rasipuram

Please find the message below.

loading weights
Traceback (most recent call last):
  File "interact.py", line 69, in <module>
    model = Transformer_experts(vocab,decoder_number=program_number, model_file_path=config.save_path, is_eval=True)
  File "/home/ubuntu/Desktop/Sowmya/Conversation/MoEL_NO_DATA/model/transformer_mulexpert.py", line 363, in __init__
    self.encoder.load_state_dict(state['encoder_state_dict'])
TypeError: 'Transformer_experts' object is not subscriptable

Thanks and regards

zlinao commented Apr 6, 2020

What command did you run? Did you give the correct model checkpoint?

python3 interact.py --model experts --label_smoothing --noam --emb_dim 300 --hidden_dim 300 --hop 1 --heads 2 --topk 5 --cuda --pretrain_emb --softmax --basic_learner --save_path save/checkpoint
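
For anyone hitting the same TypeError: the state['encoder_state_dict'] lookup in the traceback expects the checkpoint to be a dict of state_dicts, so the "not subscriptable" message usually means the file at --save_path is something else (for example a whole pickled model object, or the wrong file entirely). Below is a minimal sketch for inspecting the file, assuming only that it was written with torch.save; "save/checkpoint" is again just the path from the command above.

import torch

# Point this at the file you pass via --save_path.
ckpt_path = "save/checkpoint"

# Load on CPU so the check also works on a machine without a GPU.
state = torch.load(ckpt_path, map_location="cpu")

if isinstance(state, dict):
    # A dict-style checkpoint should list keys such as 'encoder_state_dict'.
    print("checkpoint keys:", list(state.keys()))
else:
    # Anything else (e.g. a whole pickled model object) cannot be indexed with
    # state['encoder_state_dict'], which matches the TypeError above.
    print("loaded object of type", type(state).__name__, "- not a dict-style checkpoint")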

@Rasipuram

Yes. It worked. Thank you very much.
