two bugs #5

Closed
ExitPath opened this issue Aug 31, 2021 · 3 comments

@ExitPath

Hello, I am very interested in this work, but I found two bugs when running the code.

  1. There is no initialization of the parameter args.in_attn in workspace/transformer/main_cp.py (a possible local workaround is sketched after this list);
  2. The call self.transformer_encoder(pos_emb, attn_mask, emb_emotion=emo_embd) in workspace/transformer/models.py passes an emb_emotion argument that the encoder does not accept.
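For bug 1, a minimal local workaround until the repo is fixed is to declare the missing flag yourself; the action and default here are guesses, not values from the repo's main_cp.py:

```python
import argparse

parser = argparse.ArgumentParser()
# Assumed definition of the missing flag; main_cp.py never declares it,
# which is why reading args.in_attn fails at runtime.
parser.add_argument('--in_attn', action='store_true', default=False,
                    help='(assumed) enable the experimental in-attention variant')

args = parser.parse_args([])   # empty argv: fall back to the defaults
print(args.in_attn)            # False
```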
@annahung31
Owner

Hi,

Thanks for pointing that out.

The in_attn parameter was used for an experiment that changed the architecture of the encoder. However, the attempt didn't succeed, so we didn't use it in the paper. I've deleted the parameter and the alternative way of calling self.transformer_encoder (see the sketch below). Please pull the new version, thanks!
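For anyone on an older checkout, the failure mode of bug 2 and the fix can be mimicked with a stub that mirrors the encoder's fixed signature; this is a sketch, not the repo's actual models.py code:

```python
# Stub mirroring the encoder's signature after the fix: it accepts no
# emb_emotion keyword, so the old call fails exactly as bug 2 describes.
def transformer_encoder(pos_emb, attn_mask):
    return pos_emb

try:
    transformer_encoder("pos", "mask", emb_emotion="emo")   # old call, removed
except TypeError as e:
    print(e)   # ... got an unexpected keyword argument 'emb_emotion'

h = transformer_encoder("pos", "mask")                      # current, working call
```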

@ExitPath
Author

ExitPath commented Sep 1, 2021

Thanks for your reply, but the call self.transformer_encoder(pos_emb, attn_mask) in workspace/transformer/models.py still has a bug:

"h, layer_outputs = self.transformer_encoder(pos_emb, attn_mask) # y: b x s x d_model
ValueError: too many values to unpack (expected 2)",

which means that self.transformer_encoder() returns only one output instead of two.
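For reference, the wording of the error follows from how Python unpacks tensors: assigning one tensor to two names iterates its first (batch) dimension, so any batch size above two gives exactly "too many values to unpack (expected 2)". A minimal reproduction with illustrative shapes:

```python
import torch

out = torch.randn(4, 16, 256)      # one tensor, as a stock encoder returns
try:
    h, layer_outputs = out         # iterates dim 0: 4 values, 2 targets
except ValueError as e:
    print(e)                       # too many values to unpack (expected 2)
```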

Also, what version of fast_transformers are you using? (A way to check a local install is sketched below.)
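The maintainer's version is never stated in the thread, but one way to check a local install is via package metadata; "pytorch-fast-transformers" as the PyPI distribution name is an assumption here:

```python
# Print the installed version of the library imported as fast_transformers.
from importlib.metadata import version

print(version("pytorch-fast-transformers"))
```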

@annahung31
Owner

The additional output also came from my modified fast-transformers, so I've deleted it as well. Please pull the newest version again. Thanks for letting me know, and apologies for the inconvenience!
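A quick way to confirm the behavior after pulling: the stock (unmodified) fast-transformers encoder returns a single tensor, so the two-target unpacking should be gone. A sketch with made-up hyperparameters, assuming the library's standard builder API:

```python
import torch
from fast_transformers.builders import TransformerEncoderBuilder
from fast_transformers.masking import TriangularCausalMask

# Illustrative hyperparameters only; d_model = n_heads * query_dimensions.
encoder = TransformerEncoderBuilder.from_kwargs(
    n_layers=2,
    n_heads=4,
    query_dimensions=64,
    value_dimensions=64,
    feed_forward_dimensions=256,
    attention_type="full",
).get()

x = torch.randn(1, 16, 256)                 # b x s x d_model
mask = TriangularCausalMask(x.shape[1])
h = encoder(x, attn_mask=mask)              # single tensor, no layer_outputs
print(h.shape)                              # torch.Size([1, 16, 256])
```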
