
Extracting Embedding #10

Closed
funihang opened this issue Feb 26, 2022 · 1 comment


funihang commented Feb 26, 2022

Hello,

I found a similar issue (#5), but when I try to extract the embedding following your instructions, the dimensions of transformer_output do not seem to match the input. transformer_output has shape [micro-batch-size, max sequence length, hidden size], so how does this output map back to each input? In the BERT model, the outputs are output1 and output2, where output2 is the embedding that corresponds to each input. In your code, however, output2 is None. How can we obtain the embedding corresponding to each input?

Thanks
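
For context, a common way to collapse a [micro-batch-size, max sequence length, hidden size] transformer output into one embedding per input, when no pooled output is returned, is masked mean pooling over the sequence dimension. A minimal PyTorch sketch; all names, shapes, and the mask here are illustrative, not taken from the repository:

```python
import torch

# Illustrative shapes matching the issue: the model returns a single tensor
# of shape [micro_batch_size, max_seq_length, hidden_size] and no pooled output.
micro_batch_size, max_seq_length, hidden_size = 4, 128, 768
transformer_output = torch.randn(micro_batch_size, max_seq_length, hidden_size)

# Hypothetical attention mask: 1 for real tokens, 0 for padding.
attention_mask = torch.zeros(micro_batch_size, max_seq_length, dtype=torch.long)
attention_mask[:, :32] = 1

# Masked mean pooling over the sequence dimension yields one embedding
# per input sequence, of shape [micro_batch_size, hidden_size].
mask = attention_mask.unsqueeze(-1).float()
embeddings = (transformer_output * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(embeddings.shape)  # torch.Size([4, 768])
```

Alternatives include taking the hidden state of the first token (the usual [CLS]-style pooling) instead of the masked mean; which is appropriate depends on how the model was trained.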

@Yijia-Xiao (Collaborator) commented

Hello @funihang ,

Sorry for the late reply.

Thanks for bringing this up. I cannot find a variable named output2 in the transformer.py file, so I am not sure which variable you are referring to. Could you please specify?


Best,
Yijia
