
fix get_output_dim() for ElmoTokenEmbedder #2234

Merged: 8 commits merged into allenai:master on Dec 28, 2018

Conversation

@tingkai-zhang (Contributor) commented Dec 23, 2018:

  1. Fix get_output_dim() of ElmoTokenEmbedder with projection
  2. Add a test of get_output_dim() of ElmoTokenEmbedder with projection

#2227

Tingkai Zhang added 2 commits Dec 23, 2018:

  - add test of get_output_dim()
  - add test of get_output_dim() in test_forward_works_with_projection_layer()
@tingkai-zhang (Contributor, Author) left a comment:

fix get_output_dim()

```diff
@@ -69,7 +69,7 @@ def __init__(self,
         self._projection = None

     def get_output_dim(self):
-        return self._elmo.get_output_dim()
+        return self._projection.out_features
```

@jihunchoi commented Dec 23, 2018:

`self._projection` might be None. I think something similar to what `Embedding` does would be nice:

```python
self.output_dim = projection_dim or embedding_dim

@overrides
def get_output_dim(self) -> int:
    return self.output_dim
```
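The suggested pattern can be sketched in isolation as follows. This is a minimal stand-in, not the actual `ElmoTokenEmbedder` (which wraps an `Elmo` module); the class name and constructor parameters here are hypothetical, chosen to mirror `Embedding`'s `projection_dim or embedding_dim` convention:

```python
import torch

class SimpleProjectedEmbedder(torch.nn.Module):
    """Minimal sketch of the fix: record the output dimension once in
    __init__, so get_output_dim() never dereferences a possibly-None
    projection layer."""

    def __init__(self, embedding_dim: int, projection_dim: int = None) -> None:
        super().__init__()
        if projection_dim:
            self._projection = torch.nn.Linear(embedding_dim, projection_dim)
        else:
            self._projection = None
        # `projection_dim or embedding_dim` covers both cases up front.
        self.output_dim = projection_dim or embedding_dim

    def get_output_dim(self) -> int:
        return self.output_dim

# With a projection layer, the projected size is reported ...
assert SimpleProjectedEmbedder(1024, projection_dim=20).get_output_dim() == 20
# ... and without one, we fall back to the raw embedding size.
assert SimpleProjectedEmbedder(1024).get_output_dim() == 1024
```

Note that returning `self._projection.out_features` directly (as in the first version of this PR) would raise an `AttributeError` whenever no projection is configured, which is what the reviewer's comment points out.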

Tingkai Zhang added 3 commits Dec 23, 2018, including:

  - add output_dim attribute
  - add output_dim attribute in `ElmoTokenEmbedder`
@nelson-liu (Member) left a comment:

Thanks for the PR, this looks good to me (modulo lint fixes). You'll also have to pull the latest changes from master (you can just press the button in the PR status check box).

```diff
@@ -87,10 +87,12 @@ def test_forward_works_with_projection_layer(self):
         word2[2] = 1
         word2[3] = 0
         embedding_layer = ElmoTokenEmbedder.from_params(vocab=None, params=params)
+        assert embedding_layer.get_output_dim() == 20
```

@nelson-liu (Member) commented Dec 27, 2018:

Looks like there's some whitespace on this line that pylint doesn't like.

```diff
         input_tensor = torch.LongTensor([[word1, word2]])
         embedded = embedding_layer(input_tensor).data.numpy()
         assert embedded.shape == (1, 2, 20)
```

@nelson-liu (Member) commented Dec 27, 2018:

Likewise, looks like there's some whitespace on this line that pylint doesn't like.

```diff
     def get_output_dim(self):
-        return self._elmo.get_output_dim()
+        self.output_dim = self._elmo.get_output_dim()
```

@nelson-liu (Member) commented Dec 27, 2018:

Looks like there's some whitespace on this line that pylint doesn't like.

@tingkai-zhang (Author, Contributor) commented Dec 27, 2018:

Thanks! I should have checked for that whitespace problem.

@tingkai-zhang (Author, Contributor) commented Dec 28, 2018:

@nelson-liu Thanks for your review and help! I think I still failed the pylint test. It gives the following error detail:

```
[18:17:00] Using config file /local/deploy/agent2/work/98197cf33cb401e5/.pylintrc
[18:18:48] Process exited with code 1
```

Is there anything I should fix?

Tingkai Zhang and others added 3 commits Dec 27, 2018

@nelson-liu (Member) commented Dec 28, 2018:

I fixed it, see e57707a. Maybe your editor is inserting these whitespaces for some reason?

@nelson-liu merged commit 2f56765 into allenai:master Dec 28, 2018

3 checks passed:

  - Pull Requests (AllenNLP Library): TeamCity build finished
  - codecov/patch: 100% of diff hit (target 90%)
  - codecov/project: 92% (+<1%) compared to 42e0815
@nelson-liu (Member) commented Dec 28, 2018:

Thanks @tingkai-zhang!

@tingkai-zhang (Contributor, Author) commented Dec 28, 2018:

No worries! I appreciate your help!
