In train_model.py, `n_characters` is defined as 261. However, in the pretrained models' configs, `n_characters` is set to 262. Any particular reason?

Test model: https://raw.githubusercontent.com/allenai/bilm-tf/master/tests/fixtures/model/options.json
Pretrained model: https://s3-us-west-2.amazonaws.com/allennlp/models/elmo/2x4096_512_2048cnn_2xhighway/elmo_2x4096_512_2048cnn_2xhighway_options.json

Both models have `n_characters=262`.

Moreover, while loading a pretrained model, we increase the size by one to add padding:

bilm-tf/bilm/model.py
Line 220 in 81a4b54

But we already have a special char for padding:

bilm-tf/bilm/data.py
Line 120 in 81a4b54
See the comments I just added to the README that explain this: https://github.com/allenai/bilm-tf/blob/master/README.md#whats-the-deal-with-n_characters-and-padding
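To make the off-by-one concrete, here is a minimal sketch of the id layout. It assumes the convention used in `bilm/data.py` (ids 0–255 for raw UTF-8 bytes plus five special ids during training, with every id shifted up by one at inference so that 0 is free to act as padding); the helper name `to_inference_id` is hypothetical and not part of the library:

```python
# Training-time character vocabulary (mirrors the layout in bilm/data.py):
# ids 0-255 are raw UTF-8 byte values, followed by five special ids.
N_UTF8 = 256
BOS_CHAR, EOS_CHAR = 256, 257   # begin/end of sentence
BOW_CHAR, EOW_CHAR = 258, 259   # begin/end of word
PAD_CHAR = 260                  # padding within the character dimension

N_CHARACTERS_TRAIN = N_UTF8 + 5  # = 261, the value in train_model.py


def to_inference_id(train_id: int) -> int:
    """Hypothetical helper: at inference time every id is shifted up
    by one so that id 0 can serve as an extra padding/mask id."""
    return train_id + 1


# One extra slot for the reserved id 0 gives the inference-time count.
N_CHARACTERS_INFER = N_CHARACTERS_TRAIN + 1  # = 262, as in options.json
```

So the released configs say 262 because they describe the inference-time embedding table (with the extra reserved slot), while `train_model.py` uses the 261-entry training layout; the `+1` in `bilm/model.py` bridges the two.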