This repository has been archived by the owner on Dec 16, 2022. It is now read-only.
Update the base image in the Dockerfiles. #2298
Merged
Conversation
joelgrus approved these changes on Jan 7, 2019
WrRan pushed a commit to WrRan/allennlp that referenced this pull request on Jan 8, 2019
DeNeutoy pushed a commit that referenced this pull request on Jan 31, 2019
* Fix bug in uniform_unit_scaling #2239 (#2273)
* Fix type annotation for .forward(...) in tutorial (#2122)
* Add a Contributions section to README.md (#2277)
* Script for doing archive surgery (#2223)
  * script for doing archive surgery
  * simplify script
* Fix spelling in tutorial README (#2283)
* Fix #2285 (#2286)
* Update the `find-lr` subcommand help text. (#2289)
  * Update the elmo command help text.
  * Update the find-lr subcommand help text.
* Add __repr__ to Vocabulary (#2293)

  As it currently stands, the following is logged during training:

  ```
  2019-01-06 10:46:21,832 - INFO - allennlp.common.from_params - instantiating class <class 'allennlp.models.language_model.LanguageModel'> from params {'bidirectional': False, 'contextualizer': {'bidirectional': False, 'dropout': 0.5, 'hidden_size': 200, 'input_size': 200, 'num_layers': 2, 'type': 'lstm'}, 'dropout': 0.5, 'text_field_embedder': {'token_embedders': {'tokens': {'embedding_dim': 200, 'type': 'embedding'}}}} and extras {'vocab': <allennlp.data.vocabulary.Vocabulary object at 0x7ff7811665f8>}
  ```

  Note that the `Vocabulary` does not provide any useful information, since it doesn't have `__repr__` defined. This provides a fix.
* Update the base image in the Dockerfiles. (#2298)
* Don't deprecate bidirectional-language-model name (#2297)
* Bump version number to v0.8.1
* Bump version numbers to v0.8.2-unreleased
* Turn BidirectionalLM into a more-general LanguageModel class (#2264)

  Fixes #2255. This PR replaces the `BidirectionalLM` class with a more general `LanguageModel` that can be used in either the unidirectional/forward setting or the bidirectional setting. It also accordingly replaces the `BidirectionalLanguageModelTokenEmbedder` with a `LanguageModelTokenEmbedder`. Also fixes a bug in the experiment_unsampled.jsonnet config that was preventing a test from actually being unsampled.

  TODO:
  - [x] test the unidirectional case
  - [x] properly deprecate `BidirectionalLM` and `BidirectionalLanguageModelTokenEmbedder`
  - [x] check docs for accuracy
  - [x] fix user-facing training configs
* More help info
* Typo fix
* Add options '--inplace', '--force'
* Clearer help text
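The motivation for the `Vocabulary.__repr__` change can be illustrated with a minimal sketch. This is not the actual AllenNLP implementation; the attributes and namespace layout below are assumed for illustration only:

```python
class Vocabulary:
    """Minimal stand-in for a vocabulary class; the internals are illustrative."""

    def __init__(self, padding_token="@@PADDING@@", oov_token="@@UNKNOWN@@"):
        # One token-to-index mapping per namespace, as a plain dict of dicts.
        self._token_to_index = {"tokens": {padding_token: 0, oov_token: 1}}

    def __repr__(self):
        # Without this method, logging the object prints only the default
        # "<Vocabulary object at 0x...>", which carries no useful information.
        sizes = ", ".join(
            f"{name}: {len(mapping)} tokens"
            for name, mapping in self._token_to_index.items()
        )
        return f"Vocabulary with namespaces: {sizes}"


print(repr(Vocabulary()))
```

Once `__repr__` is defined, the training log's `extras {'vocab': ...}` line shows namespace sizes instead of a bare memory address.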
matt-gardner pushed a commit that referenced this pull request on Mar 18, 2019
* Move some utilities from allennlp/scripts to allennlp/allennlp/tools
* Make pylint happy
* Add modules to API doc
reiyw pushed a commit to reiyw/allennlp that referenced this pull request on Nov 12, 2019
TalSchuster pushed a commit to TalSchuster/allennlp-MultiLang that referenced this pull request on Feb 20, 2020
This also upgrades `pip` from 9 to 18.1. We've had pip warnings in our builds for a long time, and few of our users are likely running a version as old as the one in our continuous build.
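A change like this typically amounts to a couple of lines in each Dockerfile. The base-image tag below is illustrative only, not the exact image from this PR:

```
# Hypothetical base-image bump; the tag is an example, not the PR's actual value.
FROM python:3.6.8-stretch

# Pin the pip upgrade explicitly so builds stop emitting old-pip warnings
# and stay reproducible across rebuilds.
RUN pip install --upgrade pip==18.1
```

Pinning the pip version in the `RUN` line (rather than a bare `--upgrade`) keeps image builds deterministic even after newer pip releases ship.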