Rework DistilBERT docstrings for progressive disclosure of complexity. #881
Conversation
This looks great! I see a few spots to fix, but can do that as I merge this. Thanks
replaced with a random token from the vocabulary. A selected token
will be left as is with probability
`1 - mask_token_rate - random_token_rate`.
Call arguments:
add newline
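For context, the probability split that docstring describes can be sketched as a small standalone helper (a hypothetical illustration, not keras_nlp's actual masking implementation):

```python
import random

def choose_mask_action(mask_token_rate=0.8, random_token_rate=0.1):
    """Decide what happens to a token selected for masking.

    Mirrors the docstring: replace with the mask token with probability
    `mask_token_rate`, replace with a random vocabulary token with
    probability `random_token_rate`, and leave the token as is with
    probability `1 - mask_token_rate - random_token_rate`.
    """
    r = random.random()
    if r < mask_token_rate:
        return "mask"
    if r < mask_token_rate + random_token_rate:
        return "random"
    return "unchanged"
```

With the defaults above, a selected token is left unchanged 10% of the time, matching the `1 - 0.8 - 0.1` formula in the docstring.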
left-to-right manner and fills up the buckets until we run
out of budget. It supports an arbitrary number of segments.
Call arguments:
decrease indent
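The left-to-right, budget-limited packing the docstring describes can be illustrated with a minimal sketch (a hypothetical helper, not the actual packer code):

```python
def pack_segments(segments, budget):
    """Greedily fill a token budget left to right across segments.

    Earlier segments keep as many tokens as fit; once the budget is
    exhausted, later segments are truncated to empty. Works for an
    arbitrary number of segments, as the docstring states.
    """
    packed = []
    remaining = budget
    for seg in segments:
        take = min(len(seg), remaining)
        packed.append(seg[:take])
        remaining -= take
    return packed
```

For example, packing three segments of lengths 3, 3, and 1 into a budget of 5 keeps the first segment whole, truncates the second, and empties the third.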
# Load the preprocessor from a preset.
preprocessor = keras_nlp.models.DistilBertPreprocessor.from_preset("distil_bert_base_en_uncased")
preprocessor = keras_nlp.models.DistilBertPreprocessor.from_preset(
    "distil_bert_base_en_uncased"
indent
Example usage:
Raw string inputs and pretrained backbone.
Raw string data.
This still needs some updates to match the new style.
keras-team#881)
* Reworked distil_bert docstrings.
* Fixed typos.
* Fixed typo in DistilBERT MaskedLM Preprocessor
* Updated distil_bert_classifier.py
* Added DistilBertPreprocessor to docs.
* Formatted using black.
* A few edits
* Another fix
Co-authored-by: Matt Watson <mattdangerw@gmail.com>
Partially fixes: #867
Gist of all docstring snippets.