
[BIT-525] Tokenizer improvements (whitespace-preserving) #856

Merged · 3 commits merged into Synapse on Jul 27, 2022

Conversation

@opentaco (Contributor) commented Jul 27, 2022

BIT-525 Tokenizer improvements (whitespace-preserving)

Adds support for non-whitespace-preserving tokenizers.

Prepends strings with a space when the tokenizer is not whitespace-preserving (e.g. BERT), so that its token strings mimic whitespace-preserving tokenizations more often than not.
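The space-prepend idea can be sketched in a few lines. This is a hypothetical illustration, not the PR's actual `prep_tokenizer` implementation: a tokenizer that discards leading whitespace (like BERT's WordPiece) tends to produce token strings closer to those of a whitespace-preserving tokenizer (like GPT-2's BPE, which folds the leading space into the first token) if the input already starts with a space.

```python
def prep_text(text: str) -> str:
    """Hypothetical sketch of the space-prepend idea from this PR.

    For a non-whitespace-preserving tokenizer (e.g. BERT), prepend a
    single space so its token strings line up more often with those of
    a whitespace-preserving tokenizer (e.g. GPT-2, which encodes the
    leading space into the first token).
    """
    return text if text.startswith(" ") else " " + text

print(prep_text("hello world"))  # " hello world"
print(prep_text(" already ok"))  # " already ok" (unchanged)
```

The real `prep_tokenizer` in `bittensor/utils/tokenizer_utils.py` does more than this; the sketch only captures the normalization step described above.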

Adds error catching on a per-UID basis in shapley_base so that synapse validation is more robust.
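The per-UID error-catching pattern might look like the following. This is a minimal sketch under assumed names (`validate_responses`, `score_fn` are illustrative, not the PR's actual shapley_base code): a failure while scoring one UID's response is isolated to that UID instead of aborting the whole validation pass.

```python
def validate_responses(uids, responses, score_fn):
    """Hypothetical sketch of per-UID error catching: an exception
    raised while scoring one UID's response is caught and recorded as
    a zero score, so the remaining UIDs are still validated."""
    scores = {}
    for uid, response in zip(uids, responses):
        try:
            scores[uid] = score_fn(response)
        except Exception:
            scores[uid] = 0.0  # isolate the failure to this UID
    return scores

# UID 2 returns None, so len(None) raises and is caught per-UID.
scores = validate_responses([1, 2], ["ok", None], lambda r: len(r))
print(scores)  # {1: 2, 2: 0.0}
```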

@opentaco opentaco requested a review from Eugene-hu July 27, 2022 11:04
@coveralls commented Jul 27, 2022

Pull Request Test Coverage Report for Build 2e371905-94af-4e6c-b746-2528dc9f4f3b

  • 19 of 22 (86.36%) changed or added relevant lines in 2 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage increased (+0.05%) to 64.876%

| Changes Missing Coverage | Covered Lines | Changed/Added Lines | % |
|---|---|---|---|
| bittensor/utils/tokenizer_utils.py | 16 | 19 | 84.21% |

Totals (Coverage Status):
  • Change from base Build 1bfda1b1-1953-44ee-b76e-ec437b11e0ff: +0.05%
  • Covered Lines: 3936
  • Relevant Lines: 6067

💛 - Coveralls

@Eugene-hu (Contributor) left a comment:

LGTM!

@@ -18,7 +18,9 @@
 # DEALINGS IN THE SOFTWARE.
 
+from transformers import AutoTokenizer
 import bittensor
+from bittensor.utils.tokenizer_utils import prep_tokenizer
Contributor:

Would we expect users to use the prep_tokenizer function a lot? If so, it might be good to have it as a class function as well

@opentaco (Contributor, Author):

Not really, it's just to prepare the server tokenizer (and the bittensor.tokenizer gets prepared automatically). So clients wouldn't need to run it, for instance.

@opentaco opentaco merged commit df06000 into Synapse Jul 27, 2022
@ifrit98 ifrit98 deleted the BIT-525-tokenizer-improvements branch May 24, 2023 14:44
3 participants