
Enable batch processing in scriptable tokenizer example #2130

Merged
2 commits merged into master on Feb 16, 2023

Conversation

mreso
Collaborator

@mreso mreso commented Feb 15, 2023

Description

Enables the processing of multiple examples in one batch for the scriptable tokenizer example:
serve/examples/text_classification_with_scriptable_tokenizer
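To illustrate what "processing multiple examples in one batch" means for a TorchServe-style handler, here is a minimal, hypothetical sketch (not the PR's actual diff; class and method bodies are illustrative, and the fake "model" just scores text length). The key point is that `preprocess` receives a list with one entry per request in the batch, and `postprocess` must return exactly one result per request:

```python
# Hypothetical batch-aware handler sketch. TorchServe calls
# preprocess() with a list of request dicts (one per batched request);
# a batch-aware handler keeps the list structure end to end.

class BatchTextHandler:
    """Simplified stand-in for a TorchServe custom handler."""

    def preprocess(self, requests):
        # Extract one text per request; decode raw bytes if needed.
        texts = []
        for req in requests:
            data = req.get("data") or req.get("body")
            if isinstance(data, (bytes, bytearray)):
                data = data.decode("utf-8")
            texts.append(data)
        return texts  # length == batch size

    def inference(self, texts):
        # Placeholder for the scripted model call; here each "logit"
        # pair is derived from the text length purely for illustration.
        return [[float(len(t)), -float(len(t))] for t in texts]

    def postprocess(self, outputs):
        # One result dict per request in the batch.
        return [
            {"label": "pos" if row[0] >= row[1] else "neg"}
            for row in outputs
        ]


handler = BatchTextHandler()
batch = [{"data": b"good movie"}, {"data": b"bad"}]
results = handler.postprocess(handler.inference(handler.preprocess(batch)))
```

A handler that assumes `requests` holds a single item silently drops the rest of the batch, which is the failure mode this change guards against.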

Fixes #(issue)

Type of change


  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

Feature/Issue validation/testing

Please describe the unit or integration tests that you ran to verify your changes and summarize the relevant results. Provide instructions so the tests can be reproduced, and list any relevant details of your test configuration.

  • Test A
cd serve/examples/text_classification_with_scriptable_tokenizer
wget https://bert-mar-file.s3.us-west-2.amazonaws.com/text_classification_with_scriptable_tokenizer/model.pt

python script_tokenizer_and_model.py model.pt model_jit.pt

# add print of model input and output in handler.py for test

torch-model-archiver --model-name scriptable_tokenizer --version 1.0 --serialized-file model_jit.pt --handler handler.py --extra-files "index_to_name.json"

mkdir model_store
mv scriptable_tokenizer.mar model_store/

torchserve --start --model-store model_store --ncs

curl -v -X POST "http://localhost:8081/models?initial_workers=1&url=scriptable_tokenizer.mar&batch_size=8"

curl http://127.0.0.1:8080/predictions/scriptable_tokenizer -T sample_text.txt &
curl http://127.0.0.1:8080/predictions/scriptable_tokenizer -T sample_text.txt &
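The two backgrounded curl calls arrive close together, so the frontend can group them into a single batch (up to `batch_size`, within the default batch delay). The same idea in Python, sketched with a stubbed `send` function so it runs without a live TorchServe instance (the URL in the comment and the response keys are illustrative):

```python
# Sketch: fire requests concurrently so TorchServe can batch them.
# In real use, send() would POST the text to
# http://127.0.0.1:8080/predictions/scriptable_tokenizer; here it is
# stubbed so the sketch runs standalone.
from concurrent.futures import ThreadPoolExecutor

def send(text: str) -> dict:
    # Stub for: requests.post(url, data=text).json()
    # Response keys are illustrative, not the example's exact output.
    return {"Negative": 0.0001, "Positive": 0.9999}

texts = ["the rock is destined to be the 21st century 's new conan ..."] * 2
with ThreadPoolExecutor(max_workers=2) as pool:
    responses = list(pool.map(send, texts))
```

Sending the requests sequentially would usually produce batches of size 1, so concurrency is what actually exercises the batching path.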

Logs for Test A

2023-02-15T06:17:58,582 [WARN ] W-9000-scriptable_tokenizer_1.0-stderr MODEL_LOG -   data = F.softmax(data)
2023-02-15T06:17:58,582 [INFO ] W-9000-scriptable_tokenizer_1.0-stdout MODEL_LOG - ["the rock is destined to be the 21st century 's new `` conan '' and that he 's going to make a splash even greater than arnold schwarzenegger , jean-claud van damme or steven segal .\n", "the rock is destined to be the 21st century 's new `` conan '' and that he 's going to make a splash even greater than arnold schwarzenegger , jean-claud van damme or steven segal .\n"]
2023-02-15T06:17:58,583 [INFO ] W-9000-scriptable_tokenizer_1.0-stdout MODEL_LOG - tensor([[-4.2065,  4.3875],
2023-02-15T06:17:58,583 [INFO ] W-9000-scriptable_tokenizer_1.0-stdout MODEL_LOG -         [-4.2065,  4.3875]])
2023-02-15T06:17:58,583 [INFO ] W-9000-scriptable_tokenizer_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 2782

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

@agunapal
Collaborator

@mreso , Looks like this might be updated in the base_handler for the next release. #2121

@mreso
Collaborator Author

mreso commented Feb 15, 2023 via email

@mreso mreso force-pushed the feature/batching_in_scriptable_tokenizer branch from 856b077 to bfd2b85 on February 16, 2023 at 00:35
@codecov

codecov bot commented Feb 16, 2023

Codecov Report

Merging #2130 (92971cb) into master (8b3ae1e) will not change coverage.
The diff coverage is n/a.

❗ Current head 92971cb differs from pull request most recent head 15cc1dd. Consider uploading reports for the commit 15cc1dd to get more accurate results

@@           Coverage Diff           @@
##           master    #2130   +/-   ##
=======================================
  Coverage   53.36%   53.36%           
=======================================
  Files          71       71           
  Lines        3225     3225           
  Branches       56       56           
=======================================
  Hits         1721     1721           
  Misses       1504     1504           


@mreso mreso requested a review from msaroufim February 16, 2023 01:56
@msaroufim msaroufim merged commit 485ebf8 into master Feb 16, 2023