TP llama with continuous batching #2709

Merged: 21 commits into master on Dec 14, 2023

Conversation

@mreso (Collaborator) commented Oct 12, 2023

Description

Adds support for serving a tensor-parallel (TP) Llama model with continuous batching, so that new requests can join the in-flight batch between decode steps instead of waiting for the whole batch to finish (see the sketch below).

Fixes #(issue)
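
For orientation, here is a minimal sketch of the continuous-batching scheduling idea; all names and structure here are hypothetical, illustrating the technique rather than the handler's actual code:

```python
# Illustrative sketch of continuous batching (hypothetical names, not
# TorchServe APIs): finished sequences leave the batch and queued
# requests join it between decode steps.
from collections import deque
from dataclasses import dataclass


@dataclass
class Request:
    req_id: int
    max_new_tokens: int
    generated: int = 0


def decode_step(batch):
    """Stand-in for one forward pass of the (tensor-parallel) model."""
    for req in batch:
        req.generated += 1  # pretend one token was emitted per sequence


def serve(requests, max_batch_size=4):
    queue = deque(requests)
    batch = []
    while queue or batch:
        # Admit waiting requests into free slots before every step.
        while queue and len(batch) < max_batch_size:
            batch.append(queue.popleft())
        decode_step(batch)
        # Evict finished sequences so their slots free up immediately.
        batch = [r for r in batch if r.generated < r.max_new_tokens]


serve([Request(i, max_new_tokens=2 + i % 5) for i in range(10)])
```

The point of the scheme is that batch slots are recycled at every decode step, which keeps the GPUs busy under streaming traffic instead of idling until the slowest sequence in a static batch completes.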

Type of change


  • New feature (non-breaking change which adds functionality)

Feature/Issue validation/testing


  • pytest test/pytest/test_tp_llama.py

============================= test session starts =============================
platform linux -- Python 3.10.12, pytest-7.3.1, pluggy-1.3.0
rootdir: /home/ubuntu/serve
plugins: mock-3.10.0, cov-4.1.0
collected 8 items

test/pytest/test_tp_llama.py ........                                   [100%]

============================== warnings summary ===============================
test/pytest/test_tp_llama.py::test_handler
  /home/ubuntu/serve/ts/torch_handler/base_handler.py:13: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    from pkg_resources import packaging

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
==================== 8 passed, 1 warning in 62.65s (0:01:02) ==================
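
The tests above exercise the tensor-parallel (TP) Llama path. As a rough single-process illustration of the weight sharding that TP is built on (an assumed sketch, not this PR's code; a real deployment runs one process per GPU and performs an actual all-reduce), a transformer MLP can split its first weight by columns and its second by rows, with the per-shard partial outputs summed at the end:

```python
# Hypothetical single-process stand-in for tensor parallelism: w1 is
# column-sharded, w2 is row-sharded, and summing the partial outputs
# emulates the cross-GPU all-reduce.
import torch

torch.manual_seed(0)
d_model, d_ff, tp_degree = 8, 16, 2
x = torch.randn(1, d_model)
w1 = torch.randn(d_model, d_ff)   # column-parallel weight
w2 = torch.randn(d_ff, d_model)   # row-parallel weight

reference = torch.relu(x @ w1) @ w2  # unsharded forward pass

w1_shards = w1.chunk(tp_degree, dim=1)
w2_shards = w2.chunk(tp_degree, dim=0)
partials = [torch.relu(x @ a) @ b for a, b in zip(w1_shards, w2_shards)]
sharded = sum(partials)  # stands in for the all-reduce

assert torch.allclose(reference, sharded, atol=1e-5)
```

Because ReLU is elementwise, the column split of the first matmul commutes with the activation, so the sharded result recomposes exactly to the unsharded one.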

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?

@codecov (bot) commented Oct 12, 2023

Codecov Report

Merging #2709 (97d5a32) into master (7f4419f) will decrease coverage by 0.08%.
The diff coverage is n/a.

❗ Current head 97d5a32 differs from pull request most recent head b7230ac. Consider uploading reports for the commit b7230ac to get more accurate results

@@            Coverage Diff             @@
##           master    #2709      +/-   ##
==========================================
- Coverage   72.44%   72.36%   -0.08%     
==========================================
  Files          85       85              
  Lines        3963     3963              
  Branches       58       58              
==========================================
- Hits         2871     2868       -3     
- Misses       1088     1091       +3     
  Partials        4        4              

see 1 file with indirect coverage changes


@mreso marked this pull request as ready for review October 30, 2023 18:24
@mreso changed the title from "[WIP] TP llama with continuous batching" to "TP llama with continuous batching" Oct 30, 2023
@HamidShojanazeri (Collaborator) left a comment

Thanks @mreso, LGTM, and the test ran fine too.

pytest test/pytest/test_tp_llama.py::test_continuous_batching_tp_llama

============================= test session starts =============================
platform linux -- Python 3.10.13, pytest-7.4.3, pluggy-1.3.0
rootdir: /home/ubuntu/serve
collected 1 item

test/pytest/test_tp_llama.py .                                          [100%]

============================== 1 passed in 26.15s =============================

@HamidShojanazeri added this pull request to the merge queue Dec 14, 2023
Merged via the queue into master with commit df94a56 Dec 14, 2023
13 checks passed
@chauhang added this to the v0.10.0 milestone Feb 27, 2024
@lxning mentioned this pull request Feb 28, 2024