
Add prediction logging to AWS Lambda deployment #790

Merged
merged 14 commits into bentoml:master on Jun 16, 2020

Conversation

jackyzha0 (Collaborator)

Description

  • fix some basic typos in the Lambda utils
  • make the error for a bad Python version during Lambda deploys more descriptive
  • add prediction logging to the AWS Lambda endpoint

Motivation and Context

How Has This Been Tested?

[Screenshot: Screen Shot 2020-06-12 at 10 34 50 AM]

Types of changes

  • Breaking change (fix or feature that would cause existing functionality to change)
  • New feature (non-breaking change which adds functionality)
  • Bug fix (non-breaking change which fixes an issue)
  • Documentation
  • Test, CI, or build
  • None

Components (if applicable)

  • BentoService (model packaging, dependency management, handler definition)
  • Model Artifact (model serialization, multi-framework support)
  • Model Server (micro-batching, logging, metrics, tracing, benchmark, OpenAPI)
  • YataiService (model management, deployment automation)
  • Documentation

Checklist:

  • My code follows the bentoml code style; both the ./dev/format.sh and
    ./dev/lint.sh scripts have passed
    (instructions).
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.
  • My change requires a change in bentoml/gallery example notebooks
  • I have sent a pull request to bentoml/gallery to make that change

parano and others added 4 commits June 9, 2020 21:20
* make sagemaker docker image have same file structure

* use consistent file names

* move operator code out of __init__ to avoid loading unused code in model server startup

* refactor deployment validator

* reorganize bento repository code

* deployment validator test & linting error fix

* more repository code cleanup

* renaming and adding inline comments

* move out lambda operator code to separate file
@parano parano requested a review from yubozhao June 12, 2020 18:21
@parano parano changed the title Lambda logging Add prediction logging to AWS Lambda deployment Jun 12, 2020
parano (Member) commented Jun 12, 2020

Hi @jackyzha0, thanks for the PR! Looks like the AWS Lambda related unit test is failing; you can find more details via the Travis CI link below: https://travis-ci.org/github/bentoml/BentoML/jobs/697719556

You can also run that test locally to see if you can reproduce the error:

 $ pytest tests/deployment/aws_lambda/test_aws_lambda_deployment_operator.py

-    return bento_service_api.handle_aws_lambda_event(event)
+    print(f'Got prediction request with body "{event["body"]}"')
+    prediction = bento_service_api.handle_aws_lambda_event(event)
+    print(

Contributor:

This is great.

We can take this one step further and emit the log in JSON format with both the input and output. Users can then stream the logs out to ElasticSearch for searching and/or use the results for training later on.

You can check out the current prediction log format in the BentoAPIServer for reference.
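
To make the idea concrete, here is a rough sketch of emitting one JSON record per request; the helper name and field names are only illustrative, not BentoML's actual prediction-log format:

import json
import logging

logger = logging.getLogger(__name__)


def log_prediction(event, prediction):
    # One JSON record per request, pairing the input with the output, so the
    # log stream can be shipped to ElasticSearch or replayed as training data.
    record = {
        'request_body': event.get('body'),
        'response': prediction,
    }
    logger.info(json.dumps(record, default=str))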

pep8speaks commented Jun 15, 2020

Hello @jackyzha0, thanks for updating this PR.

There are currently no PEP 8 issues detected in this PR. Cheers! 🍻

Comment last updated at 2020-06-16 00:39:35 UTC

parano (Member) commented Jun 15, 2020

Hi @jackyzha0, looks like CI failed on the code formatting check; you will need to run the ./dev/format.sh script again to format your code.

@@ -45,11 +49,11 @@
 if not os.path.exists(bento_bundle_path):
     bento_bundle_path = os.path.join('/tmp/requirements', bento_name)

-print(f'Loading BentoService bundle from path: "{bento_bundle_path}"')
+logger.info('Loading BentoService bundle from path: "%s"', bento_bundle_path)

Member:

How about changing all of these to logger.debug and keeping only the prediction log at logger.info?

import logging

logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)

Member:

After bentoml is imported below, the logging format and level may get reset, so the setLevel here probably isn't needed.

Could you move logger = logging.getLogger(__name__) to after the from bentoml import load line, and keep the logging messages in download_extra_resources as plain print calls?

It is a bit tricky here because the bentoml package may not be available to import before the download_extra_resources call, as the package file might be stored in S3 and need to be downloaded first.
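
Roughly the ordering being suggested, with download_extra_resources stubbed out here just to show the import sequence rather than the real implementation:

import os


def download_extra_resources():
    # Stand-in for the real helper: it may need to fetch the bundle (and
    # possibly bentoml itself) from S3 into /tmp, so it can only use print().
    print('Downloading extra resources before importing bentoml...')


download_extra_resources()

os.environ['BENTOML_HOME'] = '/tmp/bentoml/'
from bentoml import load  # noqa

# Importing bentoml may reset the logging format and level, so create the
# module logger only after the import, without an explicit setLevel call.
import logging  # noqa

logger = logging.getLogger(__name__)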

@@ -38,18 +40,21 @@
 os.environ['BENTOML_HOME'] = '/tmp/bentoml/'
 from bentoml import load # noqa

+logger = logging.getLogger(__name__)
+logger.setLevel(logging.DEBUG)

Member:

I think we can remove this and hide debug logs by default; those logs are usually only useful for BentoML developers/contributors debugging the feature, not so much for BentoML users. For BentoML developers, we can always enable debug logging with a simple environment variable change: BENTOML__LOGGING__LOGGING_LEVEL=debug.
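
For example, a BentoML developer could flip the level back on before bentoml is imported; this assumes the override is read at import time, and setting the variable in the Lambda function's environment configuration works the same way:

import os

# BENTOML__LOGGING__LOGGING_LEVEL overrides the logging level configured in
# default_bentoml.cfg; it has to be set before bentoml loads its config.
os.environ['BENTOML__LOGGING__LOGGING_LEVEL'] = 'debug'

import bentoml  # noqa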

Collaborator (Author):

What is the default logging level for BentoML? Is it INFO? When I don't explicitly set the logging level to DEBUG or INFO, my logger.info() statements don't show up.

parano (Member), Jun 16, 2020:

The default is INFO: https://github.com/bentoml/BentoML/blob/master/bentoml/configuration/default_bentoml.cfg#L30

I just realized this will not use the logger configured for the bentoml module, because lambda_app.py is copied to a separate directory when deployed to AWS Lambda. To use BentoML's logging config here, you will need something like logger = logging.getLogger('bentoml.lambda_app').
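
That is, something along these lines (a sketch, assuming bentoml has already been imported so its logging handlers are configured):

import logging

# lambda_app.py is copied out of the bentoml package at deploy time, so
# __name__ would not fall under the 'bentoml' namespace. Naming the logger
# explicitly attaches it to bentoml's logger hierarchy and configuration.
logger = logging.getLogger('bentoml.lambda_app')
logger.info('Got prediction request with body "%s"', '{"text": "example"}')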

codecov bot commented Jun 16, 2020

Codecov Report

Merging #790 into master will increase coverage by 0.14%.
The diff coverage is 86.66%.


@@            Coverage Diff             @@
##           master     #790      +/-   ##
==========================================
+ Coverage   55.75%   55.89%   +0.14%     
==========================================
  Files         114      114              
  Lines        8346     8400      +54     
==========================================
+ Hits         4653     4695      +42     
- Misses       3693     3705      +12     
Impacted Files Coverage Δ
bentoml/yatai/deployment/aws_lambda/operator.py 58.73% <ø> (ø)
bentoml/yatai/deployment/aws_lambda/utils.py 27.77% <66.66%> (ø)
bentoml/yatai/deployment/aws_lambda/lambda_app.py 89.47% <91.66%> (-0.85%) ⬇️
bentoml/saved_bundle/config.py 90.74% <0.00%> (-4.11%) ⬇️
bentoml/yatai/repository/metadata_store.py 66.03% <0.00%> (-1.68%) ⬇️
bentoml/adapters/dataframe_input.py 80.00% <0.00%> (-0.44%) ⬇️
bentoml/handlers/__init__.py 100.00% <0.00%> (ø)
bentoml/adapters/base_input.py 64.40% <0.00%> (ø)
bentoml/configuration/__init__.py 78.37% <0.00%> (+0.60%) ⬆️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 09f51aa...319d340.

logger.info('Got prediction request with body "%s"', event["body"])
prediction = bento_service_api.handle_aws_lambda_event(event)

logger.debug(

parano (Member), Jun 16, 2020:

Following the comment about logging levels above, I think this one should be updated to logger.info, and all the others can be set to either logger.debug or logger.error; that way the user gets a very clean and usable prediction log.

Collaborator (Author):

Would line 76 and line 90 be logger.debug then?

Collaborator (Author):

Actually, it's redundant information, so I'll just remove those lines.

@parano parano added the LGTM label Jun 16, 2020
@parano
Copy link
Member

parano commented Jun 16, 2020

Thanks for updating the PR, @jackyzha0! I ran the Lambda end-to-end tests on my end and verified that the prediction JSON logs show up in AWS CloudWatch. Everything looks great, merging now!

@parano parano merged commit 2f138fc into bentoml:master Jun 16, 2020
aarnphm pushed a commit to aarnphm/BentoML that referenced this pull request Jul 29, 2022
* Repository and Deployment refactor and cleanup (bentoml#771)

* make sagemaker docker image have same file structure

* use consistent file names

* move operator code out of __init__ to avoid loading unused code in model server startup

* refactor deployment validator

* reorganize bento repository code

* deployment validator test & linting error fix

* more repository code cleanup

* renaming and adding inline comments

* move out lambda operator code to separate file

* fix some typos + adding logging to lambda callback

* fix some linting problems

* fix mock not returning proper form

* add trailing comma

* add better logging + docs

* whitespace fixes

* move logging and consolidate debug line

* update docs + fixed logging

* fix linting issues

* remove redundant logging and switch to bento logger

* update docs

Co-authored-by: Chaoyu <paranoyang@gmail.com>
Co-authored-by: cory <cory.massaro@gmail.com>
Development

Successfully merging this pull request may close these issues.

Prediction log in AWS Lambda deployment
5 participants