Unable to change prediction log format #1464
Comments
Hi @gregd33 - have you had a chance to try out the new logging config as documented here? https://docs.bentoml.org/en/latest/guides/logging.html
I just gave that a try and ran into an issue. With that, same behaviour - it didn't work.
Just to be sure it wasn't me, I built the IrisClassifier in https://github.com/bentoml/BentoML/blob/master/guides/quick-start/ and ran it. Then ran it where the logging file was the default with one line changed.
I got this in the logs:
when I would only expect `service_name`. So this should be fully reproducible by anyone.
@ssheng Any ideas?
Thanks for reporting, @gregd33. I was able to reproduce the case and confirm the format string got configured correctly. However, it doesn't seem like
Thanks for the patience, @gregd33. It turned out that the APIs in Python:
Through https://docs.bentoml.org/en/latest/guides/logging.html:
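For anyone landing on this thread from search, the underlying mechanics matter here: a format string with a custom field like `%(service_name)s` only renders if that field is actually present on the log record. Below is a minimal sketch using only Python's standard `logging` module (plain stdlib usage, not BentoML's internal API):

```python
import io
import logging

# Build a logger whose formatter expects a custom "service_name" field.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(service_name)s"))
logger = logging.getLogger("prediction_demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.propagate = False

# Custom fields must be supplied via `extra`; without them the
# formatter cannot render the record.
logger.info("prediction made", extra={"service_name": "IrisClassifier"})

captured = stream.getvalue()
print(captured)  # -> IrisClassifier
```

The same principle applies to any custom field in a logging format string, whichever layer of configuration sets it.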
Awesome, thanks so much! Would never have figured that out.
So the primary reason for raising this was that it wasn't working as expected, so I wanted to ensure I wasn't doing something wrong / flag it as a bug. My use case was more so seeing if it was possible rather than having a firm need for it. That said, the two reasons I might have:
It is not necessarily a crucial issue, especially if it introduces a lot of complexity to the code. I'm fine if it is not fixed (other than the documentation to reflect that :) ). Alternatively, a simpler, more narrow approach might be to make outputting the result configurable: `BentoML/bentoml/service/inference_api.py`, line 284 in 33fa8b2.
So omitting that line if some config argument is present would make the change much easier. But then it is a config change for a specific thing as opposed to a generic one, so that can be a bad thing too. In short: not a big deal if it isn't fixed.
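The "omit the result behind a config flag" idea could be sketched roughly as follows. This is purely illustrative: `include_result` is a hypothetical flag and `build_prediction_log` a hypothetical helper, not existing BentoML code.

```python
# Hypothetical sketch of a config-gated prediction log entry.
# Function and flag names are illustrative, not BentoML's actual API.
def build_prediction_log(request_id: str, service_name: str,
                         result, include_result: bool = True) -> dict:
    entry = {"request_id": request_id, "service_name": service_name}
    if include_result:
        # Result payloads can be large or sensitive, so make them opt-out.
        entry["result"] = result
    return entry

# With the flag off, the result field is simply omitted:
print(build_prediction_log("req-1", "IrisClassifier", [0, 1], include_result=False))
# -> {'request_id': 'req-1', 'service_name': 'IrisClassifier'}
```

The design trade-off is exactly the one noted above: a dedicated flag is easy to implement but special-cases one field, whereas a fully configurable format is generic but touches more of the logging pipeline.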
Thanks for the feedback. I agree with the two reasons you shared. Given the urgency, the solution in #1478 as a workaround is not ideal. We will take a step back and think about the logging requirements as a whole. |
Describe the bug
I've tried various methods to set `prediction_log_json_format`: `bentoml config set`, etc. Some of them seem to work, at least when looking at `bentoml config view-effective`, but none actually give the expected log format.

To Reproduce
1. `bentoml config set logging.prediction_log_json_format="%%(service_name)s"`, OR in the service itself via `bentoml.config().set('logging', 'prediction_log_json_format', "%%(service_name)s")`
2. `bentoml config view-effective` should show that it is there
3. `bentoml serve-gunicorn --workers 1 MyService:latest --debug`
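An aside on the doubled percent signs in the commands above: BentoML's config appears to follow `configparser` conventions (the `bentoml.config().set('logging', ...)` call mirrors that API), and under configparser's default interpolation a literal `%` must be escaped as `%%`. A quick stdlib illustration of that escaping, assuming an INI-style config:

```python
import configparser

cfg = configparser.ConfigParser()
# Under BasicInterpolation, "%" is special, so a literal percent
# sign in a value has to be written as "%%".
cfg.read_string("""
[logging]
prediction_log_json_format = %%(service_name)s
""")

# Reading the value collapses the escape back to a single "%":
fmt = cfg.get("logging", "prediction_log_json_format")
print(fmt)  # -> %(service_name)s
```

So the doubled `%%` in the repro commands is expected input, and the effective value should come back as `%(service_name)s`.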
It's worth mentioning that when you run in `debug` mode and you see the config, it doesn't show the one I set (default or otherwise).

Expected behavior
Logs should be of the format I specify
Screenshots/Logs
Environment:
Additional context
The service itself is using `ImageInput` and `JsonOutput`, if that matters.

I've confirmed that in https://github.com/bentoml/BentoML/blob/4c825fd359267ee333925514e144ef1b39e5f47e/bentoml/utils/log.py#L32 it sees the right format.