
upgrade dockerfile to py38 #1435

Merged: 6 commits into master, Feb 18, 2022
Conversation

@lxning (Collaborator) commented Feb 16, 2022

Description

Upgrade the TorchServe Dockerfiles (./docker/Dockerfile and ./docker/Dockerfile.dev) to Python 3.8.

Fixes #1432

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Feature/Issue validation/testing

  • UT/IT execution results

  • Logs
    ./build_image.sh -g -cv cu102
    [+] Building 23.7s (23/23) FINISHED
    => [internal] load build definition from Dockerfile 0.0s
    => => transferring dockerfile: 4.36kB 0.0s
    => [internal] load .dockerignore 0.0s
    => => transferring context: 2B 0.0s
    => resolve image config for docker.io/docker/dockerfile:experi 0.2s
    => CACHED docker-image://docker.io/docker/dockerfile:experimen 0.0s
    => [internal] load metadata for docker.io/nvidia/cuda:10.2-cud 0.1s
    => [internal] load build context 0.0s
    => => transferring context: 80B 0.0s
    => [compile-image 1/8] FROM docker.io/nvidia/cuda:10.2-cudnn7- 0.0s
    => CACHED [compile-image 2/8] RUN --mount=type=cache,id=apt-de 0.0s
    => CACHED [compile-image 3/8] RUN update-alternatives --instal 0.0s
    => CACHED [compile-image 4/8] RUN python3.8 -m venv /home/venv 0.0s
    => CACHED [compile-image 5/8] RUN python -m pip install -U pip 0.0s
    => CACHED [compile-image 6/8] RUN export USE_CUDA=1 0.0s
    => CACHED [compile-image 7/8] RUN TORCH_VER=$(curl --silent -- 0.0s
    => CACHED [compile-image 8/8] RUN python -m pip install -U set 0.0s
    => CACHED [runtime-image 2/9] RUN --mount=type=cache,target=/v 0.0s
    => CACHED [runtime-image 3/9] RUN useradd -m model-server 0.0s
    => [runtime-image 4/9] COPY --chown=model-server --from=compi 10.3s
    => [runtime-image 5/9] COPY dockerd-entrypoint.sh /usr/local/b 0.0s
    => [runtime-image 6/9] RUN chmod +x /usr/local/bin/dockerd-ent 0.3s
    => [runtime-image 7/9] COPY config.properties /home/model-serv 0.0s
    => [runtime-image 8/9] RUN mkdir /home/model-server/model-stor 0.4s
    => [runtime-image 9/9] WORKDIR /home/model-server 0.0s
    => exporting to image 12.0s
    => => exporting layers 12.0s
    => => writing image sha256:ad936561d7f9f00f1b39635fbd9b9c84e1f 0.0s
    => => naming to docker.io/pytorch/torchserve:latest-gpu 0.0s

docker run --rm -it -p 8080:8080 -p 8081:8081 -p 8082:8082 -p 7070:7070 -p 7071:7071 pytorch/torchserve:latest-gpu
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
2022-02-16T04:44:05,444 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Initializing plugins manager...
2022-02-16T04:44:05,574 [INFO ] main org.pytorch.serve.ModelServer -
Torchserve version: 0.5.2
TS Home: /home/venv/lib/python3.8/site-packages
Current directory: /home/model-server
Temp directory: /home/model-server/tmp
Number of GPUs: 0
Number of CPUs: 32
Max heap size: 30688 M
Python executable: /home/venv/bin/python
Config file: /home/model-server/config.properties
Inference address: http://0.0.0.0:8080
Management address: http://0.0.0.0:8081
Metrics address: http://0.0.0.0:8082
Model Store: /home/model-server/model-store
Initial Models: N/A
Log dir: /home/model-server/logs
Metrics dir: /home/model-server/logs
Netty threads: 32
Netty client threads: 0
Default workers per model: 32
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Limit Maximum Image Pixels: true
Prefer direct buffer: false
Allowed Urls: [file://.|http(s)?://.]
Custom python dependency for model allowed: false
Metrics report format: prometheus
Enable metrics API: true
Workflow Store: /home/model-server/model-store
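
As a quick, hedged sanity check (not part of the test plan above), the interpreter inside the built image can be queried directly; the entrypoint override below is only an assumed way to poke at the image, while the /home/venv/bin/python path is taken from the "Python executable" line in the log.

    # Sanity check (the --entrypoint override is an assumption, not part of this PR);
    # /home/venv/bin/python matches the "Python executable" line in the log above.
    docker run --rm --entrypoint /home/venv/bin/python pytorch/torchserve:latest-gpu --version
    # If the upgrade took effect, this should report Python 3.8.x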

Checklist:

Test results for docker dev
docker run --rm -it --gpus '"device=1,2"' -p 8080:8080 -p 8081:8081 -p 8082:8082 -p 7070:7070 -p 7071:7071 pytorch/torchserve:dev-gpu
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
2022-02-18T01:21:18,649 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Initializing plugins manager...
2022-02-18T01:21:18,802 [INFO ] main org.pytorch.serve.ModelServer -
Torchserve version: 0.5.2
TS Home: /home/venv/lib/python3.8/site-packages
Current directory: /home/model-server
Temp directory: /home/model-server/tmp
Number of GPUs: 2
Number of CPUs: 32
Max heap size: 30688 M
Python executable: /home/venv/bin/python
Config file: /home/model-server/config.properties
Inference address: http://0.0.0.0:8080
Management address: http://0.0.0.0:8081
Metrics address: http://0.0.0.0:8082
Model Store: /home/model-server/model-store
Initial Models: N/A
Log dir: /home/model-server/logs
Metrics dir: /home/model-server/logs
Netty threads: 32
Netty client threads: 0
Default workers per model: 2
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Limit Maximum Image Pixels: true
Prefer direct buffer: false
Allowed Urls: [file://.|http(s)?://.]
Custom python dependency for model allowed: false
Metrics report format: prometheus
Enable metrics API: true
Workflow Store: /home/model-server/model-store
Model config: N/A
2022-02-18T01:21:18,809 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Loading snapshot serializer plugin...
2022-02-18T01:21:18,839 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel.
2022-02-18T01:21:18,913 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://0.0.0.0:8080
2022-02-18T01:21:18,913 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: EpollServerSocketChannel.
2022-02-18T01:21:18,914 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://0.0.0.0:8081
2022-02-18T01:21:18,914 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: EpollServerSocketChannel.
2022-02-18T01:21:18,916 [INFO ] main org.pytorch.se

  • Have you added tests that prove your fix is effective or that this feature works?
  • New and existing unit tests pass locally with these changes?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

Notes: status of all current Dockerfiles

./docker/Dockerfile : upgraded to py38 in this PR (see the sketch after this list).
./docker/Dockerfile.dev : upgraded to py38 in this PR.
./docker/Dockerfile.neuron.dev : will be deprecated in a separate PR, so no upgrade.
./docker/Dockerfile.benchmark : will be deprecated in a separate PR, so no upgrade.
./kubernetes/kserve/Dockerfile : based on ./docker/Dockerfile, no need to update.
./kubernetes/kserve/Dockerfile.dev : no issue.
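
For orientation only, here is a minimal sketch of what the py38-related steps in the compile stage of ./docker/Dockerfile can look like, reconstructed from the cached compile-image steps in the build log above; the base image, apt package names, and exact update-alternatives arguments are assumptions rather than the PR's actual diff.

    # Hedged sketch, not the PR's actual diff: base image, package names and flags
    # are assumptions; only the step order mirrors the build log above.
    ARG BASE_IMAGE
    FROM ${BASE_IMAGE} AS compile-image

    # Install Python 3.8 plus venv support (assumed apt package names)
    RUN apt-get update \
        && apt-get install -y --no-install-recommends \
           python3.8 python3.8-dev python3.8-venv curl \
        && rm -rf /var/lib/apt/lists/*

    # Point python3 at 3.8 (mirrors the "update-alternatives --instal..." step)
    RUN update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.8 1

    # Create the virtual environment that the runtime image later copies
    RUN python3.8 -m venv /home/venv
    ENV PATH="/home/venv/bin:$PATH"

    # Upgrade pip/setuptools inside the venv (mirrors "python -m pip install -U pip")
    RUN python -m pip install -U pip setuptools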

@lxning lxning self-assigned this Feb 16, 2022
@lxning lxning added the bug Something isn't working label Feb 16, 2022
@lxning lxning added this to In Review in v0.5.3 lifecycle Feb 16, 2022
@lxning lxning added this to the v0.5.3 milestone Feb 16, 2022
@sagemaker-neo-ci-bot

AWS CodeBuild CI Report (Commit ID: 15606e5)

  • torch-serve-build-win: SUCCEEDED
  • torch-serve-build-cpu: SUCCEEDED
  • torch-serve-build-gpu: SUCCEEDED
  • Build Logs (available for 30 days)

@HamidShojanazeri (Collaborator) left a comment

Thanks @lxning, the changes might apply to the docker dev file as well.

@sagemaker-neo-ci-bot

AWS CodeBuild CI Report (Commit ID: 028f3d6)

  • torch-serve-build-win: SUCCEEDED
  • torch-serve-build-cpu: SUCCEEDED
  • torch-serve-build-gpu: SUCCEEDED
  • Build Logs (available for 30 days)

@maaquib (Collaborator) commented Feb 16, 2022

> Thanks @lxning, the changes might apply to the docker dev file as well.

We should keep the version consistent in other places as well: install_dependencies, benchmark, and kfserving.

@msaroufim (Member) left a comment

+1 on @maaquib's comment
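
One hedged way to audit the other places @maaquib mentions is a repository-wide search for hard-coded 3.6 pins; the directories beyond docker/ and kubernetes/ are assumptions about the repo layout, not something stated in this thread.

    # Rough audit for leftover Python 3.6 pins; paths other than docker/ and
    # kubernetes/ are assumed locations of the install_dependencies, benchmark and kfserving bits.
    grep -rn -e 'python3\.6' -e 'python36' \
      docker/ kubernetes/ benchmarks/ ts_scripts/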

@sagemaker-neo-ci-bot

AWS CodeBuild CI Report (Commit ID: c417a29)

  • torch-serve-build-win: SUCCEEDED
  • torch-serve-build-cpu: SUCCEEDED
  • torch-serve-build-gpu: SUCCEEDED
  • Build Logs (available for 30 days)

@sagemaker-neo-ci-bot

AWS CodeBuild CI Report (Commit ID: cc658a2)

  • torch-serve-build-win: SUCCEEDED
  • Build Logs (available for 30 days)

@msaroufim msaroufim self-requested a review February 18, 2022 02:01
@sagemaker-neo-ci-bot

AWS CodeBuild CI Report (Commit ID: cc658a2)

  • torch-serve-build-cpu: SUCCEEDED
  • torch-serve-build-gpu: SUCCEEDED
  • Build Logs (available for 30 days)

@sagemaker-neo-ci-bot

AWS CodeBuild CI Report (Commit ID: 528bbca)

  • torch-serve-build-win: SUCCEEDED
  • torch-serve-build-cpu: SUCCEEDED
  • torch-serve-build-gpu: SUCCEEDED
  • Build Logs (available for 30 days)

@sagemaker-neo-ci-bot

AWS CodeBuild CI Report (Commit ID: 19f5e8d)

  • torch-serve-build-win: SUCCEEDED
  • torch-serve-build-cpu: SUCCEEDED
  • torch-serve-build-gpu: SUCCEEDED
  • Build Logs (available for 30 days)

@maaquib maaquib merged commit fe07ed4 into master Feb 18, 2022

@msaroufim msaroufim changed the title upgrade to py38 upgrade dockerfile to py38 Feb 18, 2022
@msaroufim msaroufim deleted the issue_1432 branch June 16, 2022 01:38