Commit 06a3ce0

.Net: Fixed warning in release pipeline about Docker base image in examples (microsoft#6340)
### Motivation and Context

There is a warning about using the `python:3.12` base image in the release pipeline for one of our demo applications. I moved the content of the Dockerfile to the README, so it can still be used as an example.

### Contribution Checklist

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄
1 parent b250109 commit 06a3ce0

File tree

2 files changed (+34, −21 lines)

dotnet/samples/Demos/QualityCheck/README.md

Lines changed: 34 additions & 4 deletions
````diff
@@ -3,6 +3,7 @@
 This sample provides a practical demonstration how to perform quality check on LLM results for such tasks as text summarization and translation with Semantic Kernel Filters.
 
 Metrics used in this example:
+
 - [BERTScore](https://github.com/Tiiiger/bert_score) - leverages the pre-trained contextual embeddings from BERT and matches words in candidate and reference sentences by cosine similarity.
 - [BLEU](https://en.wikipedia.org/wiki/BLEU) (BiLingual Evaluation Understudy) - evaluates the quality of text which has been machine-translated from one natural language to another.
 - [METEOR](https://en.wikipedia.org/wiki/METEOR) (Metric for Evaluation of Translation with Explicit ORdering) - evaluates the similarity between the generated summary and the reference summary, taking into account grammar and semantics.
````
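As background on what metrics like BLEU measure, here is a minimal, illustrative sketch of clipped n-gram precision, the core idea behind BLEU. This is a toy standing in for the real libraries the demo uses (it omits BLEU's brevity penalty and multi-n averaging):

```python
from collections import Counter

def ngram_precision(candidate: str, reference: str, n: int = 2) -> float:
    """Toy clipped n-gram precision: the fraction of the candidate's n-grams
    that also appear in the reference, with counts clipped so a repeated
    n-gram cannot be credited more times than it occurs in the reference."""
    cand = candidate.split()
    ref = reference.split()
    cand_ngrams = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
    ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
    overlap = sum(min(count, ref_ngrams[ng]) for ng, count in cand_ngrams.items())
    total = sum(cand_ngrams.values())
    return overlap / total if total else 0.0

# 3 of the 5 candidate bigrams appear in the reference -> 0.6
print(ngram_precision("the cat sat on the mat", "the cat is on the mat"))
```

The real metrics above add substantially more machinery (contextual embeddings for BERTScore, stemming and synonym matching for METEOR), but all compare a candidate text against a reference.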
````diff
@@ -14,7 +15,7 @@ In this example, SK Filters call dedicated [server](./python-server/) which is r
 
 ## Prerequisites
 
-1. [Python 3.12](https://www.python.org/downloads/)
+1. [Python 3.12](https://www.python.org/downloads/)
 2. Get [Hugging Face API token](https://huggingface.co/docs/api-inference/en/quicktour#get-your-api-token).
 3. Accept conditions to access [Unbabel/wmt22-cometkiwi-da](https://huggingface.co/Unbabel/wmt22-cometkiwi-da) model on Hugging Face portal.
````

````diff
@@ -25,29 +26,34 @@ It's possible to run Python server for task evaluation directly or with Docker.
 ### Run server
 
 1. Open Python server directory:
+
 ```bash
 cd python-server
 ```
 
 2. Create and active virtual environment:
+
 ```bash
 python -m venv venv
 source venv/Scripts/activate # activate on Windows
 source venv/bin/activate # activate on Unix/MacOS
 ```
 
 3. Setup Hugging Face API key:
+
 ```bash
 pip install "huggingface_hub[cli]"
 huggingface-cli login --token <your_token>
 ```
 
 4. Install dependencies:
+
 ```bash
 pip install -r requirements.txt
 ```
 
 5. Run server:
+
 ```bash
 cd app
 uvicorn main:app --port 8080 --reload
````
````diff
@@ -58,18 +64,42 @@ uvicorn main:app --port 8080 --reload
 ### Run server with Docker
 
 1. Open Python server directory:
+
 ```bash
 cd python-server
 ```
 
-2. Create `.env/hf_token.txt` file and put Hugging Face API token in it.
+2. Create following `Dockerfile`:
+
+```dockerfile
+# syntax=docker/dockerfile:1.2
+FROM python:3.12
+
+WORKDIR /code
+
+COPY ./requirements.txt /code/requirements.txt
+
+RUN pip install "huggingface_hub[cli]"
+RUN --mount=type=secret,id=hf_token \
+    huggingface-cli login --token $(cat /run/secrets/hf_token)
+
+RUN pip install cmake
+RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt
+
+COPY ./app /code/app
+
+CMD ["fastapi", "run", "app/main.py", "--port", "80"]
+```
+
+3. Create `.env/hf_token.txt` file and put Hugging Face API token in it.
+
+4. Build image and run container:
 
-3. Build image and run container:
 ```bash
 docker-compose up --build
 ```
 
-4. Open `http://localhost:8080/docs` and check available endpoints.
+5. Open `http://localhost:8080/docs` and check available endpoints.
 
 ## Testing
 
````
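Once the server is running (directly or in Docker), its endpoints can be exercised from any HTTP client, not just the `/docs` page. A minimal stdlib-only client sketch; note the `/bert-score/` path and the payload field names here are illustrative assumptions, not taken from the demo's actual API:

```python
import json
import urllib.request

def build_request(url: str, sources: list[str], summaries: list[str]) -> urllib.request.Request:
    """Build a POST request carrying the texts to evaluate as a JSON body."""
    body = json.dumps({"sources": sources, "summaries": summaries}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}, method="POST"
    )

req = build_request(
    "http://localhost:8080/bert-score/",  # hypothetical endpoint name
    sources=["Full article text..."],
    summaries=["Short summary..."],
)
# response = urllib.request.urlopen(req)  # requires the server from the steps above to be running
```

In the demo itself it is the Semantic Kernel filter on the .NET side that makes this call; the sketch only shows the shape of such a request.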

dotnet/samples/Demos/QualityCheck/python-server/Dockerfile

Lines changed: 0 additions & 17 deletions
This file was deleted.

0 commit comments
