6 changes: 1 addition & 5 deletions .github/workflows/docker-compose-e2e.yml
@@ -80,11 +80,7 @@ jobs:
if: cancelled() || failure()
run: |
cd ${{ github.workspace }}/$example/docker/$hardware
-container_list=$(cat docker_compose.yaml | grep container_name | cut -d':' -f2)
-for container_name in $container_list; do
-cid=$(docker ps -aq --filter "name=$container_name")
-if [[ ! -z "$cid" ]]; then docker stop $cid && docker rm $cid && sleep 1s; fi
-done
+docker compose stop && docker compose rm -f
echo y | docker system prune

- name: Publish pipeline artifact
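Note: the net effect of this hunk is that the job stops parsing `container_name:` entries and stopping each container by hand, and instead lets Compose tear down everything it created. A minimal side-by-side sketch of the two approaches, paraphrased from the hunk above:

```bash
# Old approach (removed): parse container names out of the compose file
# and stop/remove each matching container individually.
for container_name in $(grep container_name compose.yaml | cut -d':' -f2); do
  cid=$(docker ps -aq --filter "name=$container_name")
  if [[ -n "$cid" ]]; then
    docker stop "$cid" && docker rm "$cid"
  fi
done

# New approach: let Compose stop and remove everything it started.
docker compose stop && docker compose rm -f
```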
2 changes: 1 addition & 1 deletion AudioQnA/docker/gaudi/README.md
@@ -81,7 +81,7 @@ export LLM_SERVICE_PORT=3007

```bash
cd GenAIExamples/AudioQnA/docker/gaudi/
-docker compose -f docker_compose.yaml up -d
+docker compose up -d
```

## 🚀 Test MicroServices
2 changes: 1 addition & 1 deletion AudioQnA/docker/xeon/README.md
@@ -81,7 +81,7 @@ export LLM_SERVICE_PORT=3007

```bash
cd GenAIExamples/AudioQnA/docker/xeon/
-docker compose -f docker_compose.yaml up -d
+docker compose up -d
```

## 🚀 Test MicroServices
18 changes: 7 additions & 11 deletions AudioQnA/tests/test_audioqna_on_gaudi.sh
@@ -58,15 +58,15 @@ function start_services() {
if [[ "$IMAGE_REPO" != "" ]]; then
# Replace the container name with a test-specific name
echo "using image repository $IMAGE_REPO and image tag $IMAGE_TAG"
-sed -i "s#image: opea/audioqna:latest#image: opea/audioqna:${IMAGE_TAG}#g" docker_compose.yaml
-sed -i "s#image: opea/audioqna-ui:latest#image: opea/audioqna-ui:${IMAGE_TAG}#g" docker_compose.yaml
-sed -i "s#image: opea/*#image: ${IMAGE_REPO}opea/#g" docker_compose.yaml
-echo "cat docker_compose.yaml"
-cat docker_compose.yaml
+sed -i "s#image: opea/audioqna:latest#image: opea/audioqna:${IMAGE_TAG}#g" compose.yaml
+sed -i "s#image: opea/audioqna-ui:latest#image: opea/audioqna-ui:${IMAGE_TAG}#g" compose.yaml
+sed -i "s#image: opea/*#image: ${IMAGE_REPO}opea/#g" compose.yaml
+echo "cat compose.yaml"
+cat compose.yaml
fi

# Start Docker Containers
-docker compose -f docker_compose.yaml up -d
+docker compose up -d
n=0
until [[ "$n" -ge 500 ]]; do
docker logs tgi-gaudi-server > $LOG_PATH/tgi_service_start.log
@@ -125,11 +125,7 @@ function validate_megaservice() {

function stop_docker() {
cd $WORKPATH/docker/gaudi
-container_list=$(cat docker_compose.yaml | grep container_name | cut -d':' -f2)
-for container_name in $container_list; do
-cid=$(docker ps -aq --filter "name=$container_name")
-if [[ ! -z "$cid" ]]; then docker stop $cid && docker rm $cid && sleep 1s; fi
-done
+docker compose stop && docker compose rm -f
}

function main() {
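Note: `start_services` above polls the TGI container log until it prints `Connected`, retrying up to 500 times. The same pattern, factored into a reusable helper — a minimal sketch, with the function name and defaults being illustrative rather than part of this PR:

```bash
# Poll a container's log until a readiness marker appears, or give up.
wait_for_log() {
  local container=$1 marker=$2 retries=${3:-500}
  local n=0
  until [[ "$n" -ge "$retries" ]]; do
    if docker logs "$container" 2>&1 | grep -q "$marker"; then
      return 0
    fi
    sleep 1
    n=$((n + 1))
  done
  return 1
}

# Usage mirroring the Gaudi test above:
wait_for_log tgi-gaudi-server Connected 500
```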
37 changes: 17 additions & 20 deletions AudioQnA/tests/test_audioqna_on_xeon.sh
@@ -54,29 +54,30 @@ function start_services() {
if [[ "$IMAGE_REPO" != "" ]]; then
# Replace the container name with a test-specific name
echo "using image repository $IMAGE_REPO and image tag $IMAGE_TAG"
-sed -i "s#image: opea/audioqna:latest#image: opea/audioqna:${IMAGE_TAG}#g" docker_compose.yaml
-sed -i "s#image: opea/audioqna-ui:latest#image: opea/audioqna-ui:${IMAGE_TAG}#g" docker_compose.yaml
-sed -i "s#image: opea/*#image: ${IMAGE_REPO}opea/#g" docker_compose.yaml
-echo "cat docker_compose.yaml"
-cat docker_compose.yaml
+sed -i "s#image: opea/audioqna:latest#image: opea/audioqna:${IMAGE_TAG}#g" compose.yaml
+sed -i "s#image: opea/audioqna-ui:latest#image: opea/audioqna-ui:${IMAGE_TAG}#g" compose.yaml
+sed -i "s#image: opea/*#image: ${IMAGE_REPO}opea/#g" compose.yaml
+echo "cat compose.yaml"
+cat compose.yaml
fi

# Start Docker Containers
-docker compose -f docker_compose.yaml up -d
-n=0
-until [[ "$n" -ge 200 ]]; do
-docker logs tgi-service > $LOG_PATH/tgi_service_start.log
-if grep -q Connected $LOG_PATH/tgi_service_start.log; then
-break
-fi
-sleep 1s
-n=$((n+1))
-done
+docker compose up -d
+n=0
+until [[ "$n" -ge 200 ]]; do
+docker logs tgi-service > $LOG_PATH/tgi_service_start.log
+if grep -q Connected $LOG_PATH/tgi_service_start.log; then
+break
+fi
+sleep 1s
+n=$((n+1))
+done
}


function validate_megaservice() {
result=$(http_proxy="" curl http://${ip_address}:3008/v1/audioqna -XPOST -d '{"audio": "UklGRigAAABXQVZFZm10IBIAAAABAAEARKwAAIhYAQACABAAAABkYXRhAgAAAAEA", "max_tokens":64}' -H 'Content-Type: application/json')
+echo $result
if [[ $result == *"AAA"* ]]; then
echo "Result correct."
else
@@ -113,11 +113,7 @@ function validate_megaservice() {

function stop_docker() {
cd $WORKPATH/docker/xeon
-container_list=$(cat docker_compose.yaml | grep container_name | cut -d':' -f2)
-for container_name in $container_list; do
-cid=$(docker ps -aq --filter "name=$container_name")
-if [[ ! -z "$cid" ]]; then docker stop $cid && docker rm $cid && sleep 1s; fi
-done
+docker compose stop && docker compose rm -f
}

function main() {
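Note: both test scripts prepare `compose.yaml` the same way before `docker compose up`: first pin the application images to the CI tag, then point every `opea/` image at the CI registry. A condensed sketch with example values (`IMAGE_REPO` and `IMAGE_TAG` are placeholders; the scripts' `opea/*` pattern matches `opea` plus any number of slashes in basic regex, so plain `opea/` expresses the same intent):

```bash
IMAGE_REPO=registry.example.com/   # example value; note the trailing slash
IMAGE_TAG=ci-1234                  # example value

# Pin the application images to the CI build tag...
sed -i "s#image: opea/audioqna:latest#image: opea/audioqna:${IMAGE_TAG}#g" compose.yaml
sed -i "s#image: opea/audioqna-ui:latest#image: opea/audioqna-ui:${IMAGE_TAG}#g" compose.yaml
# ...then prefix every opea/ image with the CI registry.
sed -i "s#image: opea/#image: ${IMAGE_REPO}opea/#g" compose.yaml
```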
16 changes: 8 additions & 8 deletions ChatQnA/README.md
@@ -28,7 +28,7 @@ docker pull opea/chatqna:latest

Two types of UI are supported now; choose the one you like and pull the corresponding docker image.

-If you choose conversational UI, follow the [instruction](https://github.com/opea-project/GenAIExamples/tree/main/ChatQnA/docker/gaudi#-launch-the-conversational-ui-optional) and modify the [docker_compose.yaml](./docker/xeon/docker_compose.yaml).
+If you choose conversational UI, follow the [instruction](https://github.com/opea-project/GenAIExamples/tree/main/ChatQnA/docker/gaudi#-launch-the-conversational-ui-optional) and modify the [compose.yaml](./docker/xeon/compose.yaml).

```bash
docker pull opea/chatqna-ui:latest
@@ -74,11 +74,11 @@ source ./docker/gpu/set_env.sh

## Deploy ChatQnA on Gaudi

-Please find corresponding [docker_compose.yaml](./docker/gaudi/docker_compose.yaml).
+Please find corresponding [compose.yaml](./docker/gaudi/compose.yaml).

```bash
cd GenAIExamples/ChatQnA/docker/gaudi/
-docker compose -f docker_compose.yaml up -d
+docker compose up -d
```

> Notice: Currently only the <b>Habana Driver 1.16.x</b> is supported for Gaudi.
@@ -87,11 +87,11 @@ Please refer to the [Gaudi Guide](./docker/gaudi/README.md) to build docker images

## Deploy ChatQnA on Xeon

-Please find corresponding [docker_compose.yaml](./docker/xeon/docker_compose.yaml).
+Please find corresponding [compose.yaml](./docker/xeon/compose.yaml).

```bash
cd GenAIExamples/ChatQnA/docker/xeon/
-docker compose -f docker_compose.yaml up -d
+docker compose up -d
```

Refer to the [Xeon Guide](./docker/xeon/README.md) for more instructions on building docker images from source.
@@ -100,7 +100,7 @@ Refer to the [Xeon Guide](./docker/xeon/README.md) for more instructions on building docker images from source.

```bash
cd GenAIExamples/ChatQnA/docker/gpu/
-docker compose -f docker_compose.yaml up -d
+docker compose up -d
```

Refer to the [NVIDIA GPU Guide](./docker/gpu/README.md) for more instructions on building docker images from source.
@@ -153,6 +153,6 @@ If you choose conversational UI, use this URL: `http://{host_ip}:5174`
http_proxy="" curl ${host_ip}:6006/embed -X POST -d '{"inputs":"What is Deep Learning?"}' -H 'Content-Type: application/json'
```

-2. (Docker only) If all microservices work well, please check the port ${host_ip}:8888, the port may be allocated by other users, you can modify the `docker_compose.yaml`.
+2. (Docker only) If all microservices work well, please check the port ${host_ip}:8888, the port may be allocated by other users, you can modify the `compose.yaml`.

-3. (Docker only) If you get errors like "The container name is in use", please change container name in `docker_compose.yaml`.
+3. (Docker only) If you get errors like "The container name is in use", please change container name in `compose.yaml`.
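Note: for troubleshooting items 2 and 3 above, a quick triage sketch (the container name is a placeholder, not taken from this repository):

```bash
PORT=8888                    # ChatQnA megaservice port from item 2
NAME=chatqna-backend-server  # placeholder: use the name from the error message

ss -ltn | grep ":${PORT}"               # is the port already taken by another process?
docker ps -a --filter "name=${NAME}"    # which container holds the conflicting name?
docker rm -f "${NAME}"                  # free the name, or change container_name / ports in compose.yaml instead
```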
2 changes: 1 addition & 1 deletion ChatQnA/chatqna.yaml
@@ -45,7 +45,7 @@ opea_micro_services:
ports: ${RERANK_SERVICE_PORT}
image: opea/reranking-tei:latest
endpoint: /v1/reranking
-tgi_service:
+tgi-service:
host: ${TGI_SERVICE_IP}
ports: ${TGI_SERVICE_PORT}
image: ghcr.io/huggingface/tgi-gaudi:2.0.1
4 changes: 2 additions & 2 deletions ChatQnA/docker/aipc/README.md
@@ -79,7 +79,7 @@ Then run the command `docker images`, you will have the following 7 Docker Images

### Setup Environment Variables

-Since the `docker_compose.yaml` will consume some environment variables, you need to setup them in advance as below.
+Since the `compose.yaml` will consume some environment variables, you need to setup them in advance as below.

**Export the value of the public IP address of your AIPC to the `host_ip` environment variable**

@@ -160,7 +160,7 @@ Note: Please replace with `host_ip` with you external IP address, do not use localhost.

```bash
cd GenAIExamples/ChatQnA/docker/aipc/
-docker compose -f docker_compose.yaml up -d
+docker compose up -d

# let ollama service runs
# e.g. ollama run llama3
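Note: on AIPC the stack expects an Ollama model to be serving after the containers come up, as the trailing comments in the hunk suggest. A minimal sketch of the full sequence (`llama3` is the example model from the comment):

```bash
cd GenAIExamples/ChatQnA/docker/aipc/
docker compose up -d

# Keep an Ollama model running for the LLM microservice, e.g.:
ollama run llama3
```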
20 changes: 10 additions & 10 deletions ChatQnA/docker/gaudi/README.md
@@ -177,7 +177,7 @@ If Guardrails docker image is built, you will find one more image:

### Setup Environment Variables

-Since the `docker_compose.yaml` will consume some environment variables, you need to setup them in advance as below.
+Since the `compose.yaml` will consume some environment variables, you need to setup them in advance as below.

```bash
export no_proxy=${your_no_proxy}
@@ -227,32 +227,32 @@ cd GenAIExamples/ChatQnA/docker/gaudi/
If using TGI for the LLM backend:

```bash
-docker compose -f docker_compose.yaml up -d
+docker compose -f compose.yaml up -d
```

If using vLLM for the LLM backend:

```bash
-docker compose -f docker_compose_vllm.yaml up -d
+docker compose -f compose_vllm.yaml up -d
```

If using vLLM-on-Ray for the LLM backend:

```bash
-docker compose -f docker_compose_vllm_ray.yaml up -d
+docker compose -f compose_vllm_ray.yaml up -d
```

If using Ray Serve for the LLM backend:

```bash
-docker compose -f docker_compose_ray_serve.yaml up -d
+docker compose -f compose_ray_serve.yaml up -d
```

If you want to enable the Guardrails microservice in the pipeline, use the command below instead:

```bash
cd GenAIExamples/ChatQnA/docker/gaudi/
-docker compose -f docker_compose_guardrails.yaml up -d
+docker compose -f compose_guardrails.yaml up -d
```

### Validate MicroServices and MegaService
@@ -426,7 +426,7 @@ curl http://${host_ip}:9090/v1/guardrails\

## Enable LangSmith for Monitoring Application (Optional)

-LangSmith offers tools to debug, evaluate, and monitor language models and intelligent agents. It can be used to assess benchmark data for each microservice. Before launching your services with `docker compose -f docker_compose.yaml up -d`, you need to enable LangSmith tracing by setting the `LANGCHAIN_TRACING_V2` environment variable to true and configuring your LangChain API key.
+LangSmith offers tools to debug, evaluate, and monitor language models and intelligent agents. It can be used to assess benchmark data for each microservice. Before launching your services with `docker compose -f compose.yaml up -d`, you need to enable LangSmith tracing by setting the `LANGCHAIN_TRACING_V2` environment variable to true and configuring your LangChain API key.

Here's how you can do it:

@@ -445,7 +445,7 @@ export LANGCHAIN_API_KEY=ls_...

## 🚀 Launch the UI

-To access the frontend, open the following URL in your browser: http://{host_ip}:5173. By default, the UI runs on port 5173 internally. If you prefer to use a different host port to access the frontend, you can modify the port mapping in the `docker_compose.yaml` file as shown below:
+To access the frontend, open the following URL in your browser: http://{host_ip}:5173. By default, the UI runs on port 5173 internally. If you prefer to use a different host port to access the frontend, you can modify the port mapping in the `compose.yaml` file as shown below:

```yaml
chaqna-gaudi-ui-server:
@@ -463,7 +463,7 @@ Here is an example of running ChatQnA:

## 🚀 Launch the Conversational UI (Optional)

-To access the Conversational UI (react based) frontend, modify the UI service in the `docker_compose.yaml` file. Replace `chaqna-gaudi-ui-server` service with the `chatqna-gaudi-conversation-ui-server` service as per the config below:
+To access the Conversational UI (react based) frontend, modify the UI service in the `compose.yaml` file. Replace `chaqna-gaudi-ui-server` service with the `chatqna-gaudi-conversation-ui-server` service as per the config below:

```yaml
chaqna-gaudi-conversation-ui-server:
@@ -481,7 +481,7 @@ chaqna-gaudi-conversation-ui-server:
restart: always
```

-Once the services are up, open the following URL in your browser: http://{host_ip}:5174. By default, the UI runs on port 80 internally. If you prefer to use a different host port to access the frontend, you can modify the port mapping in the `docker_compose.yaml` file as shown below:
+Once the services are up, open the following URL in your browser: http://{host_ip}:5174. By default, the UI runs on port 80 internally. If you prefer to use a different host port to access the frontend, you can modify the port mapping in the `compose.yaml` file as shown below:

```yaml
chaqna-gaudi-conversation-ui-server:
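Note: both UI sections above change only the host side of the `ports:` mapping. Assuming the default `"5173:5173"` mapping for the Svelte UI, the same edit can be scripted — a one-line sketch, with host port 8080 as an arbitrary example:

```bash
# Expose the UI on host port 8080; the container port stays 5173.
sed -i 's#"5173:5173"#"8080:5173"#' compose.yaml
```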
16 changes: 8 additions & 8 deletions ChatQnA/docker/gaudi/how_to_validate_service.md
@@ -17,15 +17,15 @@ start the docker containers

```
cd ./GenAIExamples/ChatQnA/docker/gaudi
-docker compose -f ./docker_compose.yaml up -d
+docker compose up -d
```

-Check the start up log by `docker compose -f ./docker/gaudi/docker_compose.yaml logs`.
-Where the docker_compose.yaml file is the mega service docker-compose configuration.
+Check the start up log by `docker compose -f ./docker/gaudi/compose.yaml logs`.
+Where the compose.yaml file is the mega service docker-compose configuration.
The warning messages point out the variables that are **NOT** set.

```
-ubuntu@gaudi-vm:~/GenAIExamples/ChatQnA/docker/gaudi$ docker compose -f ./docker_compose.yaml up -d
+ubuntu@gaudi-vm:~/GenAIExamples/ChatQnA/docker/gaudi$ docker compose -f ./compose.yaml up -d
WARN[0000] The "LANGCHAIN_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_TRACING_V2" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_API_KEY" variable is not set. Defaulting to a blank string.
@@ -34,7 +34,7 @@ WARN[0000] The "LANGCHAIN_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_TRACING_V2" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_TRACING_V2" variable is not set. Defaulting to a blank string.
-WARN[0000] /home/ubuntu/GenAIExamples/ChatQnA/docker/gaudi/docker_compose.yaml: `version` is obsolete
+WARN[0000] /home/ubuntu/GenAIExamples/ChatQnA/docker/gaudi/compose.yaml: `version` is obsolete
```

## 2. Check the docker container status
@@ -118,7 +118,7 @@ Check the log by `docker logs f7a08f9867f9 -t`.

The log indicates the MODEL_ID is not set.

-View the docker input parameters in `./ChatQnA/docker/gaudi/docker_compose.yaml`
+View the docker input parameters in `./ChatQnA/docker/gaudi/compose.yaml`

```
tgi-service:
@@ -146,10 +146,10 @@ The input MODEL_ID is `${LLM_MODEL_ID}`
Check that the environment variable `LLM_MODEL_ID` is set and spelled correctly.
Set the LLM_MODEL_ID then restart the containers.

-Also you can check overall logs with the following command, where the docker_compose.yaml is the mega service docker-compose configuration file.
+Also you can check overall logs with the following command, where the compose.yaml is the mega service docker-compose configuration file.

```
-docker compose -f ./docker-composer/gaudi/docker_compose.yaml logs
+docker compose -f ./docker-composer/gaudi/compose.yaml logs
```

## 4. Check each micro service used by the Mega Service
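Note: the validation flow in this guide reduces to three commands — container status, one container's log, and the aggregate Compose log. A minimal sketch (the container ID is the example from the guide; run the last command from the compose directory):

```bash
docker ps -a                   # section 2: check container status
docker logs f7a08f9867f9 -t    # section 3: inspect one failing container, with timestamps
docker compose logs            # overall log of the mega service
```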
8 changes: 4 additions & 4 deletions ChatQnA/docker/gpu/README.md
@@ -78,7 +78,7 @@ Then run the command `docker images`, you will have the following 7 Docker Images

### Setup Environment Variables

-Since the `docker_compose.yaml` will consume some environment variables, you need to setup them in advance as below.
+Since the `compose.yaml` will consume some environment variables, you need to setup them in advance as below.

```bash
export no_proxy=${your_no_proxy}
@@ -110,7 +110,7 @@ Note: Please replace with `host_ip` with you external IP address, do **NOT** use localhost.

```bash
cd GenAIExamples/ChatQnA/docker/gpu/
-docker compose -f docker_compose.yaml up -d
+docker compose up -d
```

### Validate MicroServices and MegaService
@@ -245,7 +245,7 @@ curl -X POST "http://${host_ip}:6009/v1/dataprep/delete_file" \

## Enable LangSmith for Monitoring Application (Optional)

-LangSmith offers tools to debug, evaluate, and monitor language models and intelligent agents. It can be used to assess benchmark data for each microservice. Before launching your services with `docker compose -f docker_compose.yaml up -d`, you need to enable LangSmith tracing by setting the `LANGCHAIN_TRACING_V2` environment variable to true and configuring your LangChain API key.
+LangSmith offers tools to debug, evaluate, and monitor language models and intelligent agents. It can be used to assess benchmark data for each microservice. Before launching your services with `docker compose -f compose.yaml up -d`, you need to enable LangSmith tracing by setting the `LANGCHAIN_TRACING_V2` environment variable to true and configuring your LangChain API key.

Here's how you can do it:

@@ -264,7 +264,7 @@ export LANGCHAIN_API_KEY=ls_...

## 🚀 Launch the UI

-To access the frontend, open the following URL in your browser: http://{host_ip}:5173. By default, the UI runs on port 5173 internally. If you prefer to use a different host port to access the frontend, you can modify the port mapping in the `docker_compose.yaml` file as shown below:
+To access the frontend, open the following URL in your browser: http://{host_ip}:5173. By default, the UI runs on port 5173 internally. If you prefer to use a different host port to access the frontend, you can modify the port mapping in the `compose.yaml` file as shown below:

```yaml
chaqna-ui-server:
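Note: the LangSmith sections in this and the Gaudi README reduce to two environment variables exported before bringing the stack up. A minimal sketch (the key shown is a placeholder):

```bash
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=ls_...   # your LangChain API key
docker compose up -d
```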