
Commit bc4bbfa

Fix issues with the VisualQnA instructions (#809)

Signed-off-by: Dina Suehiro Jones <dina.s.jones@intel.com>
Signed-off-by: dmsuehir <dina.s.jones@intel.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

1 parent: edcc50f

File tree

3 files changed: +23 -13 lines


VisualQnA/Dockerfile

Lines changed: 1 addition & 1 deletion
````diff
@@ -23,7 +23,7 @@ RUN pip install --no-cache-dir --upgrade pip && \
 
 COPY ./visualqna.py /home/user/visualqna.py
 
-ENV PYTHONPATH=$PYTHONPATH:/home/user/GenAIComps
+ENV PYTHONPATH=/home/user/GenAIComps
 
 USER user
 
````
VisualQnA/docker_compose/intel/cpu/xeon/README.md

Lines changed: 10 additions & 5 deletions
````diff
@@ -41,10 +41,12 @@ git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
 ```
 
-### 1. Build LVM Image
+### 1. Build LVM and NGINX Docker Images
 
 ```bash
 docker build --no-cache -t opea/lvm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/lvms/tgi-llava/Dockerfile .
+
+docker build --no-cache -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
 ```
 
 ### 2. Build MegaService Docker Image
````
````diff
@@ -55,7 +57,7 @@ To construct the Mega Service, we utilize the [GenAIComps](https://github.com/op
 git clone https://github.com/opea-project/GenAIExamples.git
 cd GenAIExamples/VisualQnA
 docker build --no-cache -t opea/visualqna:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
-cd ../../..
+cd ../..
 ```
 
 ### 3. Build UI Docker Image
````
````diff
@@ -65,7 +67,7 @@ Build frontend Docker image via below command:
 ```bash
 cd GenAIExamples/VisualQnA/ui
 docker build --no-cache -t opea/visualqna-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f docker/Dockerfile .
-cd ../../../..
+cd ../../..
 ```
 
 ### 4. Pull TGI Xeon Image
````
````diff
@@ -74,12 +76,13 @@ cd ../../../..
 docker pull ghcr.io/huggingface/text-generation-inference:sha-e4201f4-intel-cpu
 ```
 
-Then run the command `docker images`, you will have the following 4 Docker Images:
+Then run the command `docker images`, you will have the following 5 Docker Images:
 
 1. `ghcr.io/huggingface/text-generation-inference:sha-e4201f4-intel-cpu`
 2. `opea/lvm-tgi:latest`
 3. `opea/visualqna:latest`
 4. `opea/visualqna-ui:latest`
+5. `opea/nginx`
 
 ## 🚀 Start Microservices
 
````
````diff
@@ -98,7 +101,7 @@ export host_ip="External_Public_IP"
 **Append the value of the public IP address to the no_proxy list**
 
 ```
-export your_no_proxy=${your_no_proxy},"External_Public_IP"
+export your_no_proxy="${your_no_proxy},${host_ip}"
 ```
 
 ```bash
````
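The corrected `no_proxy` export reuses the `host_ip` variable set in the preceding step instead of repeating a literal IP placeholder, and quotes the whole value so the comma-joined list stays intact. A small sketch with made-up values:

```shell
# illustrative values only; in the README, host_ip comes from the earlier
# `export host_ip="External_Public_IP"` step
host_ip="192.168.1.10"
your_no_proxy="localhost,127.0.0.1"
# the corrected form: append host_ip to the existing list, fully quoted
export your_no_proxy="${your_no_proxy},${host_ip}"
echo "$your_no_proxy"   # localhost,127.0.0.1,192.168.1.10
```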
````diff
@@ -131,6 +134,8 @@ docker compose -f compose.yaml up -d
 
 Follow the instructions to validate MicroServices.
 
+> Note: If you see an "Internal Server Error" from the `curl` command, wait a few minutes for the microserver to be ready and then try again.
+
 1. LLM Microservice
 
 ```bash
````
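The note added to both READMEs suggests waiting and retrying the `curl` validation while the services warm up. One way to script that is a small retry helper (hypothetical, not part of the repo) that re-runs a command until it succeeds or the attempt budget is spent:

```shell
# hypothetical helper, not part of the VisualQnA repo: re-run a command
# until it exits 0, sleeping between attempts; fail after N tries
retry() {
  attempts="$1"; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    [ "$i" -ge "$attempts" ] && return 1
    sleep 1
  done
}
# usage sketch (endpoint/port are illustrative, adjust to your deployment):
#   retry 10 curl -sf "http://${host_ip}:8888/v1/visualqna" ...
```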

VisualQnA/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 12 additions & 7 deletions
````diff
@@ -13,10 +13,12 @@ git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
 ```
 
-### 2. Build LLM Image
+### 2. Build LVM and NGINX Docker Images
 
 ```bash
 docker build --no-cache -t opea/lvm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/lvms/tgi-llava/Dockerfile .
+
+docker build --no-cache -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
 ```
 
 ### 3. Pull TGI Gaudi Image
````
````diff
@@ -31,27 +33,28 @@ To construct the Mega Service, we utilize the [GenAIComps](https://github.com/op
 
 ```bash
 git clone https://github.com/opea-project/GenAIExamples.git
-cd GenAIExamples/VisualQnA/docker
+cd GenAIExamples/VisualQnA
 docker build --no-cache -t opea/visualqna:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
-cd ../../..
+cd ../..
 ```
 
 ### 5. Build UI Docker Image
 
 Build frontend Docker image via below command:
 
 ```bash
-cd GenAIExamples/VisualQnA//
+cd GenAIExamples/VisualQnA/ui
 docker build --no-cache -t opea/visualqna-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile .
-cd ../../../..
+cd ../../..
 ```
 
-Then run the command `docker images`, you will have the following 4 Docker Images:
+Then run the command `docker images`, you will have the following 5 Docker Images:
 
-1. `opea/llava-tgi:latest`
+1. `ghcr.io/huggingface/tgi-gaudi:2.0.4`
 2. `opea/lvm-tgi:latest`
 3. `opea/visualqna:latest`
 4. `opea/visualqna-ui:latest`
+5. `opea/nginx`
 
 ## 🚀 Start MicroServices and MegaService
 
````
````diff
@@ -89,6 +92,8 @@ docker compose -f compose.yaml up -d
 
 Follow the instructions to validate MicroServices.
 
+> Note: If you see an "Internal Server Error" from the `curl` command, wait a few minutes for the microserver to be ready and then try again.
+
 1. LLM Microservice
 
 ```bash
````
