@@ -41,10 +41,12 @@ git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
 ```
 
-### 1. Build LVM Image
+### 1. Build LVM and NGINX Docker Images
 
 ```bash
 docker build --no-cache -t opea/lvm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/lvms/tgi-llava/Dockerfile .
+
+docker build --no-cache -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
 ```
 
 ### 2. Build MegaService Docker Image
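The build commands above forward `http_proxy`/`https_proxy` unconditionally. A minimal sketch, assuming the usual proxy environment variables (the proxy value below is a placeholder, not a real endpoint), of forwarding only the proxies that are actually set:

```bash
# Placeholder values; substitute your own proxy settings or leave them unset.
http_proxy="http://proxy.example:3128"
https_proxy=""

# Assemble --build-arg flags only for proxies that are non-empty.
args=""
[ -n "$http_proxy" ] && args="$args --build-arg http_proxy=$http_proxy"
[ -n "$https_proxy" ] && args="$args --build-arg https_proxy=$https_proxy"

# Print the command rather than running it, so the sketch is side-effect free.
echo "docker build --no-cache -t opea/lvm-tgi:latest$args -f comps/lvms/tgi-llava/Dockerfile ."
```

On hosts with no proxy configured, this simply emits the plain `docker build` command with no `--build-arg` flags.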
@@ -55,7 +57,7 @@ To construct the Mega Service, we utilize the [GenAIComps](https://github.com/op
 git clone https://github.com/opea-project/GenAIExamples.git
 cd GenAIExamples/VisualQnA
 docker build --no-cache -t opea/visualqna:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
-cd ../../..
+cd ../..
 ```
 
 ### 3. Build UI Docker Image
@@ -65,7 +67,7 @@ Build frontend Docker image via below command:
 ```bash
 cd GenAIExamples/VisualQnA/ui
 docker build --no-cache -t opea/visualqna-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f docker/Dockerfile .
-cd ../../../..
+cd ../../..
 ```
 
 ### 4. Pull TGI Xeon Image
@@ -74,12 +76,13 @@ cd ../../../..
 docker pull ghcr.io/huggingface/text-generation-inference:sha-e4201f4-intel-cpu
 ```
 
-Then run the command `docker images`, you will have the following 4 Docker Images:
+Then run the command `docker images`; you should see the following 5 Docker images:
 
 1. `ghcr.io/huggingface/text-generation-inference:sha-e4201f4-intel-cpu`
 2. `opea/lvm-tgi:latest`
 3. `opea/visualqna:latest`
 4. `opea/visualqna-ui:latest`
+5. `opea/nginx:latest`
 
 ## 🚀 Start Microservices
@@ -98,7 +101,7 @@ export host_ip="External_Public_IP"
 **Append the value of the public IP address to the no_proxy list**
 
 ```
-export your_no_proxy=${your_no_proxy},"External_Public_IP"
+export your_no_proxy="${your_no_proxy},${host_ip}"
 ```
 
 ```bash
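The `no_proxy` export in the hunk above is plain string concatenation, so it can be checked in isolation. A minimal sketch with placeholder values (`192.0.2.10` is a documentation-only address, not your real external IP):

```bash
# Placeholder values; substitute your real proxy exclusion list and external IP.
your_no_proxy="localhost,127.0.0.1"
host_ip="192.0.2.10"

# Append the host IP to the existing comma-separated no_proxy list.
your_no_proxy="${your_no_proxy},${host_ip}"
echo "$your_no_proxy"   # localhost,127.0.0.1,192.0.2.10
```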
@@ -131,6 +134,8 @@ docker compose -f compose.yaml up -d
 
 Follow the instructions to validate MicroServices.
 
+> Note: If you see an "Internal Server Error" from the `curl` command, wait a few minutes for the microservice to be ready and then try again.
+
 1. LLM Microservice
 
 ```bash
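The added note says to wait and retry manually. A hedged helper sketch (the function name and the example endpoint are illustrative, not part of the repo) that retries a command until it succeeds or gives up:

```bash
# Retry a command until it succeeds or the attempt budget is exhausted.
# Usage: wait_ready <attempts> <command> [args...]
wait_ready() {
  attempts=$1; shift
  i=1
  while [ "$i" -le "$attempts" ]; do
    # Run the command quietly; return as soon as it succeeds.
    "$@" >/dev/null 2>&1 && return 0
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Hypothetical usage against one of the validation curls, e.g.:
#   wait_ready 30 curl -sf http://${host_ip}:9399/v1/lvm
wait_ready 3 true && echo "ready"
```

`curl -f` makes curl exit non-zero on HTTP errors such as an Internal Server Error, so the loop keeps retrying until the service actually responds.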