
Commit 5ac77f7

fix faqgen on xeon test scripts (#552)

Signed-off-by: chensuyue <suyue.chen@intel.com>
1 parent ebc165a commit 5ac77f7

File tree

3 files changed: +4 -53 lines changed

.github/workflows/VisualQnA.yml

Lines changed: 0 additions & 50 deletions
This file was deleted.

FaqGen/docker/xeon/compose.yaml

Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ version: "3.8"
 services:
   tgi-service:
     image: ghcr.io/huggingface/text-generation-inference:1.4
-    container_name: tgi_xeon_server
+    container_name: tgi-xeon-server
     ports:
       - "8008:80"
     environment:
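Because the container name changes from tgi_xeon_server to tgi-xeon-server, anything that looks the container up by name must use the new hyphenated form. A minimal manual check, assuming the stack is started from FaqGen/docker/xeon (the commands below are illustrative, not part of the commit):

    cd FaqGen/docker/xeon
    docker compose up -d tgi-service
    # the TGI container should now appear under the hyphenated name
    docker ps --filter "name=tgi-xeon-server" --format "{{.Names}}: {{.Status}}"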

FaqGen/tests/test_faqgen_on_xeon.sh

Lines changed: 3 additions & 2 deletions

@@ -9,9 +9,10 @@ LOG_PATH="$WORKPATH/tests"
 ip_address=$(hostname -I | awk '{print $1}')
 
 function build_docker_images() {
-    cd $WORKPATH/../../
+    cd $WORKPATH
 
     git clone https://github.com/opea-project/GenAIComps.git
+    cd GenAIComps
 
     docker build --no-cache -t opea/llm-faqgen-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/faq-generation/tgi/Dockerfile .
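For readability, this is roughly how build_docker_images reads after the change, reconstructed from the hunk above (WORKPATH and the proxy variables are set earlier in the test script; the closing comment marks steps not shown in this diff):

    function build_docker_images() {
        cd $WORKPATH

        # clone GenAIComps into the workspace and build from inside the checkout,
        # so the -f path to the Dockerfile resolves against the cloned repo
        git clone https://github.com/opea-project/GenAIComps.git
        cd GenAIComps

        docker build --no-cache -t opea/llm-faqgen-tgi:latest \
            --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy \
            -f comps/llms/faq-generation/tgi/Dockerfile .
        # ... remaining build steps from the original script are unchanged
    }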

@@ -94,7 +95,7 @@ function validate_microservices() {
         "${ip_address}:8008/generate" \
         "generated_text" \
         "tgi-service" \
-        "tgi_xeon_server" \
+        "tgi-xeon-server" \
         '{"inputs":"What is Deep Learning?","parameters":{"max_new_tokens":17, "do_sample": true}}'
 
     # llm microservice
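The validation step above exercises the TGI endpoint exposed on port 8008 by the renamed tgi-xeon-server container. A manual equivalent of that request, sketched from the URL and JSON body in the test (ip_address is the host IP the script detects):

    curl http://${ip_address}:8008/generate \
        -X POST \
        -H 'Content-Type: application/json' \
        -d '{"inputs":"What is Deep Learning?","parameters":{"max_new_tokens":17, "do_sample": true}}'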
