Commit b0f7c9c

Support Chinese for Docsum (#960)
Signed-off-by: Xinyao Wang <xinyao.wang@intel.com>
1 parent eeced9b, commit b0f7c9c

File tree: 3 files changed (+9, -3 lines changed)

DocSum/README.md

Lines changed: 7 additions & 1 deletion
@@ -158,9 +158,15 @@ Two ways of consuming Document Summarization Service:
 1. Use cURL command on terminal
 
 ```bash
+#Use English mode (default).
 curl http://${host_ip}:8888/v1/docsum \
   -H "Content-Type: application/json" \
-  -d '{"messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."}'
+  -d '{"messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5.","max_tokens":32, "language":"en", "stream":false}'
+
+#Use Chinese mode.
+curl http://${host_ip}:8888/v1/docsum \
+  -H "Content-Type: application/json" \
+  -d '{"messages": "2024年9月26日,北京——今日,英特尔正式发布英特尔® 至强® 6性能核处理器(代号Granite Rapids),为AI、数据分析、科学计算等计算密集型业务提供卓越性能。","max_tokens":32, "language":"zh", "stream":false}'
 ```
 
 2. Access via frontend
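
For quick experimentation with the fields introduced here, the request body can also be built from a file instead of an inline string. The sketch below is illustrative only and is not part of this commit: it assumes `jq` is installed, `doc.txt`, `LANGUAGE`, and `MAX_TOKENS` are placeholder names, and the payload shape simply mirrors the examples in the diff above.

```bash
# Minimal sketch (not from this commit): build the DocSum payload with jq so
# quotes and newlines in the source document are escaped correctly, then POST it.
LANGUAGE=zh        # "en" (default) or "zh", as added by this change
MAX_TOKENS=32

jq -n --rawfile doc doc.txt \
   --arg lang "$LANGUAGE" \
   --argjson max_tokens "$MAX_TOKENS" \
   '{messages: $doc, max_tokens: $max_tokens, language: $lang, stream: false}' \
  | curl http://${host_ip}:8888/v1/docsum \
      -H "Content-Type: application/json" \
      -d @-
```

Piping through `jq` is only a convenience; the plain single-quoted `-d` payloads shown in the README work just as well for short inputs.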

DocSum/docker_compose/intel/cpu/xeon/README.md

Lines changed: 1 addition & 1 deletion
@@ -124,7 +124,7 @@ docker compose up -d
 
 ```bash
 curl http://${host_ip}:8888/v1/docsum -H "Content-Type: application/json" -d '{
-  "messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."
+  "messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5.","max_tokens":32, "language":"en", "stream":false
 }'
 ```

DocSum/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 1 addition & 1 deletion
@@ -115,7 +115,7 @@ docker compose up -d
 
 ```bash
 curl http://${host_ip}:8888/v1/docsum -H "Content-Type: application/json" -d '{
-  "messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."
+  "messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5.","max_tokens":32, "language":"en", "stream":false
 }'
 ```
