Commit 0c6b044

Update FaqGen README.md for its workflow (#910)
Signed-off-by: Tsai, Louie <louie.tsai@intel.com>
1 parent d23cd79 commit 0c6b044

File tree

1 file changed: +49, -0 lines changed
FaqGen/README.md

Lines changed: 49 additions & 0 deletions
@@ -4,6 +4,55 @@ In today's data-driven world, organizations across various industries face the c
Our FAQ Generation Application leverages the power of large language models (LLMs) to revolutionize the way you interact with and comprehend complex textual data. By harnessing cutting-edge natural language processing techniques, our application can automatically generate comprehensive and natural-sounding frequently asked questions (FAQs) from your documents, legal texts, customer queries, and other sources. In this example use case, we utilize LangChain to implement FAQ Generation and facilitate LLM inference using Text Generation Inference on Intel Xeon and Gaudi2 processors.

The FaqGen example is implemented using the component-level microservices defined in [GenAIComps](https://github.com/opea-project/GenAIComps). The flow chart below shows the information flow between different microservices for this example.

```mermaid
---
config:
  flowchart:
    nodeSpacing: 400
    rankSpacing: 100
    curve: linear
  themeVariables:
    fontSize: 50px
---
flowchart LR
    %% Colors %%
    classDef blue fill:#ADD8E6,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
    classDef orange fill:#FBAA60,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
    classDef orchid fill:#C26DBC,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
    classDef invisible fill:transparent,stroke:transparent;
    style FaqGen-MegaService stroke:#000000

    %% Subgraphs %%
    subgraph FaqGen-MegaService["FaqGen MegaService "]
        direction LR
        LLM([LLM MicroService]):::blue
    end
    subgraph UserInterface[" User Interface "]
        direction LR
        a([User Input Query]):::orchid
        UI([UI server<br>]):::orchid
    end

    LLM_gen{{LLM Service <br>}}
    GW([FaqGen GateWay<br>]):::orange

    %% Questions interaction
    direction LR
    a[User Input Query] --> UI
    UI --> GW
    GW <==> FaqGen-MegaService

    %% Embedding service flow
    direction LR
    LLM <-.-> LLM_gen
```

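To make the flow above concrete, the sketch below shows one way a client could call the FaqGen gateway once the services are running. The host, port (8888), route (/v1/faqgen), and payload shape are assumptions based on typical OPEA example deployments rather than details taken from this commit; consult the deployment guides for the authoritative values.

```bash
# Minimal sketch (assumed endpoint): submit a document to the FaqGen gateway
# and receive generated FAQs. Adjust host, port, and route to your deployment.
host_ip=$(hostname -I | awk '{print $1}')
curl -X POST "http://${host_ip}:8888/v1/faqgen" \
  -H "Content-Type: application/json" \
  -d '{"messages": "Paste the source text you want FAQs generated from here."}'
```

In terms of the diagram, the gateway forwards the request to the FaqGen MegaService, which calls the LLM MicroService backed by the LLM serving endpoint.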

## Deploy FAQ Generation Service

The FAQ Generation service can be deployed on either Intel Gaudi2 or Intel Xeon Scalable Processors.
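As a rough sketch of what that deployment can look like with Docker Compose (the compose directory below is an assumption and may differ between releases; follow the hardware-specific guides for the authoritative steps):

```bash
# Hypothetical deployment sketch; the compose directory is an assumption.
git clone https://github.com/opea-project/GenAIExamples.git
cd GenAIExamples/FaqGen/docker/gaudi   # use the xeon directory for Intel Xeon
docker compose up -d                   # start the FaqGen microservices in the background
```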
