
[Docs] - Basic prompting #1712

Merged 12 commits on Apr 26, 2024
75 changes: 75 additions & 0 deletions docs/docs/getting-started/basic-prompting.mdx
@@ -0,0 +1,75 @@
import ThemedImage from "@theme/ThemedImage";
import useBaseUrl from "@docusaurus/useBaseUrl";
import ZoomableImage from "/src/theme/ZoomableImage.js";
import ReactPlayer from "react-player";

# Basic prompting

Prompts are the inputs given to a large language model. They are the interface between human instructions and the tasks the model performs.

By submitting natural language requests to an LLM in a prompt, you can answer questions, generate text, and solve problems.

This article will show you how to use Langflow's prompt tools to submit basic prompts to an LLM, and how different prompting strategies can change your results.
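
For context, this is roughly what a basic prompt looks like when sent to an LLM directly from code: a system message sets the model's behavior and a user message carries your request. This is a minimal sketch, assuming the OpenAI Python SDK (version 1 or later) is installed and `OPENAI_API_KEY` is set in your environment; the model name is illustrative.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A prompt is natural-language input passed to the model.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what a prompt is in one sentence."},
    ],
)
print(response.choices[0].message.content)
```

Langflow lets you build the same kind of request visually, component by component, which is what the rest of this article walks through.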

## Prerequisites

1. Install Langflow.
```bash
pip install langflow
```

2. Start a local Langflow instance with the Langflow CLI:
```bash
langflow run
```
Or start Langflow with Python:
```bash
python -m langflow run
```

Result:
```
│ Welcome to ⛓ Langflow │
│ │
│ Access http://127.0.0.1:7860 │
│ Collaborate, and contribute at our GitHub Repo 🚀 │
```

Alternatively, visit us on [HuggingFace Spaces](https://docs.langflow.org/getting-started/hugging-face-spaces) or [Lightning.ai Studio](https://lightning.ai/ogabrielluiz-8j6t8/studios/langflow) for a pre-built Langflow test environment.

## Create components

For this example, you'll build an OpenAI chat flow with five components, and then experiment with its prompt templates to see how the results change.

<ZoomableImage
  alt="Basic prompting flow in Langflow"
  sources={{
    light: "img/basic-prompting.png",
    dark: "img/basic-prompting.png",
  }}
  style={{ width: "80%", margin: "20px auto" }}
/>

1. Create a **ChatOpenAI** component.

> **Reviewer comment:** Again, I'm confused about what type of topic this is. In the prerequisites you tell people how to install and launch the product, but here you give vague instructions like "Create an LLMChain component". Would a user who just installed this product know how to create a certain type of component?

2. In the OpenAI API Key field, paste your OpenAI API Key (`sk-...`).
3. Create an **LLMChain** component. Connect its LLM input to the ChatOpenAI component's output.
4. Create a **ChatPromptTemplate** component. Connect its output to the LLMChain component's Prompt input.
5. Create a **SystemMessagePromptTemplate** component. This represents a system message, which tells the model how to behave. The Prompt field can keep its default value. Connect its output to the input of **ChatPromptTemplate**.
6. Create a **HumanMessagePromptTemplate** component. This represents a message from the user. In the Prompt field, enter `{text}`. Connect its output to the input of **ChatPromptTemplate**.
7. Select the Run icon. Langflow checks your components for errors and returns "Flow is Ready to Run".
8. Select the Messages icon. A chat window will open to run your prompt.
Chat with the bot to see how it responds according to the behavior described in the Prompt field.
9. Change the behavior in the Prompt field of **SystemMessagePromptTemplate** and see what happens. For example, tell it to be an unhelpful, grumpy assistant and see how the results change. A code sketch of the equivalent chain follows these steps.
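
The flow you just built mirrors the LangChain chain that these component names come from. The following sketch is one way to express the same pipeline in Python, assuming a classic LangChain release where `ChatOpenAI` and `LLMChain` are importable from the top-level `langchain` package (newer releases move `ChatOpenAI` into `langchain_openai`); the example prompt text is illustrative.

```python
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

# System message: tells the model how to behave (step 5).
system_message = SystemMessagePromptTemplate.from_template("You are a helpful assistant.")

# Human message: carries the user's input through the {text} variable (step 6).
human_message = HumanMessagePromptTemplate.from_template("{text}")

# ChatPromptTemplate combines the two messages (step 4).
chat_prompt = ChatPromptTemplate.from_messages([system_message, human_message])

# LLMChain connects the prompt to the ChatOpenAI model (steps 1-3).
chain = LLMChain(llm=ChatOpenAI(openai_api_key="sk-..."), prompt=chat_prompt)

print(chain.run(text="What can I build with Langflow?"))
```

Changing the system message here has the same effect as editing the Prompt field of **SystemMessagePromptTemplate** in step 9.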

## Other prompts

Langflow also has **PromptTemplate** and **ChatMessagePromptTemplate** components.

Connect **PromptTemplate** to the **LLMChain** component's Prompt input to use it as a one-shot prompt.
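
As a rough sketch of the same idea in code (again assuming the classic LangChain API), a **PromptTemplate** is a single formatted string with no chat roles; the template text here is only an example:

```python
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

# One-shot prompt: a single templated string, no system/human split.
prompt = PromptTemplate.from_template(
    "Write a one-line tagline for a product called {product}."
)
chain = LLMChain(llm=ChatOpenAI(openai_api_key="sk-..."), prompt=prompt)
print(chain.run(product="Langflow"))
```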

**ChatMessagePromptTemplate** has a `role` field that can be defined as `system`, `user`, `function`, or `assistant`, replacing the more specific template components you used in the example.
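
Here is a sketch of how the generic template can stand in for the role-specific ones, again assuming the classic LangChain API; the role values and prompt text are examples:

```python
from langchain.prompts.chat import ChatMessagePromptTemplate, ChatPromptTemplate

# One generic template per role instead of the System/Human-specific classes.
system_message = ChatMessagePromptTemplate.from_template(
    "You are a helpful assistant.", role="system"
)
user_message = ChatMessagePromptTemplate.from_template("{text}", role="user")

chat_prompt = ChatPromptTemplate.from_messages([system_message, user_message])
print(chat_prompt.format_messages(text="Hello!"))
```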





71 changes: 71 additions & 0 deletions docs/docs/guides/basic-prompting.mdx
Binary file added docs/static/img/basic-prompting.png