import ThemedImage from "@theme/ThemedImage";
import useBaseUrl from "@docusaurus/useBaseUrl";
import ZoomableImage from "/src/theme/ZoomableImage.js";
import ReactPlayer from "react-player";

# Basic prompting

Prompts serve as the inputs to a large language model (LLM), acting as the interface between human instructions and computational tasks.

By submitting natural language requests in a prompt to an LLM, you can obtain answers, generate text, and solve problems.

This article demonstrates how to use Langflow's prompt tools to issue basic prompts to an LLM, and how various prompting strategies can affect your outcomes.

## Prerequisites

1. Install Langflow.

   ```bash
   python -m pip install langflow --pre
   ```

2. Start a local Langflow instance with the Langflow CLI:

   ```bash
   langflow run
   ```

   Or start Langflow with Python:

   ```bash
   python -m langflow run
   ```

   Result:

   ```
   │ Welcome to ⛓ Langflow │
   │ │
   │ Access http://127.0.0.1:7860 │
   │ Collaborate, and contribute at our GitHub Repo 🚀 │
   ```

   Alternatively, go to [HuggingFace Spaces](https://docs.langflow.org/getting-started/hugging-face-spaces) or [Lightning.ai Studio](https://lightning.ai/ogabrielluiz-8j6t8/studios/langflow) for a pre-built Langflow test environment.

3. Create an [OpenAI API key](https://platform.openai.com).
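
OpenAI API keys are secret strings that begin with the `sk-` prefix. If you want a quick sanity check before pasting a key into Langflow, a minimal sketch (illustrative only, not part of Langflow or the OpenAI SDK) might look like this:

```python
def looks_like_openai_key(key: str) -> bool:
    """Rough shape check for an OpenAI secret key: non-empty and `sk-`-prefixed.

    This only catches copy-paste mistakes (truncation, stray whitespace);
    it does not verify the key against the OpenAI API.
    """
    key = key.strip()
    return key.startswith("sk-") and len(key) > len("sk-")
```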

## Create the basic prompting project

1. From the Langflow dashboard, click **New Project**.
2. Select **Basic Prompting**.
3. The **Basic Prompting** flow is created.

<ZoomableImage
  alt="Basic Prompting flow"
  sources={{
    light: "img/basic-prompting.png",
    dark: "img/basic-prompting.png",
  }}
  style={{ width: "80%", margin: "20px auto" }}
/>

This flow allows you to chat with the **OpenAI** component via a **Prompt** component.
Examine the **Prompt** component. The **Template** field instructs the LLM to `Answer the user as if you were a pirate.`
This should be interesting...

4. To create an environment variable for the **OpenAI** component, in the **OpenAI API Key** field, click the **Globe** button, and then click **Add New Variable**.
   1. In the **Variable Name** field, enter `openai_api_key`.
   2. In the **Value** field, paste your OpenAI API key (`sk-...`).
   3. Click **Save Variable**.
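
Conceptually, the flow you just configured sends the **Prompt** component's template as a system-style instruction alongside your chat message, in the message format used by the OpenAI chat API. A minimal sketch of that payload (the `build_messages` helper is illustrative, not Langflow's internal API):

```python
def build_messages(template: str, user_message: str) -> list[dict]:
    # The template ("Answer the user as if you were a pirate.") steers the
    # model's behavior; the user message carries the actual question.
    return [
        {"role": "system", "content": template},
        {"role": "user", "content": user_message},
    ]

messages = build_messages(
    "Answer the user as if you were a pirate.",
    "What is Langflow?",
)
```

Every turn of the conversation is shaped by that system message, which is why the bot stays in character no matter what you ask.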

## Run the basic prompting flow

1. Click the **Run** button.

   The **Interaction Panel** opens, where you can converse with your bot.

2. Type a message and press Enter.

   The bot responds in a markedly piratical manner!

## Modify the prompt for a different result

1. To modify your prompt results, in the **Prompt** component, click the **Template** field.

   The **Edit Prompt** window opens.

2. Change `Answer the user as if you were a pirate` to a different character, perhaps `Answer the user as if you were Harold Abelson.`
3. Run the basic prompting flow again.

   The response will be markedly different.
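
The edit above swaps one persona inside an otherwise fixed instruction. You can think of the **Template** field as a string with a variable slot; a minimal sketch of the idea (the `{persona}` placeholder is illustrative, not the flow's actual template variable):

```python
TEMPLATE = "Answer the user as if you were {persona}."

def persona_instruction(persona: str) -> str:
    # Filling the slot yields the exact instruction text the LLM receives,
    # so changing only the persona changes the whole character of the replies.
    return TEMPLATE.format(persona=persona)
```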