Commit

[no ci] minor edits
amaiya committed Apr 1, 2023
1 parent baf426c · commit ed38b6e
Showing 1 changed file with 7 additions and 5 deletions.
examples/text/generative_ai_example.ipynb: 12 changes (7 additions & 5 deletions)
@@ -18,7 +18,7 @@
"\n",
"*ktrain* supports a Generative AI module that is currently based on an instruction-fine-tuned version of GPT-J. Think of it as a lightweight version of ChatGPT that can be run locally on your own machine. As a smaller model, it will not perform as well as GPT-4, ChatGPT, etc. However, since it does not communicate with external APIs like OpenAI, it can be used with non-public data.\n",
"\n",
- "The model requires a GPU with at least 16GB of GPU memory or VRAM. If you have less than this, you can use a CPU (provided it has at least 16GB of RAM), but output will be generated very slowly (depending on the number of CPU cores). We will use a CPU in this example, but you should supply `device=cuda` if you have a GPU with at least 16GB of GPU memory."
+ "The model requires a GPU with at least 16GB of GPU memory or VRAM. If you have less than this, you can use a CPU (provided it has at least 16GB of RAM), but output will be generated very slowly (depending on the number of CPU cores). We will use a CPU in this example, but you should either leave blank or explicitly supply `device=cuda` if you have a GPU with at least 16GB of GPU memory."
]
},
{
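As a side note on the device choice discussed in the cell above, here is a minimal sketch for picking the device automatically. It assumes PyTorch is installed (ktrain's generative module builds on Hugging Face transformers and PyTorch), and it does not check the 16GB memory requirement mentioned in the text:

import torch
from ktrain.text.generative_ai import GenerativeAI

# Use the GPU only when PyTorch can actually see one; otherwise fall back to CPU.
device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = GenerativeAI(device=device)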
@@ -28,14 +28,16 @@
"outputs": [],
"source": [
"from ktrain.text.generative_ai import GenerativeAI\n",
- "model = GenerativeAI(device='cpu') # use device='cuda' if you have a good GPU!"
+ "model = GenerativeAI(device='cpu') # leave blank or use device='cuda' if you have a good GPU!"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
- "Since this model is instruction-fine-tuned, you should supply prompts in the form of instructions of what you want the model to do for you."
+ "Since this model is instruction-fine-tuned, you should supply prompts in the form of instructions of what you want the model to do for you. \n",
+ "\n",
+ "**Tip**: Due to the way the model was trained, you should only use newlines to separate cohesive units of information fed to the model. This is illustrated through various examples below. If the model encounters a misplaced newline character it doesn't like, it may output nothing."
]
},
{
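For context, a minimal usage sketch of the instruction-style prompting described in the hunk above. The prompt text is illustrative; only `GenerativeAI`, the `device` argument, and `execute` (used later in the notebook) come from the source:

from ktrain.text.generative_ai import GenerativeAI

model = GenerativeAI(device='cpu')  # or device='cuda' with a GPU that has at least 16GB of memory

# Phrase the prompt as an instruction and keep it on a single line,
# since a misplaced newline can cause the model to output nothing.
prompt = "Write a short email inviting colleagues to a project kickoff meeting on Friday."
print(model.execute(prompt))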
@@ -226,7 +228,7 @@
"metadata": {},
"source": [
"#### Paraphrasing\n",
- "**Pro-Tip**: Do not embed any newlines in the text you're paraphrasing or it will confuse the model."
+ "**Pro-Tip**: Remove any embedded newlines in the text you're paraphrasing or it will confuse the model."
]
},
{
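A small illustration of the pro-tip above. The paraphrasing prompt wording and the sample text are assumptions for illustration; only the newline caveat and `execute` come from the notebook:

from ktrain.text.generative_ai import GenerativeAI

model = GenerativeAI(device='cpu')

text = ("Generative models can draft emails.\n"
        "They can also summarize documents.")

# Collapse embedded newlines into spaces before asking for a paraphrase,
# because stray newlines in the input can confuse the model.
flattened = " ".join(text.split())
print(model.execute(f"Paraphrase the following text: {flattened}"))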
@@ -415,7 +417,7 @@
"[Tweet]: \n",
"Startups should not worry about how to put out fires, they should worry about how to start them.\n",
"###\n",
- "[Keyword]: http://localhost:7999/notebooks/examples/text/generative_ai_example.ipynb#\n",
+ "[Keyword]:\n",
"climate change\n",
"[Tweet]:\"\"\"\n",
"print(model.execute(prompt))"
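The hunk above shows only the tail of the few-shot prompt from which the stray localhost link was removed. A hedged reconstruction of the overall pattern follows; the first [Keyword] value is a guess, and everything other than the startups tweet, the climate change keyword, and `model.execute` is illustrative:

from ktrain.text.generative_ai import GenerativeAI

model = GenerativeAI(device='cpu')

# Few-shot prompt: each example pairs a [Keyword] with a [Tweet], examples are
# separated by "###", and the final [Tweet]: is left empty for the model to complete.
prompt = """[Keyword]:
startups
[Tweet]:
Startups should not worry about how to put out fires, they should worry about how to start them.
###
[Keyword]:
climate change
[Tweet]:"""
print(model.execute(prompt))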
