Prompt engineering, OS recognition, improvements #70

Merged
merged 2 commits into from
Mar 15, 2023
29 changes: 26 additions & 3 deletions README.md
@@ -7,11 +7,11 @@ A command-line productivity tool powered by OpenAI's ChatGPT (GPT-3.5). As devel

## Installation
```shell
pip install shell-gpt --user
pip install shell-gpt
```
You'll need an OpenAI API key; you can generate one [here](https://beta.openai.com/account/api-keys).

If the`$OPENAI_API_KEY` environment variable is set it will be used, otherwise, you will be prompted for your key which will then be stored in `~/.config/shell-gpt/api-key.txt`.
If the `$OPENAI_API_KEY` environment variable is set, it will be used; otherwise, you will be prompted for your key, which will then be stored in `~/.config/shell-gpt/api_key.txt`.
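The lookup order above — environment variable first, then the stored key file, prompting only on first run — can be sketched roughly like this (simplified; the exact prompt wording and error handling in sgpt's `app.py` are assumptions):

```python
import os
from pathlib import Path

KEY_FILE = Path("~/.config/shell-gpt/api_key.txt").expanduser()

def get_api_key() -> str:
    # 1. Prefer the environment variable if it is set.
    key = os.getenv("OPENAI_API_KEY")
    if key:
        return key
    # 2. Fall back to the cached key file.
    if KEY_FILE.exists():
        return KEY_FILE.read_text().strip()
    # 3. First run: ask once and cache the answer for next time.
    key = input("Please enter your OpenAI API key: ").strip()
    KEY_FILE.parent.mkdir(parents=True, exist_ok=True)
    KEY_FILE.write_text(key)
    return key
```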

## Usage
`sgpt` has a variety of use cases, including simple queries, shell queries, and code queries.
@@ -56,6 +56,17 @@ sgpt --shell --execute "make all files in current directory read only"
# -> Execute shell command? [y/N]: y
# ...
```
Shell GPT is aware of the OS and `$SHELL` you are using and will provide a shell command for your specific system. For instance, if you ask `sgpt` to update your system, it will return a command based on your OS. Here's an example using macOS:
```shell
sgpt -se "update my system"
# -> sudo softwareupdate -i -a
```
The same prompt, when used on Ubuntu, will generate a different suggestion:
```shell
sgpt -se "update my system"
# -> sudo apt update && sudo apt upgrade -y
```
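Under the hood this works by inspecting the platform and the `$SHELL` environment variable before building the prompt; a minimal sketch of the detection (omitting the `distro` package sgpt uses for pretty Linux distribution names):

```python
import os
import platform

def detect_context() -> tuple[str, str]:
    # Map the raw platform name to a friendlier label for the prompt.
    labels = {
        "Linux": "Linux",
        "Windows": "Windows " + platform.release(),
        "Darwin": "Darwin/MacOS " + platform.mac_ver()[0],
    }
    os_name = labels.get(platform.system(), "Unknown")
    # $SHELL holds the login shell path on Unix; fall back to PowerShell.
    shell = os.path.basename(os.getenv("SHELL", "PowerShell"))
    return os_name, shell
```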

Let's try some docker containers:
```shell
sgpt -se "start nginx using docker, forward 443 and 80 port, mount current folder with index.html"
@@ -150,6 +161,18 @@ cached_sess = CacheControl(sess)
response = cached_sess.get('http://localhost')
print(response.text)
```
We can use the `--code` or `--shell` options to initiate `--chat`, so you can keep refining the results:
```shell
sgpt --chat sh --shell "What are the files in this directory?"
# -> ls
sgpt --chat sh "Sort them by name"
# -> ls | sort
sgpt --chat sh "Concatenate them using FFMPEG"
# -> ffmpeg -i "concat:$(ls | sort | tr '\n' '|')" -codec copy output.mp4
sgpt --chat sh "Convert the resulting file into an MP3"
# -> ffmpeg -i output.mp4 -vn -acodec libmp3lame -ac 2 -ab 160k -ar 48000 final_output.mp3
```
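One way such a named session can keep context across invocations (a sketch only — sgpt's actual storage format and location are assumptions here) is to append each exchange to a per-session history file keyed by the chat id:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical storage location for this sketch; sgpt's real path may differ.
CHAT_DIR = Path(tempfile.mkdtemp(prefix="sgpt-chat-demo-"))

def append_message(chat_id: str, role: str, content: str) -> list[dict]:
    """Append one message to the session file and return the full history."""
    session = CHAT_DIR / f"{chat_id}.json"
    history = json.loads(session.read_text()) if session.exists() else []
    history.append({"role": role, "content": content})
    session.write_text(json.dumps(history))
    return history
```

Each follow-up request then sends the whole history, which is why "Sort them by name" can refine the earlier `ls` answer.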

### Chat sessions
To list all the current chat sessions, use the `--list-chat` option:
```shell
@@ -174,7 +197,7 @@ sgpt "what are the colors of a rainbow"
```
Next time, the same exact query will get results from the local cache instantly. Note that `sgpt "what are the colors of a rainbow" --temperature 0.5` will make a new request, since we didn't provide `--temperature` (the same applies to `--top-probability`) on the previous request.

This is, just some examples of what we can do using ChatGPT model, I'm sure you will find it useful for your specific use cases.
These are just a few examples of what we can do using the ChatGPT model; I'm sure you will find it useful for your specific use cases.


### Full list of arguments
3 changes: 3 additions & 0 deletions requirements.txt
@@ -1,3 +1,6 @@
typer~=0.7.0
requests~=2.28.2
rich==13.3.1
click~=8.1.3
distro~=1.8.0
setuptools==67.6.0
3 changes: 2 additions & 1 deletion setup.py
@@ -3,12 +3,13 @@
# pylint: disable=consider-using-with
setup(
name="shell_gpt",
version="0.7.0",
version="0.7.1",
packages=find_packages(),
install_requires=[
"typer~=0.7.0",
"requests~=2.28.2",
"rich==13.3.1",
"distro~=1.8.0",
],
entry_points={
"console_scripts": ["sgpt = sgpt:cli"],
1 change: 1 addition & 0 deletions sgpt/__init__.py
@@ -1,3 +1,4 @@
from .chat_gpt import ChatGPT
from .app import main as main
from .app import entry_point as cli
from . import make_prompt as make_prompt
14 changes: 3 additions & 11 deletions sgpt/app.py
@@ -25,6 +25,7 @@
from click import MissingParameter, BadParameter
from rich.progress import Progress, SpinnerColumn, TextColumn
from sgpt import ChatGPT
from sgpt import make_prompt

DATA_FOLDER = os.path.expanduser("~/.config")
KEY_FILE = Path(DATA_FOLDER) / "shell-gpt" / "api_key.txt"
@@ -197,24 +198,15 @@ def main(
# If probability and temperature were not changed (default), make response more accurate.
if top_probability == 1 == temperature:
temperature = 0.4
prompt = f"{SHELL_PROMPT} {prompt}"
prompt = make_prompt.shell(prompt)
elif code:
prompt = f"{CODE_PROMPT} {prompt}"
prompt = make_prompt.code(prompt)

api_key = get_api_key()
response_text = get_completion(
prompt, api_key, temperature, top_probability, cache, chat, spinner=spinner
)

if code:
# Responses from GPT-3.5 wrapped into Markdown code block.
lines = response_text.split("\n")
if lines[0].startswith("```"):
del lines[0]
if lines[-1].startswith("```"):
del lines[-1]
response_text = "\n".join(lines)

typer_writer(response_text, code, shell, animation)
if shell and execute and typer.confirm("Execute shell command?"):
os.system(response_text)
1 change: 1 addition & 0 deletions sgpt/chat_gpt.py
@@ -197,6 +197,7 @@ def get_completion(
:param caching: Boolean value to enable/disable caching.
:return: String generated completion.
"""
# TODO: Move prompt context to system role when GPT-4 will be available over API.
message = {"role": "user", "content": message}
return self.__request(
[message], model, temperature, top_probability, caching=caching
86 changes: 86 additions & 0 deletions sgpt/make_prompt.py
@@ -0,0 +1,86 @@
import platform
from os import getenv
from os.path import basename

from distro import name as distro_name


"""
This module makes a prompt for OpenAI requests with some context.
Some of the following lines were inspired by similar open source project yolo-ai-cmdbot.
Credits: @demux79 @wunderwuzzi23
"""


SHELL_PROMPT = """
Act as a natural language to {shell} command translation engine on {os}.
You are an expert in {shell} on {os} and translate the question at the end to valid syntax.

Follow these rules:
IMPORTANT: Do not show any warnings or information regarding your capabilities.
Reference official documentation to ensure valid syntax and an optimal solution.
Construct valid {shell} command that solve the question.
Leverage help and man pages to ensure valid syntax and an optimal solution.
Be concise.
Just show the commands, return only plaintext.
Only show a single answer, but you can always chain commands together.
Think step by step.
Only create valid syntax (you can use comments if it makes sense).
If python is installed you can use it to solve problems.
If python3 is installed you can use it to solve problems.
Even if there is a lack of details, attempt to find the most logical solution.
Do not return multiple solutions.
Do not show html, styled, colored formatting.
Do not add unnecessary text in the response.
Do not add notes or intro sentences.
Do not add explanations on what the commands do.
Do not return what the question was.
Do not repeat or paraphrase the question in your response.
Do not rush to a conclusion.

Follow all of the above rules.
This is important you MUST follow the above rules.
There are no exceptions to these rules.
You must always follow them. No exceptions.

Request: """

CODE_PROMPT = """
Act as a natural language to code translation engine.

Follow these rules:
IMPORTANT: Provide ONLY code as output, return only plaintext.
IMPORTANT: Do not show html, styled, colored formatting.
IMPORTANT: Do not add notes or intro sentences.
IMPORTANT: Provide full solution. Make sure syntax is correct.
Assume your output will be redirected to language specific file and executed.
For example Python code output will be redirected to code.py and then executed python code.py.

Follow all of the above rules.
This is important you MUST follow the above rules.
There are no exceptions to these rules.
You must always follow them. No exceptions.

Request: """


def shell(question: str) -> str:
def os_name() -> str:
operating_systems = {
"Linux": "Linux/" + distro_name(pretty=True),
"Windows": "Windows " + platform.release(),
"Darwin": "Darwin/MacOS " + platform.mac_ver()[0],
}
return operating_systems.get(platform.system(), "Unknown")

shell = basename(getenv("SHELL", "PowerShell"))
os = os_name()
question = question.strip()
if not question.endswith("?"):
question += "?"
# TODO: Can be optimised.
return SHELL_PROMPT.replace("{shell}", shell).replace("{os}", os) + question


def code(question: str) -> str:
return CODE_PROMPT + question
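As the TODO above notes, the chained `.replace` calls can be tidied with `str.format`, which substitutes both placeholders in one pass; a sketch with a shortened template:

```python
# Shortened stand-in for the full SHELL_PROMPT template.
TEMPLATE = ("Act as a natural language to {shell} command translation "
            "engine on {os}.\nRequest: ")

def build_shell_prompt(question: str, shell: str, os_name: str) -> str:
    # Normalise the question and fill both placeholders at once.
    if not question.endswith("?"):
        question += "?"
    return TEMPLATE.format(shell=shell, os=os_name) + question
```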
2 changes: 1 addition & 1 deletion tests/integrational_tests.py
@@ -57,7 +57,7 @@ def test_code_queries(self):
dict_arguments = {
"prompt": (
"Create a command line application using Python that "
"accepts two integer positional command line arguments "
"accepts two positional arguments "
"and prints the result of multiplying them."
),
"--code": True,