Support gpt-3.5-turbo (#23)
Co-authored-by: Fredrick R. Brennan <copypaste@kittens.ph>
rikhuijzer and ctrlcctrlv committed Mar 7, 2023
1 parent 20eaff0 commit bfd544e
Showing 6 changed files with 245 additions and 64 deletions.
42 changes: 9 additions & 33 deletions README.md
@@ -1,6 +1,6 @@
<h1 align="center"><code>ata</code>: Ask the Terminal Anything</h1>

-<h3 align="center">OpenAI GPT in the terminal</h3>
+<h3 align="center">ChatGPT in the terminal</h3>

<p align="center">
<a href="https://asciinema.org/a/557270"><img src="https://asciinema.org/a/557270.svg" alt="asciicast"></a>
@@ -12,8 +12,6 @@ TIP:<br>
This can be done via: Iterm2 (Mac), Guake (Ubuntu), scratchpad (i3/sway), or the quake mode for the Windows Terminal.
</h3>

-At the time of writing, use `text-davinci-003`. Davinci was released together with ChatGPT as part of the [GPT-3.5 series](https://platform.openai.com/docs/model-index-for-researchers/models-referred-to-as-gpt-3-5) and they are very comparable in terms of capabilities; ChatGPT is more verbose.

## Productivity benefits

- The terminal starts more quickly and requires **less resources** than a browser.
@@ -26,35 +24,11 @@
Download the binary for your system from [Releases](https://github.com/rikhuijzer/ata/releases).
If you're running Arch Linux, then you can use the AUR packages: [ata](https://aur.archlinux.org/packages/ata), [ata-git](https://aur.archlinux.org/packages/ata-git), or [ata-bin](https://aur.archlinux.org/packages/ata-bin).

-Request an API key via <https://beta.openai.com/account/api-keys>.
-Next, set the API key, the model that you want to use, and the maximum amount of tokens that the server can respond with in `ata.toml`:
-
-```toml
-api_key = "<YOUR SECRET API KEY>"
-model = "text-davinci-003"
-max_tokens = 500
-temperature = 0.8
-```
-
-Here, replace `<YOUR SECRET API KEY>` with your API key, which you can request via https://beta.openai.com/account/api-keys.
-
-The `max_tokens` sets the maximum amount of tokens that the server will answer with.
-
-The `temperature` sets the `sampling temperature`. From the OpenAI API docs: "What sampling temperature to use. Higher values means the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer." According to Stephen Wolfram [[1]], setting it to a higher value such as 0.8 will likely work best in practice.
-
-[1]: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/
+To specify the API key and some basic model settings, start the application.
+It should give an error and the option to create a configuration file called `ata.toml` for you.
+Press `y` and `ENTER` to create a `ata.toml` file.

-Next, run:
-
-```sh
-$ ata --config=ata.toml
-```
-
-Or, change the current directory to the one where `ata.toml` is located and run
-
-```sh
-$ ata
-```
+Next, request an API key via <https://beta.openai.com/account/api-keys> and update the key in the example configuration file.

For more information, see:

@@ -66,8 +40,10 @@ $ ata --help

**How much will I have to pay for the API?**

-Using OpenAI's API is quite cheap, I have been using this terminal application heavily for a few weeks now and my costs are about $0.15 per day ($4.50 per month).
-The first $18.00 for free, so you can use it for about 120 days (4 months) before having to pay.
+Using OpenAI's API for chat is very cheap.
+Let's say that an average response is about 500 tokens, so costs $0.001.
+That means that if you do 100 requests per day, then that will cost you about $0.10.
+OpenAI grants you $18.00 for free, so you can use the API for about 180 days (6 months) before having to pay.
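The cost estimate in the new README text can be sanity-checked with a few lines of arithmetic. This sketch assumes the gpt-3.5-turbo price in effect at the time of the commit, $0.002 per 1K tokens; the variable names are illustrative, not part of the project.

```rust
fn main() {
    // Assumption: gpt-3.5-turbo pricing at the time was $0.002 per 1K tokens.
    let dollars_per_token = 0.002 / 1000.0;
    let tokens_per_response = 500.0;
    let cost_per_response = tokens_per_response * dollars_per_token;
    let cost_per_day = 100.0 * cost_per_response; // 100 requests per day
    let free_days = 18.00 / cost_per_day; // days covered by the $18.00 grant
    println!("per response: ${:.3}", cost_per_response); // $0.001
    println!("per day:      ${:.2}", cost_per_day); // $0.10
    println!("free days:    {:.0}", free_days); // 180
}
```

The numbers match the README: about $0.001 per response, $0.10 per day, and roughly 180 free days (6 months).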

**Can I build the binary myself?**

8 changes: 5 additions & 3 deletions ata/Cargo.toml
@@ -1,19 +1,21 @@
[package]
name = "ata"
-version = "1.0.3"
+version = "2.0.0"
edition = "2021"
authors = ["Rik Huijzer <t.h.huijzer@rug.nl>", "Fredrick R. Brennan <copypaste@kittens.ph>"]
license = "MIT"

[dependencies]
-rustyline = "10.1"
+clap = { version = "4.1.4", features = ["derive"] }
directories = "4.0"
hyper = { version = "0.14", features = ["full"] }
hyper-rustls = { version = "0.23" }
+os_str_bytes = "6.4.1"
+rustyline = "10.1"
serde = { version = "1", features = ["derive"] }
serde_json = { version = "1" }
tokio = { version = "1", features = ["full"] }
toml = { version = "0.6" }
-clap = { version = "4.1.4", features = ["derive"] }

[dev-dependencies]
pretty_assertions = "1"
114 changes: 114 additions & 0 deletions ata/src/config.rs
@@ -0,0 +1,114 @@
use directories::ProjectDirs;
use os_str_bytes::OsStrBytes as _;
use os_str_bytes::OsStringBytes as _;
use serde::Deserialize;
use std::convert::Infallible;
use std::ffi::OsString;
use std::path::Path;
use std::path::PathBuf;
use std::str::FromStr;
use toml::de::Error as TomlError;

#[derive(Clone, Deserialize, Debug)]
pub struct Config {
    pub api_key: String,
    pub model: String,
    pub max_tokens: i64,
    pub temperature: f64,
}

#[derive(Clone, Deserialize, Debug, Default)]
pub enum ConfigLocation {
    #[default]
    Auto,
    Path(PathBuf),
    Named(PathBuf),
}

impl FromStr for ConfigLocation {
    type Err = Infallible;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        Ok(if !s.contains('.') && !s.is_empty() {
            Self::Named(s.into())
        } else if !s.is_empty() {
            Self::Path(s.into())
        } else if s.trim().is_empty() {
            Self::Auto
        } else {
            unreachable!()
        })
    }
}

impl<S> From<S> for ConfigLocation where S: AsRef<str> {
    fn from(s: S) -> Self {
        Self::from_str(s.as_ref()).unwrap()
    }
}
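The parsing rule above can be restated as a self-contained sketch: a bare name (no `.`) becomes `Named`, anything containing a `.` becomes `Path`, and the empty string falls back to `Auto`. This standalone version omits the crate's `serde`/`directories` machinery and exists only to illustrate the rule.

```rust
use std::convert::Infallible;
use std::path::PathBuf;
use std::str::FromStr;

// Standalone restatement of ConfigLocation's FromStr rule for illustration.
#[derive(Debug, PartialEq)]
enum ConfigLocation {
    Auto,
    Path(PathBuf),
    Named(PathBuf),
}

impl FromStr for ConfigLocation {
    type Err = Infallible;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        Ok(if !s.contains('.') && !s.is_empty() {
            ConfigLocation::Named(s.into()) // e.g. a profile name like "work"
        } else if !s.is_empty() {
            ConfigLocation::Path(s.into()) // e.g. "ata.toml" or "./conf.toml"
        } else {
            ConfigLocation::Auto // empty string: let the app decide
        })
    }
}

fn main() {
    assert_eq!("work".parse(), Ok(ConfigLocation::Named("work".into())));
    assert_eq!("ata.toml".parse(), Ok(ConfigLocation::Path("ata.toml".into())));
    assert_eq!("".parse(), Ok(ConfigLocation::Auto));
    println!("ok");
}
```

Note that the `.`-based distinction means a named profile can never contain a dot; such a value is always treated as a path.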

fn get_config_dir() -> PathBuf {
    ProjectDirs::from(
        "ata",
        "Ask the Terminal Anything (ATA) Project Authors",
        "ata",
    ).unwrap().config_dir().into()
}

pub fn default_path(name: Option<&Path>) -> PathBuf {
    let mut config_file = get_config_dir();
    let file: Vec<_> = if let Some(name) = name {
        let mut name = name.to_path_buf();
        name.set_extension("toml");
        name.as_os_str()
            .to_raw_bytes()
            .iter()
            .copied()
            .collect()
    } else {
        "ata.toml".bytes().collect()
    };
    let file = OsString::assert_from_raw_vec(file);
    config_file.push(&file);
    config_file
}
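Stripped of the `directories` and `os_str_bytes` plumbing, `default_path`'s naming rule is simple: a named profile gets a forced `.toml` extension, and no name falls back to `ata.toml`. The sketch below shows only that rule; `config_file_name` is a hypothetical helper, and the real function joins the result onto the platform config directory.

```rust
use std::path::PathBuf;

// Illustrative helper: the file-name part of default_path's behavior.
fn config_file_name(name: Option<&str>) -> PathBuf {
    match name {
        Some(name) => {
            let mut file = PathBuf::from(name);
            file.set_extension("toml"); // "work" -> "work.toml", "work.cfg" -> "work.toml"
            file
        }
        None => PathBuf::from("ata.toml"),
    }
}

fn main() {
    assert_eq!(config_file_name(None), PathBuf::from("ata.toml"));
    assert_eq!(config_file_name(Some("work")), PathBuf::from("work.toml"));
    assert_eq!(config_file_name(Some("work.cfg")), PathBuf::from("work.toml"));
    println!("ok");
}
```

Because `set_extension` replaces any existing extension, even a name ending in `.cfg` still resolves to a `.toml` file.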

impl ConfigLocation {
    pub fn location(&self) -> PathBuf {
        match self {
            ConfigLocation::Auto => {
                // Prefer `ata.toml` in the current directory, then the platform default.
                // (The scraped code called `self.location()` here, which recurses forever.)
                let local = Path::new("ata.toml");
                if local.exists() {
                    return local.into();
                }
                default_path(None)
            }
            ConfigLocation::Path(pb) => pb.clone(),
            ConfigLocation::Named(name) => {
                if name.as_os_str() == "default" {
                    return match Path::new("ata.toml").exists() {
                        true => Path::new("ata.toml").into(),
                        false => default_path(None),
                    };
                }
                default_path(Some(name))
            }
        }
    }
}

impl FromStr for Config {
    type Err = TomlError;

    fn from_str(contents: &str) -> Result<Self, Self::Err> {
        toml::from_str(contents)
    }
}

impl<S> From<S> for Config where S: AsRef<str> {
    fn from(s: S) -> Self {
        Self::from_str(s.as_ref())
            .unwrap_or_else(|e| panic!("Config parsing failure!: {:?}", e))
    }
}
54 changes: 46 additions & 8 deletions ata/src/help.rs
@@ -1,3 +1,9 @@
use crate::config;
use rustyline::Editor;
use std::fs::File;
use std::fs;
use std::io::Write as _;

pub fn commands() {
println!("
Ctrl-A, Home Move cursor to the beginning of line
@@ -29,29 +35,61 @@ Thanks to <https://github.com/kkawakam/rustyline#emacs-mode-default-mode>.
");
}

const EXAMPLE_TOML: &str = r#"api_key = "<YOUR SECRET API KEY>"
model = "gpt-3.5-turbo"
max_tokens = 1000
temperature = 0.8"#;

pub fn missing_toml(args: Vec<String>) {
    let default_path = config::default_path(None);
    eprintln!(
r#"
-Could not find the file `ata.toml`. To fix this, use `{} --config=<Path to ata.toml>` or have `ata.toml` in the current dir.
+Could not find a configuration file.
-For example, make a new file `ata.toml` in the current directory with the following content (the text between the ```):
+To fix this, use `{} --config=<Path to ata.toml>` or create `{1}`. For the last option, type `y` to write the following example file:
```
-api_key = "<YOUR SECRET API KEY>"
-model = "text-davinci-003"
-max_tokens = 500
-temperature = 0.8
+{EXAMPLE_TOML}
```
-Here, replace `<YOUR SECRET API KEY>` with your API key, which you can request via https://beta.openai.com/account/api-keys.
+Next, replace `<YOUR SECRET API KEY>` with your API key, which you can request via https://beta.openai.com/account/api-keys.
The `max_tokens` sets the maximum amount of tokens that the server will answer with.
The `temperature` sets the `sampling temperature`. From the OpenAI API docs: "What sampling temperature to use. Higher values means the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer." According to Stephen Wolfram [1], setting it to a higher value such as 0.8 will likely work best in practice.
[1]: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/
"#,
        args[0],
        default_path.display()
    );

    let mut rl = Editor::<()>::new().unwrap();
    let msg = format!(
        "\x1b[1mDo you want me to write this example file to {0:?} for you to edit? [y/N]\x1b[0m",
        default_path
    );
    let readline = rl.readline(&msg);
    if let Ok(msg) = readline {
        let response: bool = msg
            .trim()
            .chars()
            .next()
            .map(|c| c.to_lowercase().collect::<String>() == "y")
            .unwrap_or(false);
        if response {
            if !default_path.exists() && !default_path.parent().unwrap().is_dir() {
                let dir = default_path.parent().unwrap();
                fs::create_dir_all(dir).expect("Could not make configuration directory");
            }
            let mut f = File::create(&default_path).expect("Unable to create file");
            f.write_all(EXAMPLE_TOML.as_bytes())
                .expect("Unable to write to file");
            println!();
            println!("Wrote to {default_path:?}.");
        }
    }

-"#, args[0]);
    std::process::exit(1);
}
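The `[y/N]` check in `missing_toml` above treats only an answer whose first non-whitespace character lowercases to `y` as consent. A standalone restatement of that rule (the `is_yes` name is illustrative, not part of the crate):

```rust
// Illustrative restatement of the consent check in missing_toml.
fn is_yes(answer: &str) -> bool {
    answer
        .trim() // ignore surrounding whitespace
        .chars()
        .next() // only the first character matters
        .map(|c| c.to_lowercase().collect::<String>() == "y")
        .unwrap_or(false) // empty input means "no"
}

fn main() {
    assert!(is_yes("y"));
    assert!(is_yes("  Yes please"));
    assert!(!is_yes(""));
    assert!(!is_yes("no"));
    println!("ok");
}
```

Defaulting to "no" on empty or unrecognized input is the conservative choice here, since a "yes" writes a file to disk.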

