
Commit

docs: update readme with better explanations
transitive-bullshit committed Feb 2, 2023
1 parent 9d49e78 commit 6ca8603
Showing 7 changed files with 95 additions and 31 deletions.
12 changes: 10 additions & 2 deletions docs/classes/ChatGPTAPI.md
@@ -28,15 +28,19 @@ unofficial ChatGPT model.
| `opts` | `Object` | - |
| `opts.apiBaseUrl?` | `string` | **`Default Value`** `'https://api.openai.com'` * |
| `opts.apiKey` | `string` | - |
| `opts.assistantLabel?` | `string` | **`Default Value`** `'ChatGPT'` * |
| `opts.completionParams?` | [`CompletionParams`](../modules/openai.md#completionparams) | - |
| `opts.debug?` | `boolean` | **`Default Value`** `false` * |
| `opts.getMessageById?` | [`GetMessageByIdFunction`](../modules.md#getmessagebyidfunction) | - |
| `opts.maxModelTokens?` | `number` | **`Default Value`** `4096` * |
| `opts.maxResponseTokens?` | `number` | **`Default Value`** `1000` * |
| `opts.messageStore?` | `Keyv`<`any`, `Record`<`string`, `unknown`\>\> | - |
| `opts.upsertMessage?` | [`UpsertMessageFunction`](../modules.md#upsertmessagefunction) | - |
| `opts.userLabel?` | `string` | **`Default Value`** `'User'` * |

#### Defined in

[src/chatgpt-api.ts:36](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/chatgpt-api.ts#L36)
[src/chatgpt-api.ts:47](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/chatgpt-api.ts#L47)

## Methods

@@ -47,11 +51,15 @@ unofficial ChatGPT model.
Sends a message to ChatGPT, waits for the response to resolve, and returns
the response.

If you want your response to have historical context, you must provide a valid `parentMessageId`.

If you want to receive a stream of partial responses, use `opts.onProgress`.
If you want to receive the full response, including message and conversation IDs,
you can use `opts.onConversationResponse` or the `ChatGPTAPI.getConversation` helper.

Set `debug: true` in the `ChatGPTAPI` constructor to log more info on the full prompt sent to the OpenAI completions API. You can override the `promptPrefix` and `promptSuffix` in `opts` to customize the prompt.
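As a rough usage sketch of the options described above (illustrative only — the prompts and variable names are made up, not part of the generated docs):

```ts
import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  debug: true // log the full prompt sent to the OpenAI completions API
})

const first = await api.sendMessage('What is a closure in JavaScript?')

// Pass the previous message's id as `parentMessageId` so the follow-up has
// historical context, and stream partial responses via `onProgress`.
const followUp = await api.sendMessage('Can you show a short example?', {
  parentMessageId: first.id,
  onProgress: (partial) => console.log(partial.text)
})

console.log(followUp.text)
```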

#### Parameters

| Name | Type |
@@ -67,4 +75,4 @@ The response from ChatGPT

#### Defined in

[src/chatgpt-api.ts:109](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/chatgpt-api.ts#L109)
[src/chatgpt-api.ts:145](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/chatgpt-api.ts#L145)
4 changes: 2 additions & 2 deletions docs/classes/ChatGPTError.md
@@ -64,7 +64,7 @@ node_modules/.pnpm/typescript@4.9.5/node_modules/typescript/lib/lib.es2022.error

#### Defined in

[src/types.ts:24](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L24)
[src/types.ts:24](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L24)

___

@@ -74,4 +74,4 @@ ___

#### Defined in

[src/types.ts:25](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L25)
[src/types.ts:25](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L25)
10 changes: 5 additions & 5 deletions docs/interfaces/ChatMessage.md
@@ -20,7 +20,7 @@

#### Defined in

[src/types.ts:20](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L20)
[src/types.ts:20](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L20)

___

@@ -30,7 +30,7 @@ ___

#### Defined in

[src/types.ts:16](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L16)
[src/types.ts:16](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L16)

___

@@ -40,7 +40,7 @@ ___

#### Defined in

[src/types.ts:19](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L19)
[src/types.ts:19](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L19)

___

@@ -50,7 +50,7 @@ ___

#### Defined in

[src/types.ts:18](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L18)
[src/types.ts:18](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L18)

___

@@ -60,4 +60,4 @@ ___

#### Defined in

[src/types.ts:17](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L17)
[src/types.ts:17](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L17)
8 changes: 4 additions & 4 deletions docs/modules.md
@@ -48,7 +48,7 @@ Returns a chat message from a store by its ID (or null if not found).

#### Defined in

[src/types.ts:29](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L29)
[src/types.ts:29](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L29)

___

@@ -58,7 +58,7 @@ ___

#### Defined in

[src/types.ts:1](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L1)
[src/types.ts:1](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L1)

___

@@ -82,7 +82,7 @@ ___

#### Defined in

[src/types.ts:3](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L3)
[src/types.ts:3](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L3)

___

@@ -108,4 +108,4 @@ Upserts a chat message to a store.

#### Defined in

[src/types.ts:32](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L32)
[src/types.ts:32](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L32)
8 changes: 4 additions & 4 deletions docs/modules/openai.md
@@ -38,7 +38,7 @@

#### Defined in

[src/types.ts:35](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L35)
[src/types.ts:35](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L35)

___

@@ -59,7 +59,7 @@ ___

#### Defined in

[src/types.ts:117](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L117)
[src/types.ts:117](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L117)

___

@@ -69,7 +69,7 @@ ___

#### Defined in

[src/types.ts:126](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L126)
[src/types.ts:126](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L126)

___

@@ -87,4 +87,4 @@ ___

#### Defined in

[src/types.ts:138](https://github.com/transitive-bullshit/chatgpt-api/blob/531e180/src/types.ts#L138)
[src/types.ts:138](https://github.com/transitive-bullshit/chatgpt-api/blob/9d49e78/src/types.ts#L138)
46 changes: 38 additions & 8 deletions docs/readme.md
@@ -2,7 +2,7 @@ chatgpt / [Exports](modules.md)

# Update February 1, 2023 <!-- omit in toc -->

This package no longer requires any browser hacks – **it is now using the official OpenAI API** with a leaked, unofficial ChatGPT model. 🔥
This package no longer requires any browser hacks – **it is now using the official OpenAI completions API** with a leaked model that ChatGPT uses under the hood. 🔥

```ts
import { ChatGPTAPI } from 'chatgpt'
@@ -15,7 +15,9 @@ const res = await api.sendMessage('Hello World!')
console.log(res.text)
```

The updated solution is significantly more lightweight and robust compared with previous versions.
Please upgrade to `chatgpt@latest` (at least [v4.0.0](https://github.com/transitive-bullshit/chatgpt-api/releases/tag/v4.0.0)). The updated version is **significantly more lightweight and robust** compared with previous versions. You also don't have to worry about IP issues or rate limiting!

Huge shoutout to [@waylaidwanderer](https://github.com/waylaidwanderer) for discovering the leaked chat model! 💪

If you run into any issues, we do have a pretty active [Discord](https://discord.gg/v9gERj825w) with a bunch of ChatGPT hackers from the Node.js & Python communities.

@@ -58,6 +60,8 @@ You can use it to start building projects powered by ChatGPT like chatbots, webs
npm install chatgpt
```

Make sure you're using `node >= 18` so `fetch` is available (or `node >= 14` if you install a [fetch polyfill](https://github.com/developit/unfetch#usage-as-a-polyfill)).
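For instance, on `node 14`–`17` you could load a fetch polyfill once before importing `chatgpt` (a hedged sketch — `isomorphic-unfetch` is just one example of a global fetch polyfill and is an assumption here, not something this readme prescribes; the linked `unfetch` guide covers the general approach):

```ts
// On node >= 18 this import is unnecessary; on node 14–17 it installs a
// global fetch (assumes the `isomorphic-unfetch` package is installed).
import 'isomorphic-unfetch'

import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
```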

## Usage

Sign up for an [OpenAI API key](https://platform.openai.com/overview) and store it in your environment.
@@ -92,7 +96,6 @@ res = await api.sendMessage('Can you expand on that?', {
console.log(res.text)

// send another follow-up
// send a follow-up
res = await api.sendMessage('What were we talking about?', {
conversationId: res.conversationId,
parentMessageId: res.id
@@ -104,20 +107,47 @@ You can add streaming via the `onProgress` handler:

```ts
// timeout after 2 minutes (which will also abort the underlying HTTP request)
const res = await api.sendMessage('Write me a 500 word essay on frogs.', {
onProgress: (partialResponse) => console.log(partialResponse)
const res = await api.sendMessage('Write a 500 word essay on frogs.', {
// print the partial response as the AI is "typing"
onProgress: (partialResponse) => console.log(partialResponse.text)
})

// print the full text at the end
console.log(res.text)
```

You can add a timeout using the `timeoutMs` option:

```ts
// timeout after 2 minutes (which will also abort the underlying HTTP request)
const response = await api.sendMessage('this is a timeout test', {
timeoutMs: 2 * 60 * 1000
const response = await api.sendMessage(
'write me a really really long essay on frogs',
{
timeoutMs: 2 * 60 * 1000
}
)
```

If you want to see more info about what's actually being sent to [OpenAI's completions API](https://platform.openai.com/docs/api-reference/completions), set the `debug: true` option in the `ChatGPTAPI` constructor:

```ts
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
debug: true
})
```

You'll notice that we're using a reverse-engineered `promptPrefix` and `promptSuffix`. You can customize these via the `sendMessage` options:

```ts
const res = await api.sendMessage('what is the answer to the universe?', {
promptPrefix: `You are ChatGPT, a large language model trained by OpenAI. You answer as concisely as possible for each response (e.g. don’t be verbose). It is very important that you answer as concisely as possible, so please remember this. If you are generating a list, do not have too many items. Keep the number of items short.
Current date: ${new Date().toISOString()}\n\n`
})
```

Note that we automatically handle appending the previous messages to the prompt and attempt to optimize for the available tokens (which defaults to `4096`).
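If you need a different token budget, the documented `maxModelTokens` and `maxResponseTokens` constructor options can be tuned (a sketch showing the documented defaults, just to illustrate the knobs):

```ts
const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  maxModelTokens: 4096,    // documented default: total tokens available for prompt + completion
  maxResponseTokens: 1000  // documented default: tokens reserved for the model's response
})
```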

<details>
<summary>Usage in CommonJS (Dynamic import)</summary>

@@ -252,7 +282,7 @@ If you create a cool integration, feel free to open a PR and add it to the list.
- This package supports `node >= 14`.
- This module assumes that `fetch` is installed.
- In `node >= 18`, it's installed by default.
- In `node < 18`, you need to install a polyfill like `unfetch/polyfill`
- In `node < 18`, you need to install a polyfill like `unfetch/polyfill` ([guide](https://github.com/developit/unfetch#usage-as-a-polyfill))
- If you want to build a website using `chatgpt`, we recommend using it only from your backend API

## Credits
38 changes: 32 additions & 6 deletions readme.md
@@ -77,7 +77,7 @@ async function example() {
}
```

If you want to track the conversation, use the `conversationId` and `id` in the result object, and pass them to `sendMessage` as `conversationId` and `parentMessageId` respectively.
If you want to track the conversation, use the `conversationId` and `id` in the result object, and pass them to `sendMessage` as `conversationId` and `parentMessageId` respectively. `parentMessageId` is the most important parameter for recalling previous message context.

```ts
const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
@@ -94,7 +94,6 @@ res = await api.sendMessage('Can you expand on that?', {
console.log(res.text)

// send another follow-up
// send a follow-up
res = await api.sendMessage('What were we talking about?', {
conversationId: res.conversationId,
parentMessageId: res.id
@@ -106,20 +105,47 @@ You can add streaming via the `onProgress` handler:

```ts
// timeout after 2 minutes (which will also abort the underlying HTTP request)
const res = await api.sendMessage('Write me a 500 word essay on frogs.', {
onProgress: (partialResponse) => console.log(partialResponse)
const res = await api.sendMessage('Write a 500 word essay on frogs.', {
// print the partial response as the AI is "typing"
onProgress: (partialResponse) => console.log(partialResponse.text)
})

// print the full text at the end
console.log(res.text)
```

You can add a timeout using the `timeoutMs` option:

```ts
// timeout after 2 minutes (which will also abort the underlying HTTP request)
const response = await api.sendMessage('this is a timeout test', {
timeoutMs: 2 * 60 * 1000
const response = await api.sendMessage(
'write me a really really long essay on frogs',
{
timeoutMs: 2 * 60 * 1000
}
)
```

If you want to see more info about what's actually being sent to [OpenAI's completions API](https://platform.openai.com/docs/api-reference/completions), set the `debug: true` option in the `ChatGPTAPI` constructor:

```ts
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
debug: true
})
```

You'll notice that we're using a reverse-engineered `promptPrefix` and `promptSuffix`. You can customize these via the `sendMessage` options:

```ts
const res = await api.sendMessage('what is the answer to the universe?', {
promptPrefix: `You are ChatGPT, a large language model trained by OpenAI. You answer as concisely as possible for each response (e.g. don’t be verbose). It is very important that you answer as concisely as possible, so please remember this. If you are generating a list, do not have too many items. Keep the number of items short.
Current date: ${new Date().toISOString()}\n\n`
})
```

Note that we automatically handle appending the previous messages to the prompt and attempt to optimize for the available tokens (which defaults to `4096`).
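Messages are tracked in a `Keyv` store, and the documented `messageStore` constructor option lets you supply your own backing store, for example if you want conversations to persist across restarts. A hypothetical sketch — the Redis backend and URL below are illustrative assumptions, not something this readme prescribes:

```ts
import Keyv from 'keyv'
import { ChatGPTAPI } from 'chatgpt'

// Any Keyv-compatible backend works; this assumes `keyv` and `@keyv/redis`
// are installed and a Redis server is reachable at the given URL.
const messageStore = new Keyv('redis://localhost:6379', { namespace: 'chatgpt' })

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  messageStore
})
```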

<details>
<summary>Usage in CommonJS (Dynamic import)</summary>

