Commit
some changes
ZeldaFan0225 committed Mar 15, 2023
1 parent d13b099 commit 357dfca
Showing 18 changed files with 89 additions and 33 deletions.
2 changes: 1 addition & 1 deletion LICENSE
@@ -629,7 +629,7 @@ to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

ChatGPT Discord Bot interacts with OpenAIs GPT-3.5 turbo model to generate text
ChatGPT Discord Bot interacts with OpenAIs Chat Completion API to generate text
Copyright (C) 2023 Zelda_Fan

This program is free software: you can redistribute it and/or modify
4 changes: 2 additions & 2 deletions README.md
@@ -1,6 +1,6 @@
# ChatGPT-Discord-Bot

A basic Discord bot to generate chat completions using OpenAIs GPT-3.5 turbo model.
A basic Discord bot to generate chat completions using OpenAIs Chat Completion API.

**DISCLAIMER:** THIS REPOSITORY IS IN NO WAY ASSOCIATED TO OPENAI
OFFERING THIS CODE IN FORM OF A PUBLIC DISCORD BOT WHICH CAN BE INVITED BY EVERYBODY IS NOT SUPPORTED.
@@ -26,7 +26,7 @@ The bot has the following features:
- logging to detect tos-breaking prompts
- ability to blacklist

If you want some inspiration on system instructions for the GPT-3.5 model you can view [my repository](https://github.com/ZeldaFan0225/ChatGPT-Discord-Bot-System-Instructions) for it.
If you want some inspiration on system instructions for the GPT model you can view [my repository](https://github.com/ZeldaFan0225/ChatGPT-Discord-Bot-System-Instructions) for it.

## Version Requirements

6 changes: 6 additions & 0 deletions changelog.md
@@ -1,5 +1,11 @@
# Changelog

## V1.4.0

- change wording to remove GPT 3.5
- add ability to easily switch to GPT-4
- change how money spent by users is calculated

## V1.3.0

- sql command for easy database access
14 changes: 12 additions & 2 deletions config.md
@@ -9,6 +9,7 @@ To see an example look at our [template.config.json](https://github.com/ZeldaFan
"staff_roles": The roles which your staff have. This will bypass filters and cooldowns (ARRAY OF ROLE IDS),
"staff_users": The staff users who don't have any of the staff roles. This will bypass filters and cooldowns (ARRAY OF USER IDS),
"blacklist_roles": Blacklist users based on their roles. Staff have full bypass (ARRAY OF ROLE IDS),
"default_model": The default model to use. Model must support chat completion (STRING) *8,
"staff_can_bypass_feature_restrictions": When set to true staff won't be restricted by features turned off (BOOLEAN) *4,
"dev": Whether this is a development instance or not (BOOLEAN) *3,
"global_user_cooldown": The time until a user can send a new request in milliseconds (NUMBER),
@@ -42,7 +43,14 @@ To see an example look at our [template.config.json](https://github.com/ZeldaFan
"user_leaderboard": Whether this feature is enabled or not (BOOLEAN) *4
},
"leaderboard_amount_users": How many users to display on the leaderboard (NUMBER),
"englishify_system_instruction": The system instruction to translate a message (STRING) *5
"englishify_system_instruction": The system instruction to translate a message (STRING) *5,
"context_action_instruction": The system instruction for the context action (STRING),
"costs": {
"MODEL NAME": {
"prompt": The cost for prompt tokens,
"completion": The cost for completion tokens
}
}
}
```

@@ -51,4 +59,6 @@ To see an example look at our [template.config.json](https://github.com/ZeldaFan
`*3` Developer mode will enable logging and will also show the generations ID in the embed in Discord
`*4` This option changes how the command is created.
`*5` It is not recommended to change this option.
`*6` It is not recommended to change this option.
`*6` Configuration of `context_action_instruction` is advised.
`*7` Prices for models to save money spent by users. [Read more about pricing](https://openai.com/pricing)
`*8` See API Documentation for compatibility. [Read more](https://platform.openai.com/docs/models/model-endpoint-compatibility)
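A minimal sketch of how the two new keys could be filled in, assuming the shape documented above — the model names and per-1,000-token prices below are placeholders for illustration, not values shipped with the template config:

```ts
// Hypothetical excerpt mirroring the new `default_model` and `costs` keys.
// Model names and prices are assumptions; check https://openai.com/pricing for current rates.
const configExcerpt = {
    default_model: "gpt-3.5-turbo", // must support the chat completions endpoint (*8)
    costs: {
        // interpreted by recordSpentTokens (src/classes/client.ts) as USD per 1,000 tokens
        "gpt-3.5-turbo": { prompt: 0.002, completion: 0.002 },
        "gpt-4": { prompt: 0.03, completion: 0.06 }
    }
};
```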
2 changes: 1 addition & 1 deletion package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "chatgpt_discord_bot",
"version": "1.3.0",
"version": "1.4.0",
"description": "",
"main": "dist/index.js",
"scripts": {
20 changes: 15 additions & 5 deletions src/classes/client.ts
@@ -91,11 +91,14 @@ export class ChatGPTBotClient extends Client {
}

async requestChatCompletion(messages: {role: string, content: string}[], user_id: string, database: Pool, override_options?: {
temperature?: number
temperature?: number,
model?: string
}) {
const model = override_options?.model || this.config.default_model || "gpt-3.5-turbo"

const openai_req = Centra(`https://api.openai.com/v1/chat/completions`, "POST")
.body({
model: "gpt-3.5-turbo",
model,
messages,
temperature: override_options?.temperature ?? this.config.generation_parameters?.temperature,
top_p: this.config.generation_parameters?.top_p,
@@ -123,15 +126,22 @@ export class ChatGPTBotClient extends Client {

if(!data?.id) throw new Error("Unable to generate response")

await this.recordSpentTokens(user_id, data.usage.total_tokens ?? 0, database)
await this.recordSpentTokens(user_id, {prompt: data.usage.prompt_tokens, completion: data.usage.completion_tokens}, model, database)

return data
}

async recordSpentTokens(user_id: string, tokens: number, database: Pool) {
async recordSpentTokens(user_id: string, tokens: {prompt: number, completion: number}, model: string, database: Pool) {
if(!this.config.features?.user_stats) return false;

const res = await database.query("UPDATE user_data SET tokens = user_data.tokens + $2 WHERE user_id=$1 RETURNING *", [user_id, tokens]).catch(console.error)
let cost = 0

if(this.config.costs?.[model]) {
cost += (this.config.costs?.[model]?.prompt || 0) * (tokens.prompt / 1000)
cost += (this.config.costs?.[model]?.completion || 0) * (tokens.completion / 1000)
}

const res = await database.query("UPDATE user_data SET tokens=user_data.tokens+$2, cost=user_data.cost+$3 WHERE user_id=$1 RETURNING *", [user_id, (tokens.completion + tokens.prompt), cost]).catch(console.error)
return !!res?.rowCount
}
}
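The bookkeeping added to recordSpentTokens boils down to one formula per model: price per 1,000 tokens times token count, summed over prompt and completion and accumulated in the user's cost column. A self-contained sketch of that arithmetic, with a hypothetical helper name and example prices (assumptions, not part of the commit):

```ts
// Standalone sketch of the cost arithmetic performed in recordSpentTokens above.
// estimateCost is a hypothetical helper and the prices are example values only.
interface ModelPricing { prompt?: number; completion?: number } // USD per 1,000 tokens

function estimateCost(usage: { prompt: number; completion: number }, pricing?: ModelPricing): number {
    if (!pricing) return 0; // models without a configured price add nothing to the user's cost
    const promptCost = (pricing.prompt ?? 0) * (usage.prompt / 1000);
    const completionCost = (pricing.completion ?? 0) * (usage.completion / 1000);
    return promptCost + completionCost;
}

// e.g. 1,200 prompt tokens and 300 completion tokens at 0.002 $/1K each:
// 0.002 * 1.2 + 0.002 * 0.3 = 0.003 $
console.log(estimateCost({ prompt: 1200, completion: 300 }, { prompt: 0.002, completion: 0.002 }));
```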
6 changes: 4 additions & 2 deletions src/commands/chat_single.ts
@@ -120,6 +120,8 @@ export default class extends Command {
payload.components = components
}

data.object

if(description.length < 4000) {
const embed = new EmbedBuilder({
author: {
@@ -128,12 +130,12 @@ export default class extends Command {
},
description,
color: Colors.Green,
footer: {text: "This text has been generated by OpenAIs GPT-3.5 Model"}
footer: {text: `This text has been generated by OpenAIs Chat Completion API (${data.model})`}
})

payload.embeds = [embed]
} else {
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.user.tag}:\n${message}\n\nChatGPT (${system_instruction_name}):\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs GPT-3.5 model`), {name: `${data.id}.txt`})
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.user.tag}:\n${message}\n\nChatGPT (${system_instruction_name}):\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs Chat Completion API`), {name: `${data.id}.txt`})
payload.content = "Result attached below"
payload.files = [attachment]
}
10 changes: 5 additions & 5 deletions src/commands/chat_thread.ts
@@ -66,7 +66,7 @@ export default class extends Command {
},
description: ai_data.choices[0]?.message.content?.trim(),
color: Colors.Blue,
footer: {text: "This text has been generated by OpenAIs GPT-3.5 Model"}
footer: {text: `This text has been generated by OpenAIs Chat Completion API (${ai_data.model})`}
})
]

@@ -75,7 +75,7 @@ export default class extends Command {
if((embedLength(embeds[0].toJSON()) + embedLength(embeds[1].toJSON())) <= 6000) {
payload = {embeds}
} else {
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.user.tag}:\n${message}\n\nChatGPT:\n${ai_data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs GPT-3.5 model`), {name: `${ai_data.id}.txt`})
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.user.tag}:\n${message}\n\nChatGPT:\n${ai_data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs Chat Completion API`), {name: `${ai_data.id}.txt`})
payload = {
content: "Result attached below",
files: [attachment]
@@ -166,12 +166,12 @@ ${system_instruction ?? "NONE"}`,
},
description,
color: Colors.Green,
footer: {text: "This text has been generated by OpenAIs GPT-3.5 Model"}
footer: {text: `This text has been generated by OpenAIs Chat Completion API (${data.model})`}
})

payload = {embeds: [embed]}
} else {
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.user.tag}:\n${message}\n\nChatGPT:\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs GPT-3.5 model`), {name: `${data.id}.txt`})
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.user.tag}:\n${message}\n\nChatGPT:\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs Chat Completion API`), {name: `${data.id}.txt`})
payload = {
content: "Unable to start thread.\nResult attached below",
files: [attachment]
@@ -204,7 +204,7 @@ ${system_instruction ?? "NONE"}`,
},
description: data.choices[0]?.message.content?.trim(),
color: Colors.Blue,
footer: {text: "This text has been generated by OpenAIs GPT-3.5 Model"}
footer: {text: `This text has been generated by OpenAIs Chat Completion API (${data.model})`}
}),
new EmbedBuilder({
description: !!db_save?.rowCount ? `To create a response to ChatGPTs response use ${await ctx.client.getSlashCommandTag("chat thread")}` : "Unable to save chat for followup",
2 changes: 1 addition & 1 deletion src/commands/info.ts
@@ -21,7 +21,7 @@ export default class extends Command {
const embed = new EmbedBuilder({
title: "Info",
color: Colors.Blue,
description: `This bot acts as an interface with the OpenAI GPT-3.5 turbo model.\nThis bot is open source and can be viewed on [GitHub](https://github.com/ZeldaFan0225/ChatGPT-Discord-Bot).\n**There is no guarantee that this instance of the bot is unmodified**\n\nCurrent configuration:\n**Logging** ${ctx.client.config.logs?.enabled ? "Enabled" : "Disabled"}`
description: `This bot acts as an interface with the OpenAI Chat Completion API.\nThis bot is open source and can be viewed on [GitHub](https://github.com/ZeldaFan0225/ChatGPT-Discord-Bot).\n**There is no guarantee that this instance of the bot is unmodified**\n\nCurrent configuration:\n**Logging** ${ctx.client.config.logs?.enabled ? "Enabled" : "Disabled"}`
})

return ctx.interaction.reply({
6 changes: 3 additions & 3 deletions src/commands/leaderboard.ts
@@ -28,7 +28,7 @@ export default class extends Command {
await ctx.interaction.deferReply()
const leaders_query = await ctx.database.query(`SELECT * FROM user_data WHERE user_id != '0' ORDER BY tokens DESC LIMIT ${ctx.client.config.leaderboard_amount_users || 10}`).catch(console.error)
const own_query = await ctx.database.query("SELECT * FROM user_data WHERE user_id=$1", [ctx.interaction.user.id]).catch(console.error)
const total = await ctx.database.query("SELECT SUM(tokens) as total FROM user_data").catch(console.error)
const total = await ctx.database.query("SELECT SUM(tokens) as tokens, SUM(cost) as cost FROM user_data").catch(console.error)

if(!leaders_query?.rowCount || !own_query?.rowCount) return ctx.error({error: "Unable to generate leaderboard", codeblock: true})

@@ -37,12 +37,12 @@ export default class extends Command {

const lines = await Promise.all(leaders.map(async (l, i) => {
const user = await ctx.client.users.fetch(l.user_id).catch(console.error)
return `${i == (ctx.client.config.leaderboard_amount_users || 10) ? "...\n" : ""}${i == 0 ? "👑" : ""}**${user?.tag ?? "Unknown User#0001"}** \`${l.tokens}\` Tokens (about \`${Math.round(l.tokens/10 * 0.002)/100}$\`)`
return `${i == (ctx.client.config.leaderboard_amount_users || 10) ? "...\n" : ""}${i == 0 ? "👑" : ""}**${user?.tag ?? "Unknown User#0001"}** \`${l.tokens}\` Tokens (about \`${Math.round(l.cost * 100)/100}$\`)`
}))

const embed = new EmbedBuilder({
title: "Spent tokens leaderboard",
description: `${lines.join("\n")}\n\n**Total Tokens** \`${total?.rows?.[0].total ?? 0}\` (about \`${Math.round(Number(total?.rows?.[0].total ?? 0)/10 * 0.002)/100}$\`)\nAll prices are based on estimations, no guarantees that they are right.`.slice(0, 4000),
description: `${lines.join("\n")}\n\n**Total Tokens** \`${total?.rows?.[0].tokens ?? 0}\` (about \`${Math.round((total?.rows?.[0].cost ?? 0) * 100)/100}$\`)\nAll prices are based on estimations, no guarantees that they are right.`.slice(0, 4000),
color: Colors.Green
})

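With the stored cost column, the leaderboard no longer estimates spend from a fixed per-token rate; it just rounds the accumulated value to two decimals. A tiny sketch of that display step (sample values are illustrative):

```ts
// Mirrors the Math.round(cost * 100) / 100 pattern used in the leaderboard lines above.
const displayCost = (cost: number): string => `${Math.round(cost * 100) / 100}$`;

console.log(displayCost(0.1234)); // "0.12$"
console.log(displayCost(2));      // "2$" — no trailing-zero padding is applied
```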
4 changes: 2 additions & 2 deletions src/components/regenerate.ts
@@ -52,12 +52,12 @@ export default class extends Component {
},
description,
color: Colors.Green,
footer: {text: "This text has been generated by OpenAIs GPT-3.5 Model"}
footer: {text: `This text has been generated by OpenAIs Chat Completion API (${data.model})`}
})

payload.embeds = [embed]
} else {
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.user.tag}:\n${message}\n\nChatGPT (${system_instruction_name}):\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs GPT-3.5 model`), {name: `${data.id}.txt`})
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.user.tag}:\n${message}\n\nChatGPT (${system_instruction_name}):\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs Chat Completion API`), {name: `${data.id}.txt`})
payload.content = "Result attached below"
payload.files = [attachment]
}
4 changes: 2 additions & 2 deletions src/contexts/context_action.ts
@@ -75,12 +75,12 @@ export default class extends Context {
},
description,
color: Colors.Green,
footer: {text: `Completion with OpenAIs GPT-3.5 model requested by ${ctx.interaction.user.tag}`, icon_url: ctx.interaction.user.displayAvatarURL()}
footer: {text: `Completion with OpenAIs Chat Completion API requested by ${ctx.interaction.user.tag}`, icon_url: ctx.interaction.user.displayAvatarURL()}
})

payload.embeds = [embed]
} else {
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.targetMessage.author.tag}:\n${ctx.interaction.targetMessage.content}\n\nChatGPT:\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs GPT-3.5 model.\nThe completion has been requested by ${ctx.interaction.user.tag}`), {name: `${data.id}.txt`})
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.targetMessage.author.tag}:\n${ctx.interaction.targetMessage.content}\n\nChatGPT:\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs Chat Completion API.\nThe completion has been requested by ${ctx.interaction.user.tag}`), {name: `${data.id}.txt`})
payload.content = "Result attached below"
payload.files = [attachment]
}
4 changes: 2 additions & 2 deletions src/contexts/englishify.ts
@@ -72,12 +72,12 @@ export default class extends Context {
},
description,
color: Colors.Green,
footer: {text: `Translation with OpenAIs GPT-3.5 model requested by ${ctx.interaction.user.tag}`, icon_url: ctx.interaction.user.displayAvatarURL()}
footer: {text: `Translation with OpenAIs Chat Completion API requested by ${ctx.interaction.user.tag}`, icon_url: ctx.interaction.user.displayAvatarURL()}
})

payload.embeds = [embed]
} else {
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.targetMessage.author.tag}:\n${ctx.interaction.targetMessage.content}\n\nChatGPT:\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs GPT-3.5 model.\nThe translation has been requested by ${ctx.interaction.user.tag}`), {name: `${data.id}.txt`})
const attachment = new AttachmentBuilder(Buffer.from(`${ctx.interaction.targetMessage.author.tag}:\n${ctx.interaction.targetMessage.content}\n\nChatGPT:\n${data.choices[0]?.message.content?.trim() ?? "Hi there"}\n\nThis response has been generated using OpenAIs Chat Completion API.\nThe translation has been requested by ${ctx.interaction.user.tag}`), {name: `${data.id}.txt`})
payload.content = "Result attached below"
payload.files = [attachment]
}
4 changes: 2 additions & 2 deletions src/index.ts
@@ -38,9 +38,9 @@ if(client.config.logs?.enabled) {

client.on("ready", async () => {
await connection.connect().then(async () => {
//console.log(await connection.query("ALTER TABLE user_data ADD COLUMN"))
//console.log(await connection.query("SELECT * FROM user_data"))

await connection.query("CREATE TABLE IF NOT EXISTS user_data (index SERIAL, user_id VARCHAR(100) PRIMARY KEY, consent bool DEFAULT true, tokens int NOT NULL DEFAULT 0, blacklisted bool DEFAULT false)")
await connection.query("CREATE TABLE IF NOT EXISTS user_data (index SERIAL, user_id VARCHAR(100) PRIMARY KEY, consent bool DEFAULT true, tokens int NOT NULL DEFAULT 0, cost double precision default 0, blacklisted bool DEFAULT false)")
await connection.query("CREATE TABLE IF NOT EXISTS chats (index SERIAL, id VARCHAR(100) PRIMARY KEY, user_id VARCHAR(100) NOT NULL, messages JSON[] DEFAULT '{}', created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP)")

console.log("Tables created")
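Since CREATE TABLE IF NOT EXISTS will not add the new cost column to a user_data table created by an older version, existing databases presumably need a one-off migration — the commented-out ALTER TABLE line above hints at that. A hedged sketch using the same pg connection; the statement is an assumption drawn from the new schema, not part of this commit:

```ts
import { Pool } from "pg";

// One-off migration sketch for databases created before V1.4.0 (not part of this commit).
async function addCostColumn(connection: Pool): Promise<void> {
    // IF NOT EXISTS (PostgreSQL 9.6+) keeps the migration safe to re-run.
    await connection.query(
        "ALTER TABLE user_data ADD COLUMN IF NOT EXISTS cost double precision DEFAULT 0"
    );
}
```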
8 changes: 7 additions & 1 deletion src/types.ts
@@ -77,6 +77,7 @@ export interface OpenAIChatCompletionResponse {
id: string,
object: string,
created: number,
model: string,
choices: {
index: number,
message: {
@@ -118,6 +119,7 @@ export interface Config {
staff_roles?: string[],
staff_users?: string[],
blacklist_roles?: string[],
default_model?: string,
staff_can_bypass_feature_restrictions?: boolean,
dev?: boolean,
global_user_cooldown?: number,
@@ -155,5 +157,9 @@ export interface Config {
},
leaderboard_amount_users?: number,
englishify_system_instruction?: string,
context_action_instruction?: string
context_action_instruction?: string,
costs?: Record<string, {
prompt?: number,
completion?: number
}>
}
