📖 GPT-Runner Configs And AI Preset Files

✨ Introduction

  1. When you start GPT-Runner, it first reads the project-level configuration file.

  2. This isn't required, but it's useful when you want to override some global configurations.

  3. The candidate paths below are checked in priority order: the topmost file that exists is the one that gets read.

<rootPath>/.gpt-runner/gptr.config.ts
<rootPath>/.gpt-runner/gptr.config.js
<rootPath>/.gpt-runner/gptr.config.json

<rootPath>/gptr.config.ts
<rootPath>/gptr.config.js
<rootPath>/gptr.config.json
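The priority lookup above amounts to a first-match search over the candidate list. This is an illustrative sketch, not GPT-Runner's actual implementation; paths are shown relative to `<rootPath>`, and a `Set` of paths stands in for a real filesystem check:

```typescript
// Candidate config paths, highest priority first (relative to <rootPath>).
const CONFIG_CANDIDATES = [
  '.gpt-runner/gptr.config.ts',
  '.gpt-runner/gptr.config.js',
  '.gpt-runner/gptr.config.json',
  'gptr.config.ts',
  'gptr.config.js',
  'gptr.config.json',
]

// Return the topmost candidate that exists, or null if none do.
function resolveConfigPath(existingFiles: Set<string>): string | null {
  return CONFIG_CANDIDATES.find((p) => existingFiles.has(p)) ?? null
}
```

For example, if both `.gpt-runner/gptr.config.json` and `gptr.config.ts` exist, the `.gpt-runner` one wins because it appears earlier in the candidate list.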
  1. GPT-Runner then recursively retrieves all *.gpt.md files under the current folder.

  2. By default, this process skips files listed in the project's .gitignore, which saves time.

  3. You can change the retrieval scope by configuring the gptr.config.ts file.

  4. Each *.gpt.md file is parsed into an AI preset.
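The retrieval step can be pictured as a filter over a recursive file listing. This is a simplified sketch that assumes the ignored directories are already known; the real GPT-Runner walks the filesystem and parses .gitignore itself:

```typescript
// Keep only *.gpt.md files that are not inside an ignored directory.
function selectPresetFiles(allFiles: string[], ignoredDirs: string[]): string[] {
  return allFiles.filter((file) =>
    file.endsWith('.gpt.md')
    && !ignoredDirs.some((dir) => file.startsWith(`${dir}/`)),
  )
}

const files = ['src/doc.gpt.md', 'node_modules/pkg/x.gpt.md', 'README.md']
console.log(selectPresetFiles(files, ['node_modules'])) // → ['src/doc.gpt.md']
```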

📂 .gpt-runner Directory

  1. The <rootPath>/.gpt-runner/ directory is special: it is retrieved even if you list it in .gitignore. This is useful if you prefer that GPT-Runner not intrude into your project.

  2. You can put both gptr.config.json and *.gpt.md files in this directory.

  3. Then add .gpt-runner to .gitignore. This keeps the project clean while still letting GPT-Runner read its configuration files.

  4. If you want to git-ignore the .gpt-runner directory once and for all, you can set up a global git ignore with these commands:

git config --global core.excludesfile '~/.gitignore_global'

echo '.gpt-runner' >> ~/.gitignore_global

📄 gptr.config.ts/js/json Configuration Files

  1. gptr.config.ts/js/json is a project-level configuration file; its settings override the global configuration.

  2. Its configuration type is as follows:

export interface UserConfig {
  /**
   * Model configuration
   */
  model?: ModelConfig

  /**
   * Deep retrieval includes file paths, support glob
   * @default null
   */
  includes?: string | RegExp | string[] | RegExp[] | ((source: string) => boolean) | null

  /**
   * Deep retrieval excludes file paths, support glob
   * @default [
        "** /node_modules",
        "** /.git",
        "** /__pycache__",
        "** /.Python",
        "** /.DS_Store",
        "** /.cache",
        "** /.next",
        "** /.nuxt",
        "** /.out",
        "** /dist",
        "** /.serverless",
        "** /.parcel-cache"
      ]
   */
  excludes?: string | RegExp | string[] | RegExp[] | ((source: string) => boolean) | null

  /**
   * Skip the files in .gitignore
   * Recommended to turn on, this can save retrieval time
   * @default true
   */
  respectGitIgnore?: boolean

  /**
   * api url configuration
   * @default {}
   * @example
   * {
   *   "https://api.openai.com/*": {
   *     "modelNames": ["gpt-3.5-turbo-16k", "gpt-4"],
   *     "httpRequestHeader": {
   *       "User-Agent": "GPT-Runner"
   *     }
   *   }
   * }
   */
  urlConfig?: {
    [urlMatch: string]: {
      /**
       * The model names that will be displayed in the model selector
       */
      modelNames?: string[]

      /**
       * Additional request headers to send with each request
       */
      httpRequestHeader?: Record<string, string>
    }
  }
}

export interface ModelConfig {
  /**
   * Model type
   */
  type?: 'openai' | 'anthropic'

  /**
   * Model name
   */
  modelName?: string

  // ...more configurations please refer to specific model
}
  1. In gptr.config.ts, you can use the defineConfig function to get a fully typed UserConfig. It is exported by the @nicepkg/gpt-runner package:
npm i @nicepkg/gpt-runner
  1. Create a new gptr.config.ts file, then fill in this sample configuration:
import { defineConfig } from '@nicepkg/gpt-runner'

export default defineConfig({
  model: {
    type: 'openai',
    modelName: 'gpt-3.5-turbo-16k',
    temperature: 0.9,
  },
})
  1. Of course, you can also install our VSCode extension; it will provide autocompletion for your configuration file based on our JSON Schema.

  2. Here is a simple example of gptr.config.json:

{
  "model": {
    "type": "openai",
    "modelName": "gpt-3.5-turbo-16k"
  }
}
  1. Here is a complete example of gptr.config.json:
{
  "model": {
    "type": "openai",
    "modelName": "gpt-3.5-turbo-16k",
    "temperature": 0.9,
    "maxTokens": 2000,
    "topP": 1,
    "frequencyPenalty": 0,
    "presencePenalty": 0
  },
  "includes": null,
  "excludes": [
    "**/node_modules",
    "**/.git",
    "**/__pycache__",
    "**/.Python",
    "**/.DS_Store",
    "**/.cache",
    "**/.next",
    "**/.nuxt",
    "**/.out",
    "**/dist",
    "**/.serverless",
    "**/.parcel-cache"
  ],
  "respectGitIgnore": true,
  "urlConfig": {
    "https://openrouter.ai/*": {
      "modelNames": [
        "openai/gpt-3.5-turbo-16k",
        "openai/gpt-4",
        "openai/gpt-4-32k"
      ],
      "httpRequestHeader": {
        "HTTP-Referer": "http://localhost:3003/",
        "X-Title": "localhost"
      }
    }
  }
}

📑 xxx.gpt.md AI Preset Files

  1. xxx.gpt.md files are AI preset files; each file represents an AI character.

  2. For example, a uni-test.gpt.md preset could be dedicated to writing unit tests for the project, and a doc.gpt.md preset to writing its documentation.

  3. These presets are valuable because they can be reused by team members.

  4. Why not xxx.gpt.json? Because the content of the System Prompt and User Prompt would then often need escaped characters, which makes it very troublesome to write.

  5. xxx.gpt.md files are easy to write, read, and maintain.

  6. A minimalist AI preset file looks like this:

```json
{
  "title": "Category/AI character name"
}
```

# System Prompt

You're a coding master specializing in refactoring code. Please follow SOLID, KISS and DRY principles, and refactor this section of code to make it better.
  1. A complete AI preset file looks like this:
```json
{
  "title": "Category/AI Character Name",
  "model": {
    "type": "openai",
    "modelName": "gpt-3.5-turbo-16k",
    "temperature": 0.9,
    "maxTokens": 2000,
    "topP": 1,
    "frequencyPenalty": 0,
    "presencePenalty": 0
  }
}
```


# System Prompt

You are a coding master, skilled at refactoring code. Please adhere to the SOLID, KISS and DRY principles, and refactor this code to make it better.

# User Prompt

When you use this preset to create a new chat, the User Prompt text will automatically fill in the chat input box. You can edit it before sending it to the AI robot.

# Remark

You can write your remarks here. 

`model` / `modelName` / `temperature` / `System Prompt` / `User Prompt` are all **optional** parameters, and there are many more to customize.

You can also override many default parameter values through the `gptr.config.json` at the root directory of the project.
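Conceptually, the values a chat actually uses are the project-level defaults with the preset's front matter layered on top. The following is a simplified sketch of that override behavior (field names follow the examples above; the real merge logic may differ):

```typescript
interface ModelSettings {
  type?: string
  modelName?: string
  temperature?: number
}

// Hypothetical helper: fields set in the preset win over project defaults.
function mergeModelSettings(projectDefaults: ModelSettings, preset: ModelSettings): ModelSettings {
  return { ...projectDefaults, ...preset }
}

const merged = mergeModelSettings(
  { type: 'openai', modelName: 'gpt-3.5-turbo-16k', temperature: 0.9 }, // gptr.config.json
  { temperature: 0.2 },                                                 // xxx.gpt.md front matter
)
// merged.temperature is 0.2; type and modelName keep the project defaults.
```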

🤖 Chat Model Configuration

OpenAI

Official Request Parameters Documentation

export interface OpenaiModelConfig {
  type: 'openai'

  /**
   * Model name
   */
  modelName: string

  /**
   * Temperature
   */
  temperature?: number

  /**
   * Max reply token number
   */
  maxTokens?: number

  /**
   * Total probability mass of tokens per step
   */
  topP?: number

  /**
   * Penalize repeated tokens according to frequency
   */
  frequencyPenalty?: number

  /**
   * Penalizes repeated tokens
   */
  presencePenalty?: number
}

Anthropic

Official Request Parameters Documentation

export interface AnthropicModelConfig {
  type: 'anthropic'

  /**
   * Model name
   */
  modelName: string

  /**
   * Temperature
   */
  temperature?: number

  /**
   * Max reply token number
   */
  maxTokens?: number

  /**
   * Total probability mass of tokens per step
   */
  topP?: number

  /**
   * Only sample subsequent choices from the top K options
   */
  topK?: number
}
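By analogy with the OpenAI examples earlier, a gptr.config.json for an Anthropic model might look like the following (the modelName value is illustrative; use a model your account supports):

```json
{
  "model": {
    "type": "anthropic",
    "modelName": "claude-2",
    "temperature": 0.7,
    "maxTokens": 2000,
    "topP": 1,
    "topK": 5
  }
}
```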

🔍 Other

  1. If you have installed the GPT Runner VSCode extension, you can set this in .vscode/settings.json:
{
  "[markdown]": {
    "editor.quickSuggestions": {
      "other": true,
      "comments": false,
      "strings": true
    }
  }
}

This enables suggestions and quick code snippets in xxx.gpt.md files. For instance, create a new test.gpt.md file, type gptr, and hit Enter: you will quickly get a simple AI preset file.

  1. In the future, we will support more LLM models.