diff --git a/README.md b/README.md
index 0c97b74..8d25dc2 100644
--- a/README.md
+++ b/README.md
@@ -22,7 +22,7 @@ to your questions, right within the editor.
 This package is available from [JCS-ELPA](https://jcs-emacs.github.io/jcs-elpa/).
 Install from these repositories then you should be good to go!
 
-Normall, you don't need to add `(require 'codegpt)` to your confiugration since
+Normally, you don't need to add `(require 'codegpt)` to your configuration since
 most `codegpt` commands are autoload and can be called without loading the module!
 
 #### use-package
@@ -86,6 +86,14 @@ List of supported commands,
 | `codegpt-explain` | Explain the selected code        |
 | `codegpt-improve` | Improve, refactor or optimize it |
 
+## 📝 Customization
+
+#### 🧪 Variables
+
+- `codegpt-model` - ID of the model to use.
+- `codegpt-max-tokens` - The maximum number of tokens to generate in the completion.
+- `codegpt-temperature` - What sampling temperature to use.
+
 ## Contribute
 
 [![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](http://makeapullrequest.com)
diff --git a/codegpt.el b/codegpt.el
index b463b57..cc42ec1 100644
--- a/codegpt.el
+++ b/codegpt.el
@@ -58,6 +58,21 @@
   :type 'list
   :group 'codegpt)
 
+(defcustom codegpt-model "text-davinci-003"
+  "ID of the model to use."
+  :type 'string
+  :group 'codegpt)
+
+(defcustom codegpt-max-tokens 4000
+  "The maximum number of tokens to generate in the completion."
+  :type 'integer
+  :group 'codegpt)
+
+(defcustom codegpt-temperature 1.0
+  "What sampling temperature to use."
+  :type 'number
+  :group 'codegpt)
+
 ;;
 ;;; Application
@@ -91,7 +106,10 @@ boundaries of that region in buffer."
          (insert (string-trim result) "\n")
          (fill-region original-point (point))))
      (unless codegpt-focus-p
-       (select-window original-window))))
+       (select-window original-window)))
+   :model codegpt-model
+   :max-tokens codegpt-max-tokens
+   :temperature codegpt-temperature)
    (unless codegpt-focus-p
      (select-window original-window)))))
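
The user options introduced by this patch can be overridden from an init file. A minimal sketch of that usage (the values shown are illustrative, not recommendations):

```elisp
;; Illustrative overrides of the codegpt user options added above.
;; Only `codegpt-model' keeps its default here; the other values are
;; example choices, not defaults.
(setq codegpt-model "text-davinci-003"  ; ID of the model to use
      codegpt-max-tokens 2000           ; cap on tokens generated per completion
      codegpt-temperature 0.2)          ; lower sampling temperature, steadier output
```

Because these are `defcustom`s, they can equally be set through `M-x customize-group RET codegpt RET` instead of `setq`.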