GPT Token Utilities is a small library for encoding and decoding text to and from the tokenized format used by OpenAI's GPT models.
Our token encoder/decoder is optimized for a balance of speed and ease of use. No external dependencies are required.
Everything is written in TypeScript and includes type definitions with full documentation. This library is isomorphic and can be used in both Node and the browser!
GPT Token Utils balances a small footprint with a full-featured API. It's also tree-shakeable, so you can import only the functions you need.
yarn add gpt-token-utils
# or
npm install --save gpt-token-utils
The encode and decode exports are the main functions you'll use to work with GPT tokens.
import { encode, decode } from 'gpt-token-utils'
// Encode a string to a list of tokens...
const tokens = encode('Humans are strange creatures, and ever so curious too!')
// You've got a list of tokens!
console.log(tokens) // [32661, 504, 389, 6283, 8109, 11, 290, 1683, 523, 11040, 1165, 0]
// How many tokens are there?
console.log(tokens.length) // 12
// Can we decode it back to text?
console.log(decode(tokens)) // "Humans are strange creatures, and ever so curious too!"
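A common use for encode is checking whether a prompt fits within a model's context window before sending it. Here's a minimal sketch: fitsContext and stubEncode are hypothetical names (not part of this library), and the tokenizer is passed in as a parameter so the example is self-contained — in real use you'd pass this library's encode.

```typescript
// Hypothetical helper: does `prompt` fit in a model's token limit?
// The `encode` parameter stands in for a real tokenizer such as the
// `encode` export from gpt-token-utils.
function fitsContext(
  encode: (text: string) => number[],
  prompt: string,
  limit: number,
): boolean {
  return encode(prompt).length <= limit;
}

// Stub tokenizer for illustration only: one token per word.
const stubEncode = (text: string): number[] =>
  text.trim().split(/\s+/).map((_, i) => i);

console.log(fitsContext(stubEncode, 'Humans are strange creatures', 4096)); // true
```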
By default, GPT Token Utils includes a sizable vocabulary and encoder. Alternatively, you can pass in your own to customize the encoding/decoding process.
import {BytePairEncoder} from 'gpt-token-utils/BytePairEncoder'
const tokenEncoder = new BytePairEncoder({...})
import {BytePairDecoder} from 'gpt-token-utils/BytePairDecoder'
const tokenDecoder = new BytePairDecoder({...})
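To give a feel for what a byte-pair encoder does under the hood, here's a toy sketch of the core merge loop. This is not this library's implementation — the function name, the space-joined pair keys, and the merge table are all made up for illustration. A BPE tokenizer starts from individual characters and repeatedly merges the adjacent pair with the best (lowest) rank in its learned merge table.

```typescript
// Toy byte-pair merge loop (illustration only, not this library's code).
// A "pair" key is two adjacent symbols joined by a space, e.g. "l o".
function toyBpeEncode(word: string, ranks: Map<string, number>): string[] {
  // Start from individual characters.
  let symbols = word.split('');
  while (true) {
    // Find the adjacent pair with the lowest merge rank.
    let best: { index: number; rank: number } | null = null;
    for (let i = 0; i < symbols.length - 1; i++) {
      const rank = ranks.get(symbols[i] + ' ' + symbols[i + 1]);
      if (rank !== undefined && (best === null || rank < best.rank)) {
        best = { index: i, rank };
      }
    }
    if (best === null) break; // no known merges left
    // Merge the winning pair into a single symbol.
    symbols = [
      ...symbols.slice(0, best.index),
      symbols[best.index] + symbols[best.index + 1],
      ...symbols.slice(best.index + 2),
    ];
  }
  return symbols;
}

// Tiny made-up merge table: merge "l o" first, then "lo w".
const toyRanks = new Map<string, number>([['l o', 0], ['lo w', 1]]);
console.log(toyBpeEncode('lower', toyRanks)); // ['low', 'e', 'r']
```

A real encoder like the one bundled here also maps each final symbol to a numeric token id via its vocabulary; the sketch stops at the merged symbols to keep the idea visible.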
GPT Token Utils is licensed under the MIT License. If you've got something cool to share that's built with this library, let us know at @SisterSoftware! We would love to see it!