
# OpenAIAPI

A light wrapper around the OpenAI API, written in Swift.

The full OpenAI platform docs are here: https://platform.openai.com/docs/

## What's Implemented

- Completions (`createCompletion`)
- Edits (`createEdit`)
- Audio transcription and translation (`createTranscription`, `createTranslation`)
- Models (`listModels`, `retrieveModel`)

## Installation

### Swift Package Manager

1. In Xcode, select **File ▸ Add Packages…**
2. Paste `https://github.com/tranq72/OpenAIAPI.git`

To update, select **Packages ▸ Update to Latest Package Versions**.
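Alternatively, for a package-based project, the dependency can be declared directly in `Package.swift`. The version requirement below (`from: "1.0.0"`) is an assumption; check the repository's tags for the actual releases:

```swift
// Package.swift — add OpenAIAPI as a dependency.
// The version requirement is an assumption; adjust to match
// the tags published in the repository.
dependencies: [
    .package(url: "https://github.com/tranq72/OpenAIAPI.git", from: "1.0.0")
]
```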

## Usage

```swift
import OpenAIAPI

let openAI = OpenAIAPI(OpenAIAPIConfig(secret: "..."))
let config = OpenAIAPICompletionParms(max_tokens: 500, temperature: 0.9)

openAI.createCompletion(prompt: "Write a poem in the style of Dante about Steve Jobs", config: config) { (result: Result<OpenAIAPICompletionResponse, WebServiceError>) in
    switch result {
    case .success(let success):
        dump(success)
    case .failure(let failure):
        print(failure.localizedDescription)
    }
}
```

💡 OpenAIAPI supports Swift concurrency, e.g.:

```swift
Task {
    do {
        let result = try await openAI.retrieveModel("text-davinci-003")
        dump(result)
    } catch {
        print(error.localizedDescription)
    }
}
```
Edits:

```swift
let config = OpenAIAPIEditParms(n: 2)

openAI.createEdit(instruction: "Fix spelling and grammar", input: "The pens is an the taible", config: config) { (result: Result<OpenAIAPIEditResponse, WebServiceError>) in
    switch result {
    case .success(let success):
        dump(success)
    case .failure(let failure):
        print(failure.localizedDescription)
    }
}
```
Audio transcription and translation:

```swift
let config = OpenAIAPIAudioParms(prompt: nil, response_format: OpenAIAPIResponseFormat.json.name) //, language: Iso639_1.en.code)

openAI.createTranscription(filedata: audio, filename: "transcript.mp3", config: config) { (result: Result<OpenAIAPIAudioResponse, WebServiceError>) in
    switch result {
    case .success(let success):
        dump(success)
    case .failure(let failure):
        print(failure.localizedDescription)
    }
}

openAI.createTranslation(filedata: audio, filename: "transcript.mp3", config: config) { (result: Result<OpenAIAPIAudioResponse, WebServiceError>) in
    switch result {
    case .success(let success):
        dump(success)
    case .failure(let failure):
        print(failure.localizedDescription)
    }
}
```
Models:

```swift
openAI.listModels { (result: Result<OpenAIAPIModelsResponse, WebServiceError>) in
    switch result {
    case .success(let success):
        dump(success)
    case .failure(let failure):
        print(failure.localizedDescription)
    }
}

openAI.retrieveModel("text-davinci-003") { (result: Result<OpenAIAPIModelResponse, WebServiceError>) in
    switch result {
    case .success(let success):
        dump(success)
    case .failure(let failure):
        print("FAILED: \(failure.localizedDescription)")
    }
}
```

## Query parameters

Default values for most parameters can be overridden per query using the corresponding configuration objects: `OpenAIAPICompletionParms`, `OpenAIAPIEditParms`, `OpenAIAPIAudioParms`.

- For the full list of supported parameters and their default values, see `OpenAIAPIQueryParms.swift`.
- For the supported models, see `OpenAIAPIModel.swift`.
- The `createTranscription` request supports an optional `language` parameter (an ISO-639-1 code, see `Iso639_1.swift`) that can be used as a hint to improve accuracy and latency.
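As a minimal sketch of the language hint (assuming the `language` argument of `OpenAIAPIAudioParms` shown in the usage examples, with the remaining parameters left at their defaults):

```swift
// Hint that the audio is English ("en" in ISO-639-1) so the
// transcription endpoint can skip language detection.
let config = OpenAIAPIAudioParms(language: Iso639_1.en.code)
```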

## API secret

Keep your API secret secure and away from client apps. Instead of calling a paid third-party API such as OpenAI directly from the client, you should deploy a reverse proxy in your backend and set the `endpoint` and `secret` parameters accordingly.
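A minimal sketch of a proxied setup, assuming `OpenAIAPIConfig` accepts the `endpoint` parameter mentioned above; the proxy URL and token are hypothetical placeholders:

```swift
// Route requests through your own backend rather than api.openai.com.
// "https://api.example.com/openai" is a hypothetical proxy URL, and the
// secret here is whatever credential your proxy expects — the real
// OpenAI key stays on the server.
let config = OpenAIAPIConfig(secret: "<your-proxy-token>",
                             endpoint: "https://api.example.com/openai")
let openAI = OpenAIAPI(config)
```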

## Contributing

This is an initial draft implementation for a side project. Feel free to open a pull request if you spot a bug or would like to contribute.

## License

MIT