29 changes: 28 additions & 1 deletion README.md
Original file line number Diff line number Diff line change
@@ -158,7 +158,9 @@ openai.chat.completions.create(
)
```

## Sorbet Support
## LSP Support

### Sorbet

**This library emits an intentional warning under the [`tapioca` toolchain](https://github.com/Shopify/tapioca)**. This is normal and does not impact functionality.

@@ -184,6 +186,31 @@ openai.chat.completions.create(**model)

## Advanced

### Making custom/undocumented requests

This library is typed for convenient access to the documented API.

If you need to access undocumented endpoints, params, or response properties, you can still use the library.

#### Undocumented request params

If you want to explicitly send an extra param, you can do so via the `extra_query`, `extra_body`, and `extra_headers` keys under the `request_options:` parameter when making a request, as seen in the examples above.
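As a minimal sketch, the extra params are collected in a `request_options:` hash and passed alongside the documented params. The names `my_query_param`, `my_beta_flag`, and `x-my-header` below are hypothetical, not real API fields:

```ruby
# Hypothetical extra params -- illustrative names only, not real API fields.
request_options = {
  extra_query: {"my_query_param" => "value"},  # appended to the query string
  extra_body: {"my_beta_flag" => true},        # merged into the JSON request body
  extra_headers: {"x-my-header" => "value"}    # merged into the request headers
}

# Passed alongside the documented params, e.g.:
# client.chat.completions.create(
#   model: "gpt-4o",
#   messages: [{role: "user", content: "Say this is a test"}],
#   request_options: request_options
# )
```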

#### Undocumented endpoints

To make requests to undocumented endpoints, use `client.request`. Options on the client (such as retries) will be respected when making this request.

```ruby
response =
client.request(
method: :post,
path: '/undocumented/endpoint',
query: {"dog": "woof"},
headers: {"useful-header": "interesting-value"},
body: {"he": "llo"},
)
```

### Concurrency & Connection Pooling

`OpenAI::Client` instances are thread-safe and should be re-used across multiple threads. By default, each `Client` has its own HTTP connection pool, with a maximum number of connections equal to the thread count.
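A sketch of the re-use pattern: build the client once and share it across threads. `FakeClient` below stands in for `OpenAI::Client` so the example runs without network access; with the real gem you would construct the client the same way and share it identically.

```ruby
# Stand-in for OpenAI::Client, so this sketch runs offline.
class FakeClient
  def create(prompt)
    "echo: #{prompt}" # a real client would perform an HTTP request here
  end
end

# Build once; with the gem: client = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"])
client = FakeClient.new

# Each thread re-uses the same client; connections come from its shared pool.
results = 4.times.map do |i|
  Thread.new { client.create("request #{i}") }
end.map(&:value)
```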
34 changes: 23 additions & 11 deletions lib/openai/internal/transport/base_client.rb
@@ -395,27 +395,39 @@ def initialize(
# Execute the request specified by `req`. This is the method that all resource
# methods call into.
#
# @param req [Hash{Symbol=>Object}] .
# @overload request(method, path, query: {}, headers: {}, body: nil, unwrap: nil, page: nil, stream: nil, model: OpenAI::Internal::Type::Unknown, options: {})
#
# @option req [Symbol] :method
# @param method [Symbol]
#
# @option req [String, Array<String>] :path
# @param path [String, Array<String>]
#
# @option req [Hash{String=>Array<String>, String, nil}, nil] :query
# @param query [Hash{String=>Array<String>, String, nil}, nil]
#
# @option req [Hash{String=>String, Integer, Array<String, Integer, nil>, nil}, nil] :headers
# @param headers [Hash{String=>String, Integer, Array<String, Integer, nil>, nil}, nil]
#
# @option req [Object, nil] :body
# @param body [Object, nil]
#
# @option req [Symbol, nil] :unwrap
# @param unwrap [Symbol, nil]
#
# @option req [Class, nil] :page
# @param page [Class, nil]
#
# @option req [Class, nil] :stream
# @param stream [Class, nil]
#
# @option req [OpenAI::Internal::Type::Converter, Class, nil] :model
# @param model [OpenAI::Internal::Type::Converter, Class, nil]
#
# @param options [OpenAI::RequestOptions, Hash{Symbol=>Object}, nil] .
#
# @option options [String, nil] :idempotency_key
#
# @option options [Hash{String=>Array<String>, String, nil}, nil] :extra_query
#
# @option options [Hash{String=>String, nil}, nil] :extra_headers
#
# @option options [Object, nil] :extra_body
#
# @option options [Integer, nil] :max_retries
#
# @option req [OpenAI::RequestOptions, Hash{Symbol=>Object}, nil] :options
# @option options [Float, nil] :timeout
#
# @raise [OpenAI::Errors::APIError]
# @return [Object]
10 changes: 8 additions & 2 deletions lib/openai/resources/audio/transcriptions.rb
@@ -4,7 +4,10 @@ module OpenAI
module Resources
class Audio
class Transcriptions
# Transcribes audio into the input language.
# See {OpenAI::Resources::Audio::Transcriptions#create_streaming} for streaming
# counterpart.
#
# Transcribes audio into the input language.
#
# @overload create(file:, model:, include: nil, language: nil, prompt: nil, response_format: nil, temperature: nil, timestamp_granularities: nil, request_options: {})
#
@@ -37,7 +40,10 @@ def create(params)
)
end

# Transcribes audio into the input language.
# See {OpenAI::Resources::Audio::Transcriptions#create} for non-streaming
# counterpart.
#
# Transcribes audio into the input language.
#
# @overload create_streaming(file:, model:, include: nil, language: nil, prompt: nil, response_format: nil, temperature: nil, timestamp_granularities: nil, request_options: {})
#
9 changes: 7 additions & 2 deletions lib/openai/resources/beta/threads.rb
@@ -94,7 +94,9 @@ def delete(thread_id, params = {})
)
end

# Create a thread and run it in one request.
# See {OpenAI::Resources::Beta::Threads#stream_raw} for streaming counterpart.
#
# Create a thread and run it in one request.
#
# @overload create_and_run(assistant_id:, instructions: nil, max_completion_tokens: nil, max_prompt_tokens: nil, metadata: nil, model: nil, parallel_tool_calls: nil, response_format: nil, temperature: nil, thread: nil, tool_choice: nil, tool_resources: nil, tools: nil, top_p: nil, truncation_strategy: nil, request_options: {})
#
@@ -133,7 +135,10 @@ def create_and_run(params)
)
end

# Create a thread and run it in one request.
# See {OpenAI::Resources::Beta::Threads#create_and_run} for non-streaming
# counterpart.
#
# Create a thread and run it in one request.
#
# @overload stream_raw(assistant_id:, instructions: nil, max_completion_tokens: nil, max_prompt_tokens: nil, metadata: nil, model: nil, parallel_tool_calls: nil, response_format: nil, temperature: nil, thread: nil, tool_choice: nil, tool_resources: nil, tools: nil, top_p: nil, truncation_strategy: nil, request_options: {})
#
20 changes: 16 additions & 4 deletions lib/openai/resources/beta/threads/runs.rb
@@ -8,7 +8,10 @@ class Runs
# @return [OpenAI::Resources::Beta::Threads::Runs::Steps]
attr_reader :steps

# Create a run.
# See {OpenAI::Resources::Beta::Threads::Runs#create_stream_raw} for streaming
# counterpart.
#
# Create a run.
#
# @overload create(thread_id, assistant_id:, include: nil, additional_instructions: nil, additional_messages: nil, instructions: nil, max_completion_tokens: nil, max_prompt_tokens: nil, metadata: nil, model: nil, parallel_tool_calls: nil, reasoning_effort: nil, response_format: nil, temperature: nil, tool_choice: nil, tools: nil, top_p: nil, truncation_strategy: nil, request_options: {})
#
@@ -52,7 +55,10 @@ def create(thread_id, params)
)
end

# Create a run.
# See {OpenAI::Resources::Beta::Threads::Runs#create} for non-streaming
# counterpart.
#
# Create a run.
#
# @overload create_stream_raw(thread_id, assistant_id:, include: nil, additional_instructions: nil, additional_messages: nil, instructions: nil, max_completion_tokens: nil, max_prompt_tokens: nil, metadata: nil, model: nil, parallel_tool_calls: nil, reasoning_effort: nil, response_format: nil, temperature: nil, tool_choice: nil, tools: nil, top_p: nil, truncation_strategy: nil, request_options: {})
#
@@ -202,7 +208,10 @@ def cancel(run_id, params)
)
end

# When a run has the `status: "requires_action"` and `required_action.type` is
# See {OpenAI::Resources::Beta::Threads::Runs#submit_tool_outputs_stream_raw} for
# streaming counterpart.
#
# When a run has the `status: "requires_action"` and `required_action.type` is
# `submit_tool_outputs`, this endpoint can be used to submit the outputs from the
# tool calls once they're all completed. All outputs must be submitted in a single
# request.
@@ -236,7 +245,10 @@ def submit_tool_outputs(run_id, params)
)
end

# When a run has the `status: "requires_action"` and `required_action.type` is
# See {OpenAI::Resources::Beta::Threads::Runs#submit_tool_outputs} for
# non-streaming counterpart.
#
# When a run has the `status: "requires_action"` and `required_action.type` is
# `submit_tool_outputs`, this endpoint can be used to submit the outputs from the
# tool calls once they're all completed. All outputs must be submitted in a single
# request.
8 changes: 6 additions & 2 deletions lib/openai/resources/chat/completions.rb
@@ -7,7 +7,9 @@ class Completions
# @return [OpenAI::Resources::Chat::Completions::Messages]
attr_reader :messages

# **Starting a new project?** We recommend trying
# See {OpenAI::Resources::Chat::Completions#stream_raw} for streaming counterpart.
#
# **Starting a new project?** We recommend trying
# [Responses](https://platform.openai.com/docs/api-reference/responses) to take
# advantage of the latest OpenAI platform features. Compare
# [Chat Completions with Responses](https://platform.openai.com/docs/guides/responses-vs-chat-completions?api-mode=responses).
@@ -77,7 +79,9 @@ def create(params)
)
end

# **Starting a new project?** We recommend trying
# See {OpenAI::Resources::Chat::Completions#create} for non-streaming counterpart.
#
# **Starting a new project?** We recommend trying
# [Responses](https://platform.openai.com/docs/api-reference/responses) to take
# advantage of the latest OpenAI platform features. Compare
# [Chat Completions with Responses](https://platform.openai.com/docs/guides/responses-vs-chat-completions?api-mode=responses).
8 changes: 6 additions & 2 deletions lib/openai/resources/completions.rb
@@ -3,7 +3,9 @@
module OpenAI
module Resources
class Completions
# Creates a completion for the provided prompt and parameters.
# See {OpenAI::Resources::Completions#create_streaming} for streaming counterpart.
#
# Creates a completion for the provided prompt and parameters.
#
# @overload create(model:, prompt:, best_of: nil, echo: nil, frequency_penalty: nil, logit_bias: nil, logprobs: nil, max_tokens: nil, n: nil, presence_penalty: nil, seed: nil, stop: nil, stream_options: nil, suffix: nil, temperature: nil, top_p: nil, user: nil, request_options: {})
#
@@ -44,7 +46,9 @@ def create(params)
)
end

# Creates a completion for the provided prompt and parameters.
# See {OpenAI::Resources::Completions#create} for non-streaming counterpart.
#
# Creates a completion for the provided prompt and parameters.
#
# @overload create_streaming(model:, prompt:, best_of: nil, echo: nil, frequency_penalty: nil, logit_bias: nil, logprobs: nil, max_tokens: nil, n: nil, presence_penalty: nil, seed: nil, stop: nil, stream_options: nil, suffix: nil, temperature: nil, top_p: nil, user: nil, request_options: {})
#
8 changes: 6 additions & 2 deletions lib/openai/resources/responses.rb
@@ -6,7 +6,9 @@ class Responses
# @return [OpenAI::Resources::Responses::InputItems]
attr_reader :input_items

# Creates a model response. Provide
# See {OpenAI::Resources::Responses#stream_raw} for streaming counterpart.
#
# Creates a model response. Provide
# [text](https://platform.openai.com/docs/guides/text) or
# [image](https://platform.openai.com/docs/guides/images) inputs to generate
# [text](https://platform.openai.com/docs/guides/text) or
@@ -57,7 +59,9 @@ def create(params)
)
end

# Creates a model response. Provide
# See {OpenAI::Resources::Responses#create} for non-streaming counterpart.
#
# Creates a model response. Provide
# [text](https://platform.openai.com/docs/guides/text) or
# [image](https://platform.openai.com/docs/guides/images) inputs to generate
# [text](https://platform.openai.com/docs/guides/text) or
2 changes: 2 additions & 0 deletions rbi/lib/openai/internal/transport/base_client.rbi
@@ -166,6 +166,8 @@ module OpenAI

# Execute the request specified by `req`. This is the method that all resource
# methods call into.
#
# @overload request(method, path, query: {}, headers: {}, body: nil, unwrap: nil, page: nil, stream: nil, model: OpenAI::Internal::Type::Unknown, options: {})
sig do
params(
method: Symbol,
10 changes: 8 additions & 2 deletions rbi/lib/openai/resources/audio/transcriptions.rbi
@@ -4,7 +4,10 @@ module OpenAI
module Resources
class Audio
class Transcriptions
# Transcribes audio into the input language.
# See {OpenAI::Resources::Audio::Transcriptions#create_streaming} for streaming
# counterpart.
#
# Transcribes audio into the input language.
sig do
params(
file: T.any(IO, StringIO),
@@ -66,7 +69,10 @@ module OpenAI
)
end

# Transcribes audio into the input language.
# See {OpenAI::Resources::Audio::Transcriptions#create} for non-streaming
# counterpart.
#
# Transcribes audio into the input language.
sig do
params(
file: T.any(IO, StringIO),
9 changes: 7 additions & 2 deletions rbi/lib/openai/resources/beta/threads.rbi
@@ -99,7 +99,9 @@ module OpenAI
)
end

# Create a thread and run it in one request.
# See {OpenAI::Resources::Beta::Threads#stream_raw} for streaming counterpart.
#
# Create a thread and run it in one request.
sig do
params(
assistant_id: String,
@@ -243,7 +245,10 @@
)
end

# Create a thread and run it in one request.
# See {OpenAI::Resources::Beta::Threads#create_and_run} for non-streaming
# counterpart.
#
# Create a thread and run it in one request.
sig do
params(
assistant_id: String,
20 changes: 16 additions & 4 deletions rbi/lib/openai/resources/beta/threads/runs.rbi
@@ -8,7 +8,10 @@ module OpenAI
sig { returns(OpenAI::Resources::Beta::Threads::Runs::Steps) }
attr_reader :steps

# Create a run.
# See {OpenAI::Resources::Beta::Threads::Runs#create_stream_raw} for streaming
# counterpart.
#
# Create a run.
sig do
params(
thread_id: String,
@@ -177,7 +180,10 @@
)
end

# Create a run.
# See {OpenAI::Resources::Beta::Threads::Runs#create} for non-streaming
# counterpart.
#
# Create a run.
sig do
params(
thread_id: String,
@@ -474,7 +480,10 @@
)
end

# When a run has the `status: "requires_action"` and `required_action.type` is
# See {OpenAI::Resources::Beta::Threads::Runs#submit_tool_outputs_stream_raw} for
# streaming counterpart.
#
# When a run has the `status: "requires_action"` and `required_action.type` is
# `submit_tool_outputs`, this endpoint can be used to submit the outputs from the
# tool calls once they're all completed. All outputs must be submitted in a single
# request.
@@ -505,7 +514,10 @@
)
end

# When a run has the `status: "requires_action"` and `required_action.type` is
# See {OpenAI::Resources::Beta::Threads::Runs#submit_tool_outputs} for
# non-streaming counterpart.
#
# When a run has the `status: "requires_action"` and `required_action.type` is
# `submit_tool_outputs`, this endpoint can be used to submit the outputs from the
# tool calls once they're all completed. All outputs must be submitted in a single
# request.
8 changes: 6 additions & 2 deletions rbi/lib/openai/resources/chat/completions.rbi
@@ -7,7 +7,9 @@ module OpenAI
sig { returns(OpenAI::Resources::Chat::Completions::Messages) }
attr_reader :messages

# **Starting a new project?** We recommend trying
# See {OpenAI::Resources::Chat::Completions#stream_raw} for streaming counterpart.
#
# **Starting a new project?** We recommend trying
# [Responses](https://platform.openai.com/docs/api-reference/responses) to take
# advantage of the latest OpenAI platform features. Compare
# [Chat Completions with Responses](https://platform.openai.com/docs/guides/responses-vs-chat-completions?api-mode=responses).
@@ -275,7 +277,9 @@
)
end

# **Starting a new project?** We recommend trying
# See {OpenAI::Resources::Chat::Completions#create} for non-streaming counterpart.
#
# **Starting a new project?** We recommend trying
# [Responses](https://platform.openai.com/docs/api-reference/responses) to take
# advantage of the latest OpenAI platform features. Compare
# [Chat Completions with Responses](https://platform.openai.com/docs/guides/responses-vs-chat-completions?api-mode=responses).