We need to migrate from openai v0.28 to >=1.0. This is a pretty big upgrade. To ensure it goes smoothly, we must test all usages of the openai library without coupling the tests to the current call syntax. Tests should pass on v0.28 AND v1.0, so we know we have a successful upgrade.
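To keep the suite version-agnostic, one option is a small helper that branches on the installed SDK version (a hypothetical sketch; `is_openai_v1` is our own name, not part of the SDK):

```python
def is_openai_v1(version: str) -> bool:
    """Return True for openai SDK versions >= 1.0 (e.g. "1.3.5"), False for 0.x.

    Hypothetical test helper: pass it the SDK's version string and branch
    tests on the result so the same suite runs on v0.28 and v1.x.
    """
    return int(version.split(".")[0]) >= 1
```

Tests can then exercise the old or new call syntax depending on what `is_openai_v1` returns.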
V1.0 Change Documentation:
We have released a new major version of our SDK, and we recommend upgrading promptly.
It's a total rewrite of the library, so many things have changed, but we've made upgrading easy with a code migration script and detailed docs below. It was extensively [beta tested](https://github.com/openai/openai-python/discussions/631) prior to release.
## Getting started
```shell
pip install --upgrade openai
```
## What's changed
* Auto-retry with backoff if there's an error
* Proper types (for mypy/pyright/editors)
* You can now instantiate a client instead of using a global default (switch to explicit client instantiation)
* The Weights and Biases CLI is now in its own package
## Migration guide
_For Azure OpenAI users, see [Microsoft's Azure-specific migration guide](https://aka.ms/oai/v1-python-migration)._
### Automatic migration with grit
You can automatically migrate your codebase using [grit](https://grit.io/), either [online](https://app.grit.io/migrations/new/openai) or with the following CLI command on Mac or Linux:
```shell
openai migrate
```
The grit binary executes entirely locally with AST-based transforms.
Be sure to audit its changes: we suggest ensuring you have a clean working tree beforehand, and running `git add --patch` afterwards. Note that grit.io also offers opt-in automatic fixes powered by AI.
#### Automatic migration with grit on Windows
To use grit to migrate your code on Windows, you will need to use Windows Subsystem for Linux (WSL). [Installing WSL](https://learn.microsoft.com/en-us/windows/wsl/install) is quick and easy, and you do not need to keep using Linux once the command is done.
Here's a step-by-step guide for setting up and using WSL for this purpose:
1. Open a PowerShell or Command Prompt as an administrator and run `wsl --install`.
2. Restart your computer.
3. Open the WSL application.
4. In the WSL terminal, `cd` into the appropriate directory (e.g., `cd /mnt/c/Users/Myself/my/code/`) and then run the following commands:
```shell
curl -fsSL https://docs.grit.io/install | bash
grit install
grit apply openai
```
Then, you can close WSL and go back to using Windows.
#### Automatic migration with grit in Jupyter Notebooks
If your Jupyter notebooks are not in source control, they will be more difficult to migrate. You may want to copy each cell into grit's web interface, and paste the output back in.
If you need to migrate in a way that preserves use of the module-level client instead of instantiated clients, you can use [the openai_global grit migration](https://app.grit.io/migrations/new/openai_global) instead.
### Initialization
```python
import os

# old
import openai
openai.api_key = os.environ['OPENAI_API_KEY']

# new
from openai import OpenAI
client = OpenAI(
    api_key=os.environ['OPENAI_API_KEY'],  # this is also the default; it can be omitted
)
```
### Responses
Response objects are now pydantic models and no longer conform to the dictionary shape. However, you can easily convert them to a dictionary with `model.model_dump()`.
### Module client
**Important:** we highly recommend instantiating client instances instead of relying on the global client.
We also expose a global client instance that is accessible in a similar fashion to versions prior to v1.
```python
import openai

# optional; defaults to `os.environ['OPENAI_API_KEY']`
openai.api_key = '...'

# all client options can be configured just like the `OpenAI` instantiation counterpart
openai.base_url = "https://..."
openai.default_headers = {"x-foo": "true"}

completion = openai.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "user",
            "content": "How do I output all files in a directory using Python?",
        },
    ],
)
print(completion.choices[0].message.content)
```
The API is exactly the same as the standard client-instance-based API.
This is intended to be used within REPLs or notebooks for faster iteration, not in application code.
We recommend that you always instantiate a client (e.g., with `client = OpenAI()`) in application code because:
* It can be difficult to reason about where client options are configured
* It's not possible to change certain client options without potentially causing race conditions
* It's harder to mock for testing purposes
* It's not possible to control cleanup of network connections
### Pagination
All `list()` methods that support pagination in the API now support automatic iteration; previously you had to explicitly call a `.auto_paging_iter()` method (see the README for more details). For example:
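The example itself is missing here; based on the v1 SDK, auto-pagination usage looks roughly like this sketch (requires a valid API key; the fine-tuning job listing is illustrative):

```python
from openai import OpenAI

client = OpenAI()

# Pages are fetched lazily as the loop consumes them; no manual
# cursor handling or .auto_paging_iter() call is needed.
for job in client.fine_tuning.jobs.list(limit=20):
    print(job.id)
```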
### Azure OpenAI
To use this library with Azure OpenAI, use the `AzureOpenAI` class instead of the `OpenAI` class.
**Important:** the Azure API shape differs from the core API shape, which means the static types for responses/params won't always be correct.
```python
from openai import AzureOpenAI

# gets the API key from environment variable AZURE_OPENAI_API_KEY
client = AzureOpenAI(
    # https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#rest-api-versioning
    api_version="2023-07-01-preview",
    # https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/create-resource?pivots=web-portal#create-a-resource
    azure_endpoint="https://example-endpoint.openai.azure.com",
)

completion = client.chat.completions.create(
    model="deployment-name",  # e.g. gpt-35-instant
    messages=[
        {
            "role": "user",
            "content": "How do I output all files in a directory using Python?",
        },
    ],
)
print(completion.model_dump_json(indent=2))
```
In addition to the options provided in the base `OpenAI` client, the following options are provided:
* `azure_endpoint`
* `azure_deployment`
* `api_version`
* `azure_ad_token`
* `azure_ad_token_provider`
An example of using the client with Azure Active Directory can be found here.
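The link target is missing above; as a sketch (assuming the separate `azure-identity` package; parameter names per the v1 `AzureOpenAI` client), token-based auth looks roughly like:

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# Hand the client a callable that fetches Azure AD tokens on demand,
# instead of passing a static API key.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    api_version="2023-07-01-preview",
    azure_endpoint="https://example-endpoint.openai.azure.com",
    azure_ad_token_provider=token_provider,
)
```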
### Upgrade plan
1. Create a virtual environment (Python 3.10 is recommended for v1.0):

   ```shell
   python3 -m venv venv
   ```

2. Activate the virtual environment:

   ```shell
   source venv/bin/activate
   ```

3. Install the required packages:

   ```shell
   pip install -r requirements.txt
   ```

4. Modify your requirements.txt or Pipfile to specify the new version (`openai>=1.0`) and upgrade:

   ```shell
   pip install -U -r requirements.txt
   ```

5. Run the automatic migration script using grit:

   ```shell
   openai migrate
   ```

6. Run your tests (replace with your own test command):

   ```shell
   python -m unittest discover tests
   ```

7. Prepare a rollback plan in case any critical issues are discovered after deploying the updated code with openai v1.0.

It should go smoothly :)
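To verify the suite on both versions before committing to the upgrade, something like this (assuming `tests/` holds the unittest suite) can run locally or in CI:

```shell
# Run the test suite against the old pinned version, then the new one.
pip install 'openai==0.28.1'
python -m unittest discover tests

pip install --upgrade 'openai>=1.0'
python -m unittest discover tests
```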
*darinkishore changed the title to "Sweep: Set up tests for all OpenAI content for a migration to the 1.0 upgrade" on Dec 23, 2023.*
### Async client
We do not support calling asynchronous methods on the module-level client; instead, you will have to instantiate an async client. The rest of the API is exactly the same as the synchronous client.
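A minimal async sketch, based on the v1 SDK (requires a valid `OPENAI_API_KEY`):

```python
import asyncio

from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment by default

async def main() -> None:
    # Same call shape as the sync client, but awaited.
    completion = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(completion.choices[0].message.content)

asyncio.run(main())
```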
### All name changes
Note: all `a*` methods have been removed; the async client must be used instead.
* `openai.api_base` -> `openai.base_url`
* `openai.proxy` -> `openai.proxies` (docs)
* `openai.InvalidRequestError` -> `openai.BadRequestError`
* `openai.Audio.transcribe()` -> `client.audio.transcriptions.create()`
* `openai.Audio.translate()` -> `client.audio.translations.create()`
* `openai.ChatCompletion.create()` -> `client.chat.completions.create()`
* `openai.Completion.create()` -> `client.completions.create()`
* `openai.Edit.create()` -> `client.edits.create()`
* `openai.Embedding.create()` -> `client.embeddings.create()`
* `openai.File.create()` -> `client.files.create()`
* `openai.File.list()` -> `client.files.list()`
* `openai.File.retrieve()` -> `client.files.retrieve()`
* `openai.File.download()` -> `client.files.retrieve_content()`
* `openai.FineTune.cancel()` -> `client.fine_tunes.cancel()`
* `openai.FineTune.list()` -> `client.fine_tunes.list()`
* `openai.FineTune.list_events()` -> `client.fine_tunes.list_events()`
* `openai.FineTune.stream_events()` -> `client.fine_tunes.list_events(stream=True)`
* `openai.FineTune.retrieve()` -> `client.fine_tunes.retrieve()`
* `openai.FineTune.delete()` -> `client.fine_tunes.delete()`
* `openai.FineTune.create()` -> `client.fine_tunes.create()`
* `openai.FineTuningJob.create()` -> `client.fine_tuning.jobs.create()`
* `openai.FineTuningJob.cancel()` -> `client.fine_tuning.jobs.cancel()`
* `openai.FineTuningJob.delete()` -> `client.fine_tuning.jobs.create()`
* `openai.FineTuningJob.retrieve()` -> `client.fine_tuning.jobs.retrieve()`
* `openai.FineTuningJob.list()` -> `client.fine_tuning.jobs.list()`
* `openai.FineTuningJob.list_events()` -> `client.fine_tuning.jobs.list_events()`
* `openai.Image.create()` -> `client.images.generate()`
* `openai.Image.create_variation()` -> `client.images.create_variation()`
* `openai.Image.create_edit()` -> `client.images.edit()`
* `openai.Model.list()` -> `client.models.list()`
* `openai.Model.delete()` -> `client.models.delete()`
* `openai.Model.retrieve()` -> `client.models.retrieve()`
* `openai.Moderation.create()` -> `client.moderations.create()`
* `openai.api_resources` -> `openai.resources`
### Removed
* `openai.api_key_path`
* `openai.app_info`
* `openai.debug`
* `openai.log`
* `openai.OpenAIError`
* `openai.Audio.transcribe_raw()`
* `openai.Audio.translate_raw()`
* `openai.ErrorObject`
* `openai.Customer`
* `openai.api_version`
* `openai.verify_ssl_certs`
* `openai.api_type`
* `openai.enable_telemetry`
* `openai.ca_bundle_path`
* `openai.requestssession` (we now use httpx)
* `openai.aiosession` (we now use httpx)
* `openai.Deployment` (only used for Azure)
* `openai.Engine`
* `openai.File.find_matching_files()`
* `openai.embeddings_utils` (now in the cookbook)