Describe the feature or improvement you're requesting

From the blog on new embedding models:

Both of our new embedding models were trained with a technique that allows developers to trade off the performance and cost of using embeddings. Specifically, developers can shorten embeddings (i.e., remove some numbers from the end of the sequence) without the embedding losing its concept-representing properties, by passing in the dimensions API parameter. For example, on the MTEB benchmark, a text-embedding-3-large embedding can be shortened to a size of 256 while still outperforming an unshortened text-embedding-ada-002 embedding with a size of 1536.
Based on the API reference, this is available via a new parameter on the embedding request called dimensions (optional integer).
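As a sketch of the fallback the blog describes, an embedding can also be shortened client-side by truncating it and L2-renormalizing; the helper below is a hypothetical illustration (not part of any SDK), with the server-side dimensions parameter shown in a comment for comparison:

```python
import numpy as np

def shorten_embedding(embedding, dim):
    """Truncate an embedding to `dim` leading dimensions and L2-renormalize.

    Models trained with this technique concentrate the most useful
    information in the leading coordinates; renormalizing restores a
    unit-length vector for cosine-similarity use.
    """
    v = np.asarray(embedding, dtype=float)[:dim]
    return v / np.linalg.norm(v)

# With the dimensions parameter, the API does the shortening server-side
# (sketch, assuming the OpenAI Python client):
#   client.embeddings.create(
#       model="text-embedding-3-large",
#       input=text,
#       dimensions=256,
#   )
```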
Additional context
I'm happy to write a PR for this.