Closed as not planned
Labels: question (Further information is requested)
Description
Describe the bug
I've noticed that calling get_embedding repeatedly does not actually embed the tokens but returns the previous result. Here is a side-by-side comparison of from openai.embeddings_utils import get_embedding and tiktoken, using just a length check (a sketch of the check follows the output):
```
Generating Text.
1536 1320
1536 1707
1536 1986
1536 1901
1536 2239
1536 2429
1536 2406
1536 2364
Final Length: 1328 (which is normal)
```
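For reference, a minimal sketch of that length check. This is not the script that produced the output above: the chunk list and the engine name are placeholders, and it assumes a pre-1.0 openai-python that still ships openai.embeddings_utils, with an OPENAI_API_KEY set in the environment.

```python
# Minimal sketch of the length check described above -- not the original script.
# Assumptions: openai-python < 1.0 (which still ships openai.embeddings_utils),
# the text-embedding-ada-002 model, OPENAI_API_KEY set, and placeholder chunks.
import tiktoken
from openai.embeddings_utils import get_embedding

chunks = ["first text chunk ...", "second text chunk ..."]  # hypothetical inputs

# tiktoken tokenizer for the same embedding model
enc = tiktoken.encoding_for_model("text-embedding-ada-002")

print("Generating Text.")
for chunk in chunks:
    # Depending on the openai-python version the keyword is engine= or model=.
    embedding = get_embedding(chunk, engine="text-embedding-ada-002")
    tokens = enc.encode(chunk)
    # left: length of the returned embedding vector; right: token count
    print(len(embedding), len(tokens))
```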
To Reproduce
It should be simple enough to count tokens for different files with both get_embedding and tiktoken (see the sketch below).
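A reproduction sketch along those lines, again assuming the legacy openai.embeddings_utils helper and using hypothetical file paths:

```python
# Reproduction sketch (assumptions: openai-python < 1.0, text-embedding-ada-002,
# OPENAI_API_KEY set; the file paths below are placeholders).
import tiktoken
from openai.embeddings_utils import get_embedding

files = ["doc_a.txt", "doc_b.txt"]  # hypothetical files of different sizes
enc = tiktoken.encoding_for_model("text-embedding-ada-002")

for path in files:
    with open(path, encoding="utf-8") as f:
        text = f.read()
    embedding = get_embedding(text, engine="text-embedding-ada-002")  # or model=
    # embedding length vs. token count for each file
    print(path, len(embedding), len(enc.encode(text)))
```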
Code snippets
No response
OS
macOS
Python version
Python 3
Library version
openai-python