Tiktoken version is too old for gpt-3.5-turbo
#1881
Fix #1881: this issue occurs when using `'gpt-3.5-turbo'` with `VectorDBQAWithSourcesChain`.
I seem to be encountering the same issue when using
I get the same issue when I use AzureOpenAI gpt-3.5:

```
KeyError                                  Traceback (most recent call last)
Cell In[69], line 22
     20 PROMPT = PromptTemplate(template=prompt_template, input_variables=["text"])
     21 chain = load_summarize_chain(gpt35, chain_type="map_reduce", return_intermediate_steps=True, map_prompt=PROMPT, combine_prompt=PROMPT)
---> 22 chain({"input_documents": docs}, return_only_outputs=True)

File ~/.pyenv/versions/3.10.0/envs/local/lib/python3.10/site-packages/langchain/chains/base.py:116, in Chain.__call__(self, inputs, return_only_outputs)
    114 except (KeyboardInterrupt, Exception) as e:
    115     self.callback_manager.on_chain_error(e, verbose=self.verbose)
--> 116     raise e
    117 self.callback_manager.on_chain_end(outputs, verbose=self.verbose)
    118 return self.prep_outputs(inputs, outputs, return_only_outputs)

File ~/.pyenv/versions/3.10.0/envs/local/lib/python3.10/site-packages/langchain/chains/base.py:113, in Chain.__call__(self, inputs, return_only_outputs)
    107 self.callback_manager.on_chain_start(
    108     {"name": self.__class__.__name__},
    109     inputs,
    110     verbose=self.verbose,
    111 )
    112 try:
--> 113     outputs = self._call(inputs)
    114 except (KeyboardInterrupt, Exception) as e:
    115     self.callback_manager.on_chain_error(e, verbose=self.verbose)
...
     72     "Please use `tiktok.get_encoding` to explicitly get the tokeniser you expect."
     73 ) from None
     75 return get_encoding(encoding_name)

KeyError: 'Could not automatically map gpt-35-turbo to a tokeniser. Please use `tiktok.get_encoding` to explicitly get the tokeniser you expect.'
```
tiktoken's model map seems to be set up to handle the latest model names: https://github.com/openai/tiktoken/blob/main/tiktoken/model.py#L13
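The failure above is just a dictionary lookup: tiktoken maps known model names to encoding names, and Azure's deployment-style name `gpt-35-turbo` (no dot) is not one of the keys. A minimal sketch of that lookup, with an illustrative table and helper (not tiktoken's actual code):

```python
# Illustrative subset of the model -> encoding table from tiktoken's model.py
# (not the real module; just enough to reproduce the failure mode).
MODEL_TO_ENCODING = {
    "gpt-3.5-turbo": "cl100k_base",
    "gpt-4": "cl100k_base",
}

def encoding_name_for_model(model: str) -> str:
    """Mimics tiktoken.encoding_for_model's lookup: unknown names raise KeyError."""
    try:
        return MODEL_TO_ENCODING[model]
    except KeyError:
        raise KeyError(
            f"Could not automatically map {model} to a tokeniser. "
            "Please use `tiktoken.get_encoding` to explicitly get the tokeniser you expect."
        ) from None

print(encoding_name_for_model("gpt-3.5-turbo"))  # cl100k_base
# encoding_name_for_model("gpt-35-turbo") raises KeyError, matching the traceback.
```

As the error message itself suggests, a workaround when the model name can't be mapped is to request the encoding directly, e.g. `tiktoken.get_encoding("cl100k_base")`.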
Has anyone been able to solve this?
I had the same issue. It worked for me after updating the tiktoken version.
Hi Peter, I'm facing the same issue. Could you let me know which tiktoken version you used to resolve it? I have updated both tiktoken and langchain to the latest versions, but I'm still getting the error.
At the moment, I am using. |
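The exact version number was cut off above. A quick way to sanity-check whether an installed version string meets some floor is a tuple comparison; the helper below and the `(0, 3, 0)` floor are my own illustration, not a documented minimum, so check tiktoken's release notes for the real cutoff:

```python
def version_at_least(version: str, minimum: tuple = (0, 3, 0)) -> bool:
    """Compare a dotted version string against a minimum version tuple."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= minimum

print(version_at_least("0.1.2"))  # False: likely too old to know gpt-3.5-turbo
print(version_at_least("0.3.3"))  # True
```

In practice, `pip show tiktoken` or `importlib.metadata.version("tiktoken")` gives you the installed version string to feed in.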
Thanks for your response. Just to confirm: can we use gpt-35-turbo for text summarization or RetrievalQuestionAnswering kinds of work?
Any idea why chain_type='map_reduce' can't be used with a custom prompt template? If I set chain_type='map_reduce', the method doesn't accept prompt=PROMPT.
Yes, you can.
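On the `map_reduce` prompt question: each `chain_type` takes different prompt keyword arguments. As the call in the traceback above shows, `map_reduce` uses `map_prompt` and `combine_prompt`, while `stuff` takes a single `prompt`. A toy sketch of that dispatch (my own illustration, not LangChain's real code):

```python
def load_summarize_chain_sketch(chain_type: str, **kwargs) -> str:
    """Toy model of per-chain-type prompt arguments (not the real langchain API)."""
    allowed = {
        "stuff": {"prompt"},
        "map_reduce": {"map_prompt", "combine_prompt"},
    }[chain_type]
    unexpected = sorted(set(kwargs) - allowed)
    if unexpected:
        raise TypeError(f"{chain_type} got unexpected arguments: {unexpected}")
    return f"{chain_type} chain configured with {sorted(kwargs)}"

print(load_summarize_chain_sketch("map_reduce", map_prompt="P", combine_prompt="P"))
# load_summarize_chain_sketch("map_reduce", prompt="P") raises TypeError,
# which is why prompt=PROMPT is rejected for map_reduce.
```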