
Object of type StreamingStdOutCallbackHandler is not JSON serializable #4085

Closed

mrcaipeng opened this issue May 4, 2023 · 5 comments

@mrcaipeng
mrcaipeng commented May 4, 2023

[THIS JUST DOES NOT WORK IN A JUPYTER NOTEBOOK]
My code is from https://python.langchain.com/en/latest/modules/models/llms/examples/streaming_llm.html. I didn't change anything: I downloaded the ipynb file and executed it in my local Jupyter notebook. The langchain version is 0.0.157. I then saw the following warning and error log:

WARNING! callbacks is not default parameter.
callbacks was transfered to model_kwargs.
Please confirm that callbacks is what you intended.
TypeError Traceback (most recent call last)
Cell In[14], line 3
1 llm = OpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)
2 # llm = OpenAI(streaming=True, temperature=0)
----> 3 resp = llm("Write me a song about sparkling water.")

File /opt/miniconda3/lib/python3.9/site-packages/langchain/llms/base.py:246, in BaseLLM.__call__(self, prompt, stop)
244 def __call__(self, prompt: str, stop: Optional[List[str]] = None) -> str:
245 """Check Cache and run the LLM on the given prompt and input."""
--> 246 return self.generate([prompt], stop=stop).generations[0][0].text

File /opt/miniconda3/lib/python3.9/site-packages/langchain/llms/base.py:140, in BaseLLM.generate(self, prompts, stop)
138 except (KeyboardInterrupt, Exception) as e:
139 self.callback_manager.on_llm_error(e, verbose=self.verbose)
--> 140 raise e
141 self.callback_manager.on_llm_end(output, verbose=self.verbose)
142 return output

File /opt/miniconda3/lib/python3.9/site-packages/langchain/llms/base.py:137, in BaseLLM.generate(self, prompts, stop)
133 self.callback_manager.on_llm_start(
134 {"name": self.class.name}, prompts, verbose=self.verbose
135 )
136 try:
--> 137 output = self._generate(prompts, stop=stop)
138 except (KeyboardInterrupt, Exception) as e:
139 self.callback_manager.on_llm_error(e, verbose=self.verbose)

File /opt/miniconda3/lib/python3.9/site-packages/langchain/llms/openai.py:282, in BaseOpenAI._generate(self, prompts, stop)
280 params["stream"] = True
281 response = _streaming_response_template()
--> 282 for stream_resp in completion_with_retry(
283 self, prompt=_prompts, **params
284 ):
285 self.callback_manager.on_llm_new_token(
286 stream_resp["choices"][0]["text"],
287 verbose=self.verbose,
288 logprobs=stream_resp["choices"][0]["logprobs"],
289 )
290 _update_response(response, stream_resp)

File /opt/miniconda3/lib/python3.9/site-packages/langchain/llms/openai.py:102, in completion_with_retry(llm, **kwargs)
98 @retry_decorator
99 def _completion_with_retry(**kwargs: Any) -> Any:
100 return llm.client.create(**kwargs)
--> 102 return _completion_with_retry(**kwargs)

File /opt/miniconda3/lib/python3.9/site-packages/tenacity/__init__.py:289, in BaseRetrying.wraps.<locals>.wrapped_f(*args, **kw)
287 @functools.wraps(f)
288 def wrapped_f(*args: t.Any, **kw: t.Any) -> t.Any:
--> 289 return self(f, *args, **kw)

File /opt/miniconda3/lib/python3.9/site-packages/tenacity/__init__.py:379, in Retrying.__call__(self, fn, *args, **kwargs)
377 retry_state = RetryCallState(retry_object=self, fn=fn, args=args, kwargs=kwargs)
378 while True:
--> 379 do = self.iter(retry_state=retry_state)
380 if isinstance(do, DoAttempt):
381 try:

File /opt/miniconda3/lib/python3.9/site-packages/tenacity/__init__.py:314, in BaseRetrying.iter(self, retry_state)
312 is_explicit_retry = fut.failed and isinstance(fut.exception(), TryAgain)
313 if not (is_explicit_retry or self.retry(retry_state)):
--> 314 return fut.result()
316 if self.after is not None:
317 self.after(retry_state)

File /opt/miniconda3/lib/python3.9/concurrent/futures/_base.py:439, in Future.result(self, timeout)
437 raise CancelledError()
438 elif self._state == FINISHED:
--> 439 return self.__get_result()
441 self._condition.wait(timeout)
443 if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:

File /opt/miniconda3/lib/python3.9/concurrent/futures/_base.py:391, in Future.__get_result(self)
389 if self._exception:
390 try:
--> 391 raise self._exception
392 finally:
393 # Break a reference cycle with the exception in self._exception
394 self = None

File /opt/miniconda3/lib/python3.9/site-packages/tenacity/__init__.py:382, in Retrying.__call__(self, fn, *args, **kwargs)
380 if isinstance(do, DoAttempt):
381 try:
--> 382 result = fn(*args, **kwargs)
383 except BaseException: # noqa: B902
384 retry_state.set_exception(sys.exc_info()) # type: ignore[arg-type]

File /opt/miniconda3/lib/python3.9/site-packages/langchain/llms/openai.py:100, in completion_with_retry.<locals>._completion_with_retry(**kwargs)
98 @retry_decorator
99 def _completion_with_retry(**kwargs: Any) -> Any:
--> 100 return llm.client.create(**kwargs)

File /opt/miniconda3/lib/python3.9/site-packages/openai/api_resources/completion.py:25, in Completion.create(cls, *args, **kwargs)
23 while True:
24 try:
---> 25 return super().create(*args, **kwargs)
26 except TryAgain as e:
27 if timeout is not None and time.time() > start + timeout:

File /opt/miniconda3/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py:153, in EngineAPIResource.create(cls, api_key, api_base, api_type, request_id, api_version, organization, **params)
127 @classmethod
128 def create(
129 cls,
(...)
136 **params,
137 ):
138 (
139 deployment_id,
140 engine,
(...)
150 api_key, api_base, api_type, api_version, organization, **params
151 )
--> 153 response, _, api_key = requestor.request(
154 "post",
155 url,
156 params=params,
157 headers=headers,
158 stream=stream,
159 request_id=request_id,
160 request_timeout=request_timeout,
161 )
163 if stream:
164 # must be an iterator
165 assert not isinstance(response, OpenAIResponse)

File /opt/miniconda3/lib/python3.9/site-packages/openai/api_requestor.py:216, in APIRequestor.request(self, method, url, params, headers, files, stream, request_id, request_timeout)
205 def request(
206 self,
207 method,
(...)
214 request_timeout: Optional[Union[float, Tuple[float, float]]] = None,
215 ) -> Tuple[Union[OpenAIResponse, Iterator[OpenAIResponse]], bool, str]:
--> 216 result = self.request_raw(
217 method.lower(),
218 url,
219 params=params,
220 supplied_headers=headers,
221 files=files,
222 stream=stream,
223 request_id=request_id,
224 request_timeout=request_timeout,
225 )
226 resp, got_stream = self._interpret_response(result, stream)
227 return resp, got_stream, self.api_key

File /opt/miniconda3/lib/python3.9/site-packages/openai/api_requestor.py:509, in APIRequestor.request_raw(self, method, url, params, supplied_headers, files, stream, request_id, request_timeout)
497 def request_raw(
498 self,
499 method,
(...)
507 request_timeout: Optional[Union[float, Tuple[float, float]]] = None,
508 ) -> requests.Response:
--> 509 abs_url, headers, data = self._prepare_request_raw(
510 url, supplied_headers, method, params, files, request_id
511 )
513 if not hasattr(_thread_context, "session"):
514 _thread_context.session = _make_session()

File /opt/miniconda3/lib/python3.9/site-packages/openai/api_requestor.py:481, in APIRequestor._prepare_request_raw(self, url, supplied_headers, method, params, files, request_id)
479 data = params
480 if params and not files:
--> 481 data = json.dumps(params).encode()
482 headers["Content-Type"] = "application/json"
483 else:

File /opt/miniconda3/lib/python3.9/json/__init__.py:231, in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
226 # cached encoder
227 if (not skipkeys and ensure_ascii and
228 check_circular and allow_nan and
229 cls is None and indent is None and separators is None and
230 default is None and not sort_keys and not kw):
--> 231 return _default_encoder.encode(obj)
232 if cls is None:
233 cls = JSONEncoder

File /opt/miniconda3/lib/python3.9/json/encoder.py:199, in JSONEncoder.encode(self, o)
195 return encode_basestring(o)
196 # This doesn't pass the iterator directly to ''.join() because the
197 # exceptions aren't as detailed. The list call should be roughly
198 # equivalent to the PySequence_Fast that ''.join() would do.
--> 199 chunks = self.iterencode(o, _one_shot=True)
200 if not isinstance(chunks, (list, tuple)):
201 chunks = list(chunks)

File /opt/miniconda3/lib/python3.9/json/encoder.py:257, in JSONEncoder.iterencode(self, o, _one_shot)
252 else:
253 _iterencode = _make_iterencode(
254 markers, self.default, _encoder, self.indent, floatstr,
255 self.key_separator, self.item_separator, self.sort_keys,
256 self.skipkeys, _one_shot)
--> 257 return _iterencode(o, 0)

File /opt/miniconda3/lib/python3.9/json/encoder.py:179, in JSONEncoder.default(self, o)
160 def default(self, o):
161 """Implement this method in a subclass such that it returns
162 a serializable object for o, or calls the base implementation
163 (to raise a TypeError).
(...)
177
178 """
--> 179 raise TypeError(f'Object of type {o.__class__.__name__} '
180 f'is not JSON serializable')

TypeError: Object of type StreamingStdOutCallbackHandler is not JSON serializable
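The warning at the top of the log points to the root cause: this langchain version does not recognize callbacks as a constructor parameter, moves it into model_kwargs, and forwards it to the OpenAI API inside the request payload, where json.dumps fails on the handler object. A minimal sketch of that failure mode (assuming the import path used in the streaming docs):

import json

from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

# The handler instance ends up inside the JSON request body, which the
# standard-library encoder cannot serialize:
json.dumps({"callbacks": [StreamingStdOutCallbackHandler()]})
# TypeError: Object of type StreamingStdOutCallbackHandler is not JSON serializable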

@Lothiraldan
Contributor

Lothiraldan commented May 4, 2023

Hi, I'm facing the same error with other callbacks too: StdOutCallbackHandler and CometCallbackHandler.

I was able to come up with this reproduction script:

import os

from langchain.callbacks import StdOutCallbackHandler
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

os.environ["OPENAI_API_KEY"] = "sk-..."

handler = StdOutCallbackHandler()
llm = OpenAI()
prompt = PromptTemplate.from_template("1 + {number} = ")

chain = LLMChain(llm=llm, prompt=prompt, callbacks=[handler])

chain.save("chain.json")

Which is failing with the following traceback:

Traceback (most recent call last):
  File "/tmp/langchain_save_chain_bug.py", line 16, in <module>
    chain.save("chain.json")
  File "/home/lothiraldan/project/cometml/langchain/langchain/chains/base.py", line 300, in save
    json.dump(chain_dict, f, indent=4)
  File "/home/lothiraldan/.pyenv/versions/3.9.15/lib/python3.9/json/__init__.py", line 179, in dump
    for chunk in iterable:
  File "/home/lothiraldan/.pyenv/versions/3.9.15/lib/python3.9/json/encoder.py", line 431, in _iterencode
    yield from _iterencode_dict(o, _current_indent_level)
  File "/home/lothiraldan/.pyenv/versions/3.9.15/lib/python3.9/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/home/lothiraldan/.pyenv/versions/3.9.15/lib/python3.9/json/encoder.py", line 325, in _iterencode_list
    yield from chunks
  File "/home/lothiraldan/.pyenv/versions/3.9.15/lib/python3.9/json/encoder.py", line 438, in _iterencode
    o = _default(o)
  File "/home/lothiraldan/.pyenv/versions/3.9.15/lib/python3.9/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type StdOutCallbackHandler is not JSON serializable

I reproduced this after installing langchain at the following commit on master: a9c2450

Here is the output of pip list:

Package                 Version   Editable project location
----------------------- --------- -------------------------------------------
aiohttp                 3.8.4
aiosignal               1.3.1
async-timeout           4.0.2
attrs                   23.1.0
certifi                 2022.12.7
charset-normalizer      3.1.0
dataclasses-json        0.5.7
frozenlist              1.3.3
greenlet                2.0.2
idna                    3.4
langchain               0.0.157   /home/lothiraldan/project/cometml/langchain
marshmallow             3.19.0
marshmallow-enum        1.5.1
multidict               6.0.4
mypy-extensions         1.0.0
numexpr                 2.8.4
numpy                   1.24.3
openai                  0.27.6
openapi-schema-pydantic 1.2.4
packaging               23.1
pip                     23.0.1
pydantic                1.10.7
PyYAML                  6.0
requests                2.29.0
setuptools              67.6.1
SQLAlchemy              2.0.12
tenacity                8.2.2
tqdm                    4.65.0
typing_extensions       4.5.0
typing-inspect          0.8.0
urllib3                 1.26.15
wheel                   0.40.0
yarl                    1.9.2

I took a quick look and I'm wondering if that could be related to the following change: https://github.com/hwchase17/langchain/pull/3256/files#diff-48b6b4aaa6720728b3c7897708aacfbefa3228a2ae1d4f6342a9db4de1b5531dL24

Let me know how I can help debug this issue further.

@agola11
Collaborator

agola11 commented May 8, 2023

@mrcaipeng -- I'm unable to repro your issue on the latest version of langchain. Please make sure your langchain and openai packages are up to date.

@Lothiraldan -- Your issue is caused by the callbacks field not being excluded from the pydantic dict/json output. #4364 should fix this, along with updating the streaming docs to reference ChatAnthropic instead of Anthropic.
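For context, in pydantic v1 (the version pinned in the pip list above), a field can be kept out of dict()/json() with Field(exclude=True). A minimal sketch of that idea, using a made-up MyChain model rather than langchain's actual Chain class:

from typing import Any, List

from pydantic import BaseModel, Field


class MyChain(BaseModel):
    prompt: str
    # Runtime-only objects like callback handlers are excluded from
    # serialization so that dict()/json() stay JSON-safe:
    callbacks: List[Any] = Field(default_factory=list, exclude=True)


chain = MyChain(prompt="1 + {number} = ", callbacks=[object()])
print(chain.json())  # {"prompt": "1 + {number} = "} -- callbacks omitted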

EandrewJones pushed a commit to Oogway-Technologies/langchain that referenced this issue May 9, 2023
jpzhangvincent pushed a commit to jpzhangvincent/langchain that referenced this issue May 12, 2023
@Julian-Cao

Any update here?
How can this be fixed?

@Lothiraldan
Contributor

@agola11 Yes, my issue seems resolved. I'm sorry if it wasn't the same issue.

@dosubot

dosubot bot commented Sep 15, 2023

Hi, @mrcaipeng! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you reported an issue where an object of type StreamingStdOutCallbackHandler is not JSON serializable when running a code example from the langchain library in a Jupyter notebook. There have been some developments in the comments: Lothiraldan reported similar errors with other callbacks, agola11 suggested making sure both the langchain and openai packages are up to date, and Lothiraldan later confirmed that their issue seems to be resolved.

Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to the LangChain project!

@dosubot dosubot bot added the "stale" label (issue has not had recent activity or appears to be solved) Sep 15, 2023
@dosubot dosubot bot closed this as not planned Sep 22, 2023
@dosubot dosubot bot removed the "stale" label Sep 22, 2023