TypeError: Type is not JSON serializable: numpy.float64 #17876
Replies: 7 comments 2 replies
-
🤖 Hey @maximeperrindev, great to see you back! Running into some quirky bugs, I see? Let's get to the bottom of this.

The error you're encountering, "TypeError: Type is not JSON serializable: numpy.float64", typically occurs when you try to serialize a numpy.float64 object into JSON. JSON serialization converts an object's state into a JSON-formatted string, but not every Python object can be serialized: Python's json module only handles primitive datatypes and collections by default.

In your case there is no direct evidence of numpy.float64 being serialized into JSON, but the data processed by the LangChain application may contain numpy.float64 objects, either in the input data or generated during processing. The error might be occurring in the following lines of code:

```python
data = {"chunk": log_entry["streamed_output"][0]}
data = {"chunk": state["streamed_output"][0]}
output: Output = await asyncio.create_task(coro, context=context)  # type: ignore
output = await coro
await run_manager.on_chain_end(dumpd(output))
```

In these lines, if "streamed_output" or the output of the coroutine contains numpy.float64 objects, serializing them into JSON will raise this error. To fix it, convert numpy.float64 objects into Python's native float type before serialization. Here is an example:

```python
import numpy as np

# Assuming `data` is a numpy.float64 object
data = np.float64(10.5)

# Convert numpy.float64 to Python's native float type
data = float(data)

# Now `data` can be serialized into JSON format
```

Please note that this is a general solution; the actual fix will depend on the structure of your data and where exactly the error occurs. Regarding the structure of the data being processed in the LangChain application, it is structured as a query. This query is represented by the …

I hope this helps! If you have any more questions or need further clarification, feel free to ask.

Sources
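Since chain outputs are often nested dicts and lists rather than a single scalar, a general pattern is to walk the structure and convert every numpy scalar or array before handing the result to the JSON encoder. This is a minimal sketch (the function name `to_jsonable` is illustrative, not part of LangChain):

```python
import json
import numpy as np

def to_jsonable(obj):
    """Recursively convert numpy scalars/arrays to native Python types."""
    if isinstance(obj, np.generic):        # covers np.float64, np.int64, ...
        return obj.item()
    if isinstance(obj, np.ndarray):
        return obj.tolist()
    if isinstance(obj, dict):
        return {k: to_jsonable(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [to_jsonable(v) for v in obj]
    return obj

data = {"chunk": {"score": np.float64(0.87), "embedding": np.array([0.1, 0.2])}}
print(json.dumps(to_jsonable(data)))  # → {"chunk": {"score": 0.87, "embedding": [0.1, 0.2]}}
```

`np.generic` is the common base class of all numpy scalar types, so a single `isinstance` check catches `np.float64`, `np.int64`, and friends.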
-
@dosu-bot I don't even know what the object is... just try to find what's wrong with my own code
-
🤖 The error you're encountering is due to the fact that `numpy.float64` objects are not JSON serializable by default. To resolve this issue, you could convert the `numpy.float64` value to a native Python type before serializing, for example with a custom encoder:

```python
import numpy as np
import orjson

class CustomEncoder:
    def dumps(self, obj):
        if isinstance(obj, np.float64):
            return orjson.dumps(obj.item())
        return orjson.dumps(obj)

# Replace the original serializer with the custom one
self._serializer = CustomEncoder()
```

This custom encoder checks whether the object to be serialized is a `numpy.float64` and, if so, converts it to a native Python float with `.item()` before passing it to orjson.

Please note that this is a general solution and might need to be adapted depending on the specifics of your code. If this doesn't solve your issue, could you please provide more information about the data being processed when the error occurs? That would help in identifying a more specific solution.

Sources
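The traceback shows that langserve already calls `orjson.dumps(obj, default=default)`, i.e. it serializes through a `default` fallback hook. Both orjson and the stdlib `json` module support this hook: it is called for any object the encoder cannot handle natively. A sketch of that approach using the stdlib encoder for illustration (the helper name `np_default` is mine):

```python
import json
import numpy as np

def np_default(obj):
    # Fallback invoked only for objects the encoder can't serialize natively
    if isinstance(obj, np.generic):   # np.float64, np.int32, ...
        return obj.item()
    raise TypeError(f"Type is not JSON serializable: {type(obj).__name__}")

payload = {"similarity": np.float64(0.42)}
print(json.dumps(payload, default=np_default))  # → {"similarity": 0.42}
```

The same function can be passed as `default=` to `orjson.dumps`; raising `TypeError` for unknown types preserves the encoder's normal error behavior.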
-
@dosu-bot

```python
def _build_chain() -> Runnable:
    vector_store = MilvusVectorStore(dim=1536)
    retriever = _get_retriever(vector_store)
    llm = ChatOpenAI(model="gpt-3.5-turbo").configurable_alternatives(
        ConfigurableField(id="llm"),
        default_key="gpt-3.5-turbo",
        gpt_4_turbo_preview=ChatOpenAI(model="gpt-4-turbo-preview"),
    )
    retriever_chain = create_retriever_chain(llm, retriever) | RunnableLambda(
        format_docs
    ).with_config(run_name="FormatDocumentChunks")
    _context = RunnableMap(
        {
            "context": retriever_chain.with_config(run_name="RetrievalChain"),
            "question": RunnableLambda(itemgetter("question")).with_config(
                run_name="Itemgetter:question"
            ),
            "chat_history": RunnableLambda(itemgetter("chat_history")).with_config(
                run_name="Itemgetter:chat_history"
            ),
        }
    )
    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", RESPONSE_TEMPLATE),
            MessagesPlaceholder(variable_name="chat_history"),
            ("human", "{question}"),
        ]
    ).partial(current_date=datetime.now().isoformat())
    response_synthesizer = (prompt | llm | StrOutputParser()).with_config(
        run_name="GenerateResponse",
    )
    return (
        {
            "question": RunnableLambda(itemgetter("question")).with_config(
                run_name="Itemgetter:question"
            ),
            "chat_history": RunnableLambda(itemgetter("chat_history")).with_config(
                run_name="SerializeHistory"
            ),
        }
        | _context
        | response_synthesizer
    )

chain = _build_chain()

chain_with_history = RunnableWithMessageHistory(
    chain,
    get_chat_history,
    input_messages_key="question",
    history_messages_key="chat_history",
)

def add_routes(app: FastAPI) -> None:
    """Add routes to the FastAPI app."""
    langserve_add_routes(
        app,
        chain_with_history,
        # disabled_endpoints=["playground", "batch"],
    )
```

How can I fix it?
-
@maximeperrindev it looks like either the input or output (probably the output) of one of the chains is a numpy array. If you want to output it and are sending the data over a web server, you need to provide a way to encode the data as JSON. The easiest thing to do is add another RunnableLambda that takes the numpy value and outputs a representation that can be sent over the wire (i.e., search online for ways to encode numpy data as JSON).
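Following that suggestion, a final chain step could coerce numpy values into JSON-safe ones before langserve serializes them. This is a sketch under my own naming (`numpy_to_wire` is hypothetical, not a LangChain API); the commented line shows how it might be wired into the chain from the snippet above:

```python
import json
import numpy as np

def numpy_to_wire(value):
    # Encode numpy scalars/arrays as JSON-safe values before they reach the server
    if isinstance(value, np.generic):
        return value.item()
    if isinstance(value, np.ndarray):
        return value.tolist()
    return value

# Hypothetical wiring, appended as the chain's last step:
# chain = _build_chain() | RunnableLambda(numpy_to_wire)

print(json.dumps(numpy_to_wire(np.float64(3.14))))  # → 3.14
```

Because the lambda runs inside the chain, the coerced output also flows through the streaming endpoints, which is where the traceback shows the serialization failing.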
-
EmbeddingsFilter returns Sequence[_DocumentWithState]. I'm thinking about how to resolve it.
-
I fixed it. I rewrote EmbeddingsFilter to return plain Documents:

```python
class ToDocEmbeddingsFilter(EmbeddingsFilter):
    def compress_documents(
        self,
        documents: Sequence[Document],
        query: str,
        callbacks: Optional[Callbacks] = None,
    ) -> Sequence[Document]:
        docs = []
        for doc in super().compress_documents(documents, query, callbacks):
            docs.append(doc.to_document())
        return docs

def get_retriever():
    # embeddings = OpenAIEmbeddings()
    embeddings = get_embeddings_model()
    splitter = RecursiveCharacterTextSplitter(chunk_size=800, chunk_overlap=20)
    relevance_filter = ToDocEmbeddingsFilter(embeddings=embeddings, similarity_threshold=0.1)
```
-
Error Message and Stack Trace (if applicable)
```
RuntimeError: super(): __class__ cell not found

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 758, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 299, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 79, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
    await response(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/sse_starlette/sse.py", line 255, in __call__
    async with anyio.create_task_group() as task_group:
  File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "/usr/local/lib/python3.11/site-packages/sse_starlette/sse.py", line 258, in wrap
    await func()
  File "/usr/local/lib/python3.11/site-packages/sse_starlette/sse.py", line 245, in stream_response
    async for data in self.body_iterator:
  File "/usr/local/lib/python3.11/site-packages/langserve/api_handler.py", line 1085, in _stream_log
    "data": self._serializer.dumps(data).decode("utf-8"),
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langserve/serialization.py", line 168, in dumps
    return orjson.dumps(obj, default=default)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Type is not JSON serializable: numpy.float64
```
Description
I'm trying to run a LangServe API with RAG and a Runnable with message history, but I'm facing this bug.