
community[Qianfan]: re-arrange the addtional_kwargs of returned qianfan structure to avoid _merge_dict issue #18889

Merged

Conversation

Dobiichi-Origami (Contributor)

fix issue: #18441
PTAL, thanks
@baskaryan, @efriis, @eyurtsev, @hwchase17.


vercel bot commented Mar 11, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

1 Ignored Deployment
Name: langchain
Status: ⬜️ Ignored (Inspect)
Updated (UTC): Mar 12, 2024 5:41am

@dosubot dosubot bot added the size:S (This PR changes 10-29 lines, ignoring generated files), Ɑ: models (Related to LLMs or chat model modules), and 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature) labels Mar 11, 2024
@Dobiichi-Origami Dobiichi-Origami changed the title community: re-arrange the addtional_kwargs of returned structure to avoid _merge_dict issue community: re-arrange the addtional_kwargs of returned qianfan structure to avoid _merge_dict issue Mar 11, 2024
@Dobiichi-Origami Dobiichi-Origami changed the title community: re-arrange the addtional_kwargs of returned qianfan structure to avoid _merge_dict issue community[Qianfan]: re-arrange the addtional_kwargs of returned qianfan structure to avoid _merge_dict issue Mar 11, 2024
baskaryan and others added 3 commits March 11, 2024 20:18
…anfan/chunk_issue_fix

# Conflicts:
#	libs/community/langchain_community/chat_models/baidu_qianfan_endpoint.py
@Dobiichi-Origami (Contributor, Author)

A new version after running fmt has been uploaded, PTAL.
@baskaryan

@dosubot dosubot bot added the lgtm (PR looks good. Use to confirm that a PR is ready for merging.) label Mar 12, 2024
@baskaryan baskaryan enabled auto-merge (squash) March 12, 2024 05:41
@baskaryan baskaryan merged commit 471f2ed into langchain-ai:master Mar 12, 2024
59 checks passed
yu2tao1 commented Mar 15, 2024

@Dobiichi-Origami
Hi xiongdi, this commit causes the additional_kwargs to lose the function_call information.

  1. function_call is no longer included in the AIMessage's additional_kwargs:

    def _convert_dict_to_message(_dict: Mapping[str, Any]) -> AIMessage:
        content = _dict.get("result", "") or ""
        additional_kwargs: Mapping[str, Any] = {}
        if _dict.get("function_call"):
            additional_kwargs = {"function_call": dict(_dict["function_call"])}
            if "thoughts" in additional_kwargs["function_call"]:
                # align to api sample, which affects the llm function_call output
                additional_kwargs["function_call"].pop("thoughts")
        additional_kwargs = {**_dict.get("body", {}), **additional_kwargs}
        return AIMessage(
            content=content,
            additional_kwargs=dict(
                finish_reason=additional_kwargs.get("finish_reason", ""),
                request_id=additional_kwargs["id"],
                object=additional_kwargs.get("object", ""),
                search_info=additional_kwargs.get("search_info", []),
            ),
        )

  2. but function_call is still read downstream:

    additional_kwargs = msg.additional_kwargs.get("function_call", {})

  3. so function_call-related code may no longer work,
    such as in bce-qianfan-sdk:

https://github.com/baidubce/bce-qianfan-sdk/blob/dad3b9b3170007cc5411b2c797ce8646df8df186/python/qianfan/extensions/langchain/agents/baidu_qianfan_endpoint.py#L359-L361

Could you kindly take a look and fix this issue?
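
For illustration, a minimal sketch of how the conversion could keep function_call in additional_kwargs while still copying only selected fields out of the response body, so that downstream reads of msg.additional_kwargs["function_call"] keep working. The helper name is hypothetical and the response-dict shape (result, function_call, body with id/finish_reason/object/search_info) is assumed from the snippet above; this is not the merged implementation, and whether it also avoids the _merge_dict streaming issue from #18441 would still need to be verified.

    from typing import Any, Dict, Mapping

    from langchain_core.messages import AIMessage


    def _convert_dict_to_message_keep_function_call(_dict: Mapping[str, Any]) -> AIMessage:
        # Illustrative variant, not the merged implementation.
        content = _dict.get("result", "") or ""
        additional_kwargs: Dict[str, Any] = {}
        if _dict.get("function_call"):
            function_call = dict(_dict["function_call"])
            # align to api sample, which affects the llm function_call output
            function_call.pop("thoughts", None)
            additional_kwargs["function_call"] = function_call
        body = dict(_dict.get("body", {}))
        return AIMessage(
            content=content,
            additional_kwargs={
                # forward function_call (if any) so downstream agents can read
                # msg.additional_kwargs["function_call"]
                **additional_kwargs,
                # copy only selected fields instead of spreading the whole body
                "finish_reason": body.get("finish_reason", ""),
                "request_id": body.get("id", ""),
                "object": body.get("object", ""),
                "search_info": body.get("search_info", []),
            },
        )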

@sonyfoxcn commented:

(quotes yu2tao1's comment above in full)

Yes, I got the same issue. It's a poor fix.

bechbd pushed a commit to bechbd/langchain that referenced this pull request Mar 29, 2024
gkorland pushed a commit to FalkorDB/langchain that referenced this pull request Mar 30, 2024
hinthornw pushed a commit that referenced this pull request Apr 26, 2024