
OutputParserException: Could not parse LLM output: df = [] for i in range(len(df)): #7024

Closed
2 of 14 tasks
gunterzhang480 opened this issue Jul 1, 2023 · 1 comment
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@gunterzhang480

System Info

OutputParserException: Could not parse LLM output: df = [] for i in range(len(df)):

Who can help?

No response

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

I used an open-source LLM, HuggingFaceHub(repo_id="google/flan-t5-xxl"), rather than an OpenAI LLM, to create the agent below:

agent = create_csv_agent(HuggingFaceHub(repo_id="google/flan-t5-xxl"), r"D:\ML\titanic\titanic.csv", verbose=True)

The agent was created successfully:

But when I ran the agent with the queries below, I got the error above:

agent.run("Plot a bar chart comparing Survived vs Dead for Embarked")
agent.run("Find the count of missing values in each column")
agent.run("Fix the issues with missing values for column Embarked with mode")
agent.run("Drop cabin column")
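The failure is consistent with flan-t5-xxl emitting raw Python (`df = [] for i in range(len(df)):`) instead of the ReAct-style `Action:` / `Action Input:` text the agent's output parser expects. The sketch below is a hypothetical stand-in for that parsing step (not LangChain's actual parser); it only illustrates why plain code output cannot be turned into an agent step and therefore raises a "Could not parse LLM output" error:

```python
import re

def parse_agent_step(text: str):
    """Hypothetical sketch of ReAct-style output parsing.

    The real LangChain parser is more involved; this only shows why
    raw code output fails to parse into an agent action.
    """
    # A completed run ends with a "Final Answer:" marker.
    if "Final Answer:" in text:
        return ("finish", text.split("Final Answer:", 1)[1].strip())
    # An intermediate step must name a tool and its input.
    match = re.search(
        r"Action\s*:\s*(.+?)\s*Action\s*Input\s*:\s*(.+)", text, re.DOTALL
    )
    if match:
        return ("action", match.group(1).strip(), match.group(2).strip())
    # flan-t5-xxl returned plain code, so neither branch matches:
    raise ValueError(f"Could not parse LLM output: {text}")

well_formed = "Action: python_repl_ast\nAction Input: df.isnull().sum()"
print(parse_agent_step(well_formed))
# ('action', 'python_repl_ast', 'df.isnull().sum()')

try:
    parse_agent_step("df = [] for i in range(len(df)):")
except ValueError as e:
    print(e)  # the same error message shape as in this issue
```

Models tuned for instruction following but not for this agent format tend to hit this branch, which is why the same agent works with OpenAI models but not with flan-t5-xxl out of the box.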

Expected behavior

I should get a plot or other output.

@dosubot dosubot bot added the 🤖:bug label on Jul 1, 2023
@dosubot

dosubot bot commented Sep 30, 2023

Hi, @gunterzhang480! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you reported an issue related to an OutputParserException that occurs when trying to parse LLM output. It seems that you are using an open source LLM and encountering this error when running certain commands with the agent.

There hasn't been any activity or comments on the issue since you reported it. Before we close this issue, we wanted to check if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.

Thank you for your understanding and cooperation!

@dosubot dosubot bot added the stale label on Sep 30, 2023
@dosubot dosubot bot closed this as not planned on Oct 7, 2023
@dosubot dosubot bot removed the stale label on Oct 7, 2023