Fixing some errors #157

Open
wants to merge 4 commits into main
Conversation

@BenoitAnastay commented Aug 1, 2023

Several fixes to make it work.

@Summercuisine

The problem is still not resolved. After running, it still shows an error.

@BenoitAnastay
Author

> The problem is still not resolved. After running, it still shows an error.

Which one?

I didn't fix everything

@Summercuisine

> The problem is still not resolved. After running, it still shows an error.
>
> Which one?
>
> I didn't fix everything

```
Traceback (most recent call last):
  File "C:\Users\Administrator\Desktop\Free-Auto-GPT-main\BABYAGI.py", line 194, in <module>
    baby_agi({"objective": OBJECTIVE})
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\base.py", line 258, in __call__
    raise e
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\base.py", line 252, in __call__
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\Administrator\Desktop\Free-Auto-GPT-main\BabyAgi\BabyAGIMod.py", line 146, in _call
    result = self.execute_task(objective, task["task_name"])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\Desktop\Free-Auto-GPT-main\BabyAgi\BabyAGIMod.py", line 110, in execute_task
    return self.execution_chain.run(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\base.py", line 456, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\base.py", line 258, in __call__
    raise e
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\base.py", line 252, in __call__
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\agents\agent.py", line 1035, in _call
    next_step_output = self._take_next_step(
                       ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\agents\agent.py", line 832, in _take_next_step
    output = self.agent.plan(
             ^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\agents\agent.py", line 456, in plan
    full_output = self.llm_chain.predict(callbacks=callbacks, **full_inputs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\llm.py", line 252, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\base.py", line 258, in __call__
    raise e
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\base.py", line 252, in __call__
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\llm.py", line 92, in _call
    response = self.generate([inputs], run_manager=run_manager)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\chains\llm.py", line 102, in generate
    return self.llm.generate_prompt(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 455, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 586, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 492, in _generate_helper
    raise e
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 479, in _generate_helper
    self._generate(
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 967, in _generate
    else self._call(prompt, stop=stop, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\Desktop\Free-Auto-GPT-main\FreeLLM\ChatGPTAPI.py", line 47, in _call
    response = self.chatbot(prompt)
               ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 790, in __call__
    self.generate(
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 586, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 492, in _generate_helper
    raise e
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 479, in _generate_helper
    self._generate(
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\langchain\llms\base.py", line 967, in _generate
    else self._call(prompt, stop=stop, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\gpt4_openai\__init__.py", line 35, in _call
    for data in self.chatbot.ask(prompt=prompt,
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\revChatGPT\V1.py", line 610, in ask
    yield from self.post_messages(
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\revChatGPT\V1.py", line 563, in post_messages
    yield from self.__send_request(
  File "C:\Users\Administrator\miniconda3\Lib\site-packages\revChatGPT\V1.py", line 429, in __send_request
    raise ValueError(f"Field missing. Details: {str(line)}")
```

@BenoitAnastay
Author

The OpenAI API path isn't fixed; it needs more workarounds. It's better to use another provider.
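
A minimal sketch of what "use another provider" could look like in `BABYAGI.py`. Only `FreeLLM/ChatGPTAPI.py` is confirmed by the traceback above; the alternative provider name and the constructor arguments are placeholders:

```python
# Minimal sketch of swapping the LLM provider used by BABYAGI.py.
# Only FreeLLM/ChatGPTAPI.py is confirmed by the traceback; the alternative
# provider module and all constructor arguments below are placeholders.
from FreeLLM import ChatGPTAPI

# Current (failing) provider: reverse-engineered ChatGPT via revChatGPT.
llm = ChatGPTAPI.ChatGPT(token="YOUR_SESSION_TOKEN")  # constructor args assumed

# To switch providers, replace only this construction with another FreeLLM
# wrapper that exposes the same LangChain LLM interface, e.g.:
# from FreeLLM import SomeOtherProviderAPI        # placeholder module name
# llm = SomeOtherProviderAPI.SomeOtherLLM(...)    # placeholder class and args

# Everything downstream (the BabyAGI chains and agent) keeps using `llm` unchanged.
```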
