
Ollama: 'NoneType' object has no attribute 'request' #1208

Closed
evrenyal opened this issue Apr 18, 2024 · 19 comments · Fixed by #1267
Labels
bug Something isn't working

Comments

evrenyal commented Apr 18, 2024

Hi

I got errors like:

 opendevin:ERROR: agent_controller.py:175 - 'NoneType' object has no attribute 'request'
opendevin:ERROR: agent_controller.py:175 - LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=llama2
 Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)` Learn more: https://docs.litellm.ai/docs/providers

My observations:

  • I use Manjaro Linux.
  • Ollama itself works correctly; I tested it.
  • The web UI is working, and I selected ollama/llama2 in the settings.
  • My Docker (version 25.0.3, build 4debf411d1) command is:
sudo docker run \
    -e LLM_API_KEY="ollama" \
    -e LLM_MODEL="ollama/llama2" \
    -e LLM_EMBEDDING_MODEL="local" \
    -e LLM_BASE_URL="http://localhost:11434" \
    -e WORKSPACE_DIR="./workspace" \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -e SANDBOX_TYPE="exec" \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:latest

If there is something more stable than Llama, I can try that too.

evrenyal added the bug label on Apr 18, 2024
rbren (Collaborator) commented Apr 18, 2024

@evrenyal OpenAI and Claude seem to be much more stable than Ollama.

evrenyal (Author) commented Apr 18, 2024

Thank you for the reply, @rbren. I'm not talking about performance; Ollama doesn't work at all. I keep getting these errors.

enyst (Collaborator) commented Apr 18, 2024

According to this doc, the model name needs to be the full name as shown by `ollama list`. Can you please try that?

evrenyal (Author) commented Apr 18, 2024

ollama list
NAME                ID              SIZE    MODIFIED
llama2:latest       78e26419b446    3.8 GB  2 days ago
orca-mini:latest    2dbd9f439647    2.0 GB  2 days ago

docker run \
    -e LLM_API_KEY="ollama" \
    -e LLM_MODEL="ollama/llama2:latest" \
    -e LLM_EMBEDDING_MODEL="local" \
    -e LLM_BASE_URL="http://localhost:11434" \
    -e WORKSPACE_DIR="./workspace" \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -e SANDBOX_TYPE="exec" \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:latest

Unfortunately, I tried again and it still didn't work, @enyst.

enyst (Collaborator) commented Apr 18, 2024

Can you please paste the errors you get now? I'm not sure where the problem is if the settings are being taken into account, but I wonder first whether they were actually applied; you may need to set the model in the UI. We've been changing this behavior lately, afaik.

Umpire2018 (Contributor) commented Apr 19, 2024

@evrenyal That's strange; I can't reproduce your error.

export LLM_MODEL="ollama/gemma:2b"
export LLM_API_KEY="ollama"
export LLM_EMBEDDING_MODEL="local"
export WORKSPACE_DIR="./workspace"
export LLM_BASE_URL="http://localhost:11434"

python main.py --task "write a bash script that prints hello"

01:38:51 - opendevin:INFO: llm.py:25 - Initializing LLM with model: ollama/gemma:2b
01:38:52 - opendevin:INFO: ssh_box.py:271 - Container stopped
01:38:52 - opendevin:WARNING: ssh_box.py:283 - Using port forwarding for Mac OS. Server started by OpenDevin will not be accessible from the host machine at the moment. See https://github.com/OpenDevin/OpenDevin/issues/897 for more information.
01:38:52 - opendevin:INFO: ssh_box.py:309 - Container started
01:38:53 - opendevin:INFO: ssh_box.py:326 - waiting for container to start: 1, container status: running
01:38:55 - opendevin:INFO: agent_controller.py:154 - STEP 0
01:38:55 - opendevin:INFO: agent_controller.py:155 - write a bash script that prints hello
01:43:06 - opendevin:INFO: llm.py:25 - Initializing LLM with model: ollama/gemma:2b
01:43:06 - opendevin:INFO: ssh_box.py:271 - Container stopped
01:43:06 - opendevin:WARNING: ssh_box.py:283 - Using port forwarding for Mac OS. Server started by OpenDevin will not be accessible from the host machine at the moment. See https://github.com/OpenDevin/OpenDevin/issues/897 for more information.
01:43:07 - opendevin:INFO: ssh_box.py:309 - Container started
01:43:08 - opendevin:INFO: ssh_box.py:326 - waiting for container to start: 1, container status: running
01:43:09 - opendevin:INFO: agent_controller.py:154 - STEP 0
01:43:09 - opendevin:INFO: agent_controller.py:155 - write a bash script that prints hello
01:46:34 - opendevin:ERROR: agent_controller.py:175 - opendevin.action.agent.AgentThinkAction() argument after ** must be a mapping, not str
01:46:34 - opendevin:INFO: agent_controller.py:202 - opendevin.action.agent.AgentThinkAction() argument after ** must be a mapping, not str
01:46:34 - opendevin:INFO: agent_controller.py:154 - STEP 1
01:46:34 - opendevin:INFO: agent_controller.py:155 - write a bash script that prints hello
01:47:50 - opendevin:INFO: agent_controller.py:172 - AgentThinkAction(thought="It seems like there might be an existing project here. I should probably start by running `ls` to see what's here.", action=<ActionType.THINK: 'think'>)
01:47:50 - opendevin:INFO: agent_controller.py:154 - STEP 2
01:47:50 - opendevin:INFO: agent_controller.py:155 - write a bash script that prints hello
01:49:03 - opendevin:INFO: agent_controller.py:172 - AgentThinkAction(thought="It seems like there might be an existing project here. I should probably start by running `ls` to see what's here.", action=<ActionType.THINK: 'think'>)
01:49:03 - opendevin:INFO: agent_controller.py:154 - STEP 3
01:49:03 - opendevin:INFO: agent_controller.py:155 - write a bash script that prints hello

SmartManoj (Collaborator):

Run this to check whether the LLM is working properly.

import tomllib  # stdlib on Python 3.11+
from datetime import datetime
from litellm import completion

# Read the same config.toml the app uses.
with open('config.toml', 'rb') as f:
    config = tomllib.load(f)

messages = [{"role": "user", "content": "If there are 10 books in a room and I read 2, how many books are still in the room?"}]

start = datetime.now()
response = completion(model=config['LLM_MODEL'],
                      api_key=config['LLM_API_KEY'],
                      base_url=config.get('LLM_BASE_URL'),
                      messages=messages)

print(response.choices[0].message.content)

print('Used model:', config['LLM_MODEL'])
print(f"Time taken: {(datetime.now() - start).total_seconds():.1f}s")

MatthewSaintBull:

Having the same issue here. I noticed that it doesn't always pick up the env vars correctly, but it always prints out the same error. Also, trying to run

python main.py --task "write a bash script that prints hello"

from inside docker, it says

'NoneType' object has no attribute 'request'

SmartManoj (Collaborator) commented Apr 19, 2024

Need the full traceback.

evrenyal (Author):

@Umpire2018

 export LLM_MODEL="ollama/llama2"
export LLM_API_KEY="ollama"
export LLM_EMBEDDING_MODEL="local"
export WORKSPACE_DIR="./workspace"
export LLM_BASE_URL="http://localhost:11434"
root@33803e5885c6:/app/opendevin# python main.py --task "write a bash script that prints hello"
Running agent MonologueAgent (model: ollama/llama2, directory: None) with task: "write a bash script that prints hello"
12:15:47 - opendevin:INFO: llm.py:25 - Initializing LLM with model: ollama/llama2
12:15:47 - opendevin:INFO: exec_box.py:185 - Container stopped
12:15:47 - opendevin:INFO: exec_box.py:203 - Container started


==============
STEP 0

12:15:48 - PLAN
write a bash script that prints hello

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

12:15:48 - opendevin:ERROR: agent_controller.py:175 - 'NoneType' object has no attribute 'request'
12:15:48 - OBSERVATION
'NoneType' object has no attribute 'request'


[STEP 1 through STEP 39 repeat identically: the same LiteLLM info lines followed by opendevin:ERROR: agent_controller.py:175 - 'NoneType' object has no attribute 'request', all between 12:15:48 and 12:15:49.]

spoonbobo (Contributor):

Exact same issue as you, @evrenyal: 99 steps in one second X_X

docker run \
    -e LLM_API_KEY="ollama" \
    -e LLM_MODEL="ollama/llama3:8b" \
    -e LLM_EMBEDDING_MODEL="local" \
    -e LLM_BASE_URL="http://localhost:11434" \
    -e WORKSPACE_DIR="./shared_workspace" \
    -v /home/seasonbobo/Desktop/simsreal/shared_workspace:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    -e SANDBOX_TYPE=exec \
    --rm \
    ghcr.io/opendevin/opendevin:main

evrenyal (Author):

Similar to what you suggested, I got this result inside docker, @SmartManoj:

import requests
from litellm import completion

messages = [
    {
        "role": "user",
        "content": "Hello, how are you?"
    }
]

try:
    response = completion(model="ollama/llama2", 
                          base_url="http://172.17.0.2:11434",
                          messages=messages)

    if response.choices:
        print(response.choices[0].message.content)
    else:
        print("No response choices available.")
except requests.exceptions.ConnectionError as e:
    print(f"Failed to connect: {e}")
except Exception as e:
    print(f"An unexpected error occurred: {e}")

root@a9c665dae4f7:/app/opendevin# python test.py

I'm just an AI, I don't have feelings or emotions like humans do, so I can't really feel or respond to greetings like "hello" in the way that a human would. However, I'm here and ready to help answer any questions you may have! Is there something specific you'd like to know or talk about?

When I try it through the UI, it looks like this; it doesn't seem to be working properly:

sudo docker run \
    -e LLM_API_KEY="ollama" \
    -e LLM_MODEL="ollama/llama2"  \
    -e LLM_EMBEDDING_MODEL="local" \
    -e LLM_BASE_URL="http://172.17.0.2:11434" \
    -e WORKSPACE_DIR="./workspace" \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -e SANDBOX_TYPE="exec" \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main 
[sudo] password for tardis: 
INFO:     Started server process [1]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:3000 (Press CTRL+C to quit)
INFO:     172.17.0.1:59238 - "GET /index.html HTTP/1.1" 304 Not Modified
INFO:     172.17.0.1:59252 - "GET /assets/index-Ct4OjVSj.js HTTP/1.1" 304 Not Modified
INFO:     172.17.0.1:59264 - "GET /assets/index-BhJ7yrS1.css HTTP/1.1" 200 OK
INFO:     ('172.17.0.1', 59274) - "WebSocket /ws?token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzaWQiOiI4OGI0MmFjMy1mNGRiLTQwYmQtODQ2ZS01NjJhM2QxNjhkOWUifQ.k1vxU4PLQEmXmjFgjKmeapqRttDaxCsdqgswalZwlAc" [accepted]
INFO:     connection open
Starting loop_recv for sid: 88b42ac3-f4db-40bd-846e-562a3d168d9e
INFO:     172.17.0.1:59276 - "GET /locales/en/translation.json HTTP/1.1" 200 OK
14:51:09 - opendevin:INFO: agent.py:145 - Creating agent MonologueAgent using LLM ollama/llama2
14:51:09 - opendevin:INFO: llm.py:25 - Initializing LLM with model: ollama/llama2
14:51:09 - opendevin:INFO: exec_box.py:188 - Container stopped
14:51:09 - opendevin:INFO: exec_box.py:206 - Container started
INFO:     172.17.0.1:59292 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     172.17.0.1:59284 - "GET /api/refresh-files HTTP/1.1" 200 OK
INFO:     172.17.0.1:34732 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO:     172.17.0.1:34748 - "GET /api/agents HTTP/1.1" 200 OK


==============
STEP 0

14:51:23 - PLAN
Hello, how are you?
14:54:42 - ACTION
AgentThinkAction(thought='I should start by searching for information on how to complete my task.', action=<ActionType.THINK: 'think'>)


==============
STEP 1

14:54:42 - PLAN
Hello, how are you?
14:56:05 - opendevin:ERROR: agent_controller.py:178 - Invalid response: no JSON found
14:56:05 - OBSERVATION
Invalid response: no JSON found


==============
STEP 2

14:56:05 - PLAN
Hello, how are you?
14:57:28 - opendevin:ERROR: agent_controller.py:178 - Invalid response: no JSON found
14:57:28 - OBSERVATION
Invalid response: no JSON found


==============
STEP 3

14:57:28 - PLAN
Hello, how are you?
14:58:48 - opendevin:ERROR: agent_controller.py:178 - Invalid response: no JSON found
14:58:48 - OBSERVATION
Invalid response: no JSON found


==============
STEP 4

14:58:48 - PLAN
Hello, how are you?
15:00:48 - ACTION
AgentThinkAction(thought='I should start by searching for information on how to complete my task.', action=<ActionType.THINK: 'think'>)


==============
STEP 5

15:00:48 - PLAN
Hello, how are you?
15:02:06 - opendevin:ERROR: agent_controller.py:178 - Invalid response: no JSON found
15:02:06 - OBSERVATION
Invalid response: no JSON found


==============
STEP 6

15:02:06 - PLAN
Hello, how are you?
15:03:28 - opendevin:ERROR: agent_controller.py:178 - "'action' key is not found in action={'thought': 'How can I find information on how to complete my task?'}"
15:03:28 - OBSERVATION
"'action' key is not found in action={'thought': 'How can I find information on how to complete my task?'}"


[STEPS 7 through 9 continue alternating between "Invalid response: no JSON found" and the missing-'action'-key error.]

spoonbobo (Contributor):

I'm pretty sure Ollama on its own is fine for inference, @evrenyal.

rbren (Collaborator) commented Apr 19, 2024

I just triggered 'NoneType' object has no attribute 'request' by having LLM_BASE_URL set but not having Ollama running.

My guess is it's an issue connecting to Ollama from inside Docker; LLM_BASE_URL might need to point at host.docker.internal instead of localhost.
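
A quick way to confirm is to test reachability from inside the OpenDevin container. A minimal sketch, assuming curl is available in the image and using Ollama's /api/tags endpoint; <opendevin-container> is a placeholder for your container name or ID:

docker exec -it <opendevin-container> sh -c '
  for base in http://localhost:11434 http://host.docker.internal:11434; do
    # 200 = Ollama answered; 000 = no connection. Inside the container,
    # localhost is the container itself, so it is expected to fail here.
    printf "%s -> " "$base"
    curl -s -o /dev/null -w "%{http_code}\n" --max-time 5 "$base/api/tags"
  done'

Note that on Linux, host.docker.internal only resolves if the container was started with --add-host host.docker.internal=host-gateway.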

Umpire2018 (Contributor) commented Apr 19, 2024

@rbren

I don't think the problem here is connectivity. Still, using docker-compose may be a good way to avoid network issues, since Ollama also has a Docker image; see the sketch below.
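
Compose would essentially put both services on one user-defined network; the same effect, sketched with plain docker commands (the network name and model tag here are illustrative):

docker network create devin-net
docker run -d --name ollama --network devin-net -v ollama:/root/.ollama ollama/ollama
docker run --network devin-net \
    -e LLM_API_KEY="ollama" \
    -e LLM_MODEL="ollama/llama2" \
    -e LLM_EMBEDDING_MODEL="local" \
    -e LLM_BASE_URL="http://ollama:11434" \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -e SANDBOX_TYPE="exec" \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main

With both containers on the same network, OpenDevin reaches Ollama by container name at http://ollama:11434, sidestepping localhost and host-gateway issues entirely.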

From @evrenyal's logs (quoted in full above): the CLI run fails instantly on every step with 'NoneType' object has no attribute 'request', while the UI run does reach the model but then loops on "Invalid response: no JSON found".

spoonbobo (Contributor):

@rbren

docker run \
    --add-host host.docker.internal=host-gateway \
    -e LLM_API_KEY="ollama" \
    -e LLM_MODEL="ollama/llama3:8b" \
    -e LLM_EMBEDDING_MODEL="local" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_DIR="./shared_workspace" \
    -v /home/seasonbobo/Desktop/simsreal/shared_workspace:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    -e SANDBOX_TYPE=exec \
    --rm \
    ghcr.io/opendevin/opendevin:main

I've tried something like this too; 99 steps in one second again.

tpsjr7 commented Apr 19, 2024

@evrenyal

I had an issue with OpenDevin not reading my config.toml variables because the browser's local storage had old settings that overrode whatever I tried to set in the config. That also seemed to cause the "opendevin:ERROR: agent_controller.py:175 - LLM Provider NOT provided. Pass in the LLM provider you are trying to call" error for me.

I was able to overcome it by stopping the app, rechecking my LLM settings, then clearing the local storage in my browser. I think I also hit the gear icon in the bottom-left corner to set some settings.

Some combination of those steps resolved the error. In Chrome: right click -> Inspect -> Application tab -> Storage section -> expand Local storage, then right click the entries to clear them.

I think it's a bug (or a feature) that the LLM_MODEL setting is ignored in favor of whatever is in the browser's local storage, since you can also set it in the browser with the gear icon.

@rbren rbren changed the title Ollama issues 'NoneType' object has no attribute 'request' Apr 19, 2024
@rbren rbren changed the title 'NoneType' object has no attribute 'request' Ollama: 'NoneType' object has no attribute 'request' Apr 19, 2024
SmartManoj (Collaborator):

@spoonbobo, Could you please provide the logs?

zhonggegege:

I get the same error. A local test can connect to the Ollama service, and the configuration is correct.

In the UI, the model is set to ollama/codeqwen:chat.

docker run \
    --add-host host.docker.internal=host-gateway \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://localhost:11434" \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_DIR \
    -v $WORKSPACE_DIR:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main

Log:
07:36:15 - opendevin:ERROR: agent_controller.py:178 - 'NoneType' object has no attribute 'request'
07:36:15 - OBSERVATION
'NoneType' object has no attribute 'request'
07:36:15 - opendevin:INFO: agent_controller.py:145 - Task state set to TaskState.PAUSED
07:36:15 - opendevin:INFO: agent_controller.py:109 - Task paused
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
