
Error: connect ECONNREFUSED 127.0.0.1:3000 #568

Closed
lijom80 opened this issue Apr 2, 2024 · 14 comments
Labels
bug Something isn't working

Comments

lijom80 commented Apr 2, 2024

Describe the bug

On the frontend, I am getting this error:
ERROR: Failed connection to server. Please ensure the server is reachable at ws://:3001/ws.

On the backend, I am getting the error below:
Running the app...

opendevin-frontend@0.1.0 start
vite --port 3001 --host 0.0.0.0

VITE v5.2.7 ready in 578 ms

➜ Local: http://localhost:3001/
➜ Network: http://:3001/
➜ press h + enter to show help
4:38:45 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16)
4:38:45 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16)
4:38:45 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16) (x2)
4:38:45 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16)
4:38:45 PM [vite] ws proxy error:
Error: connect ECONNREFUSED 127.0.0.1:3000
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16)

Setup and configuration

Current version:

commit c37124d (HEAD -> main, origin/main, origin/HEAD)
Author: Xingyao Wang xingyao6@illinois.edu
Date: Tue Apr 2 14:58:28 2024 +0800

My config.toml and environment vars (be sure to redact API keys):

cat config.toml
LLM_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxx"
LLM_MODEL="open-devin-preview-gpt4"
WORKSPACE_DIR="./workspace"

My model and agent (you can see these settings in the UI):
The UI shows "Initializing agent (may take up to 10 seconds)..." and nothing after that.

Commands I ran to install and run OpenDevin:

Steps to Reproduce:

  1. The only change I made, apart from following the provided instructions, is in the Makefile:
     @cd frontend && npm run start -- --port $(FRONTEND_PORT) --host "0.0.0.0"
     (added --host "0.0.0.0" so the frontend is accessible over the network)

Logs, error messages, and screenshots:
[screenshot]

Additional Context

lijom80 added the bug label on Apr 2, 2024
xcodebuild (Contributor) commented

> Error: connect ECONNREFUSED 127.0.0.1:3000

It looks like you forgot to start the backend (or mock server) on port 3000.
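
A quick way to confirm this (a minimal sketch; the /litellm-models route is taken from the proxy log above):

```sh
# ECONNREFUSED from the Vite proxy means no process accepted the
# connection on 127.0.0.1:3000. Check whether anything listens there:
lsof -i :3000        # empty output => the backend is not running

# Or probe one of the proxied routes directly:
curl -i http://127.0.0.1:3000/litellm-models
```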

huybery (Member) commented Apr 2, 2024

#576

CatalinCiocea commented

Open the file opendevin/frontend/vite.config and correct these lines:

```ts
const BACKEND_HOST = process.env.BACKEND_HOST || "127.0.0.1:3000";

// check BACKEND_HOST is something like "example.com"
if (!BACKEND_HOST.match(/^([\w\d-]+(\.[\w\d-]+)+(:\d+)?)/)) {
  throw new Error(
    `Invalid BACKEND_HOST ${BACKEND_HOST}, example BACKEND_HOST 127.0.0.1:3000`,
  );
}
```

to these corrected ones:

```ts
const BACKEND_HOST = process.env.BACKEND_HOST || "127.0.0.1:3001";

// check BACKEND_HOST is something like "example.com"
if (!BACKEND_HOST.match(/^([\w\d-]+(\.[\w\d-]+)+(:\d+)?)/)) {
  throw new Error(
    `Invalid BACKEND_HOST ${BACKEND_HOST}, example BACKEND_HOST 127.0.0.1:3001`,
  );
}
```

The error is that the frontend tries to communicate on 127.0.0.1:3000 while the backend communicates on 127.0.0.1:3001. After that, save and rerun. It will work now.

foragerr (Collaborator) commented Apr 2, 2024

> the only change I made apart from the instructions provided is in the makefile
> @cd frontend && npm run start -- --port $(FRONTEND_PORT) --host "0.0.0.0" <- added 0.0.0.0 to be accessible over the network.

Why do you need the backend to be accessible over the network? Are you running the backend and frontend on different boxes? I suspect your setup will run fine if you drop --host "0.0.0.0".

If you need to keep it, you need to set the environment variable BACKEND_HOST to your backend server's IP address or DNS name.
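
For example (a sketch; the address below is a hypothetical placeholder for your backend host):

```sh
# Hypothetical example: point the frontend's proxy at the backend's
# network address instead of the default 127.0.0.1:3000.
export BACKEND_HOST="192.168.1.50:3000"   # replace with your backend's IP or DNS name
make run
```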

JustinLin610 (Contributor) commented

You should check the logs. I guess it is also related to #573.

lijom80 (Author) commented Apr 2, 2024

#576

@JustinLin610 - Yes, that is the issue.
When I tried wget without sudo, I got:
Cannot write to ‘/tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/config.json’ (Success)
The fact that it prints (Success) at the end could be misleading.
With sudo:
2024-04-02 21:46:29 (2.78 MB/s) - ‘/tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/config.json’ saved [190/190]

Success
Starting backend...
INFO: Started server process [12999]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)

I have a question though: do I have to start the backend separately? If so, where can I add a nohup?

foragerr (Collaborator) commented Apr 2, 2024

The newest Makefile has a run target that runs both the backend and the frontend for you.

xcodebuild (Contributor) commented

> do I have to start the backend separately? If so, where can I add a nohup?

@lijom80 You don't need to; make run now starts both the frontend and the backend at the same time and shows their output simultaneously.
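
For reference, the intended flow looks like this (a sketch based on the Makefile targets mentioned in this thread):

```sh
# One-time setup: install the frontend and backend dependencies.
make build

# Start the backend and frontend together; both logs appear in one terminal.
make run
```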

Francescodotta commented Apr 3, 2024

I've got the same problem with the new make run command; it gives me this error:

5:39:57 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:39:57 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:39:57 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:39:57 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:39:58 PM [vite] ws proxy error:
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] ws proxy error:
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)

xcodebuild (Contributor) commented

The backend always takes some time to start. Is there any output in the backend logs?

Francescodotta commented
No, thanks; for me the issue was a CUDA library not properly installed on WSL.
I solved it.

lijom80 (Author) commented Apr 4, 2024 via email

xcodebuild (Contributor) commented

> My services are running as well. I realised that after rebooting the node and running the backend again, it installed a few more components. Also, the new make run does not run the backend automatically for some reason. I have a question: what are the ideal infra requirements to make it run smoothly?

make build will install the dependencies needed to run smoothly.

yufansong (Collaborator) commented

It seems the user can run OpenDevin now, but we should still keep an eye on the slow backend launch problem.
