
Langchain nodes not respecting HTTP proxy settings #10901

Open
rasmusson opened this issue Sep 20, 2024 · 12 comments · May be fixed by #13678
Labels
in linear Issue or PR has been created in Linear for internal review

Comments

@rasmusson

Bug Description

When using Langchain model nodes such as OpenAI and Gemini, the HTTP calls to the model providers do not use the HTTP proxy defined for n8n; they go straight to the internet.

I found this because my n8n instance is not allowed to access the internet directly. Allowing direct access resolves the issue.

To Reproduce

  1. Set up n8n to use an HTTP proxy via the environment variables HTTP_PROXY and HTTPS_PROXY.
  2. Turn off internet access except through the proxy.
  3. Set up and run a simple AI flow using a chat trigger, a conversational agent, and an OpenAI model.
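
The proxy setup in step 1 can be sketched like this for a Docker deployment (the proxy host `proxy.internal:3128` is hypothetical; substitute your own):

```shell
# Pass the proxy variables into the n8n container; both upper- and
# lowercase spellings are set because different libraries read different ones.
docker run -d --name n8n -p 5678:5678 \
  -e HTTP_PROXY=http://proxy.internal:3128 \
  -e HTTPS_PROXY=http://proxy.internal:3128 \
  -e http_proxy=http://proxy.internal:3128 \
  -e https_proxy=http://proxy.internal:3128 \
  -e NO_PROXY=localhost,127.0.0.1 \
  docker.n8n.io/n8nio/n8n
```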

Expected behavior

Preferably, the AI model nodes should use the proxy defined for n8n. If not, there should be a way to configure a proxy separately for these nodes.

Operating System

Docker

n8n Version

1.59.3

Node.js Version

Provided by docker image

Database

SQLite (default)

Execution mode

main (default)

@Joffcom
Member

Joffcom commented Sep 20, 2024

Hey @rasmusson,

We have created an internal ticket to look into this which we will be tracking as "GHC-244"

@Joffcom Joffcom added the in linear Issue or PR has been created in Linear for internal review label Sep 20, 2024
@Joffcom
Member

Joffcom commented Sep 24, 2024

Hey @rasmusson,

I suspect this is not specific to the Langchain nodes. It is likely linked to another open general issue where Axios, the HTTP package we use, doesn't pick up system-defined proxies correctly when the proxy is an HTTP proxy, due to a failure with the CONNECT method.

@rasmusson
Author

OK. As I understand it, the Axios issue resulted in a bad gateway error. In my case the Langchain nodes don't give any error; the requests just go straight to the internet without going through the proxy. The node works just fine if I allow direct internet access but keep the system-defined proxy.

@Joffcom
Copy link
Member

Joffcom commented Sep 25, 2024

Hey @rasmusson,

That sounds a bit different. How are you setting the system proxy? Assuming http_proxy and https_proxy are set, it should still work, unless Langchain itself uses an HTTP library that doesn't respect system-set proxies.

@rasmusson
Author

Yes, the http_proxy and https_proxy env vars. I'm using the Information Extractor together with the OpenAI model.

@Joffcom
Member

Joffcom commented Sep 27, 2024

Perfect, I have moved this to our AI team to investigate. It looks like Langchain may have an issue with proxies.

@pemontto
Contributor

pemontto commented Oct 2, 2024

Same issue here. Disabling http_proxy and https_proxy on an internet-connected host proves the AI nodes fail when they are set. Unfortunately that's not an option for hosts that require a proxy.

@pemontto
Contributor

pemontto commented Dec 3, 2024

This seems even more broken in recent versions:
In 1.50.2 I can at least run AI model nodes with a firewall rule bypassing the proxy 😢 (the rest of the traffic correctly goes via the proxy).
(screenshot)

However, in 1.69.2 the AI Agent never seems to call the chat model; it sits there indefinitely. Even debug logging gives no indication why. I've tried adding the model base URL to no_proxy, but it still fails. The only way it works is to remove http_proxy and https_proxy entirely... which means we can have AI and nothing else, or everything else but no AI 😖
(screenshot)

Debug logging from 1.69.2:
(screenshot)
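
A side note on the no_proxy attempt above: most HTTP clients match NO_PROXY entries against the hostname only, not a full base URL, so putting something like `https://api.openai.com/v1` in no_proxy typically never matches. This is a sketch of the conventional matching rule (individual client implementations vary, and n8n's may differ):

```javascript
// Conventional NO_PROXY matching: comma-separated hostnames or domain
// suffixes, matched case-insensitively against the target hostname.
function bypassesProxy(hostname, noProxy) {
  return noProxy
    .split(',')
    .map((entry) => entry.trim().toLowerCase())
    .filter(Boolean)
    .some((entry) => {
      if (entry === '*') return true; // wildcard: bypass for everything
      const host = hostname.toLowerCase();
      // "openai.com" also matches subdomains like "api.openai.com".
      const suffix = entry.startsWith('.') ? entry : '.' + entry;
      return host === entry || host.endsWith(suffix);
    });
}

console.log(bypassesProxy('api.openai.com', 'localhost,api.openai.com')); // true
console.log(bypassesProxy('api.openai.com', 'openai.com'));               // true (suffix match)
console.log(bypassesProxy('api.openai.com', 'https://api.openai.com'));   // false -- URLs never match
```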

@alparo

alparo commented Jan 19, 2025

I can confirm. Even the HTTP Request node doesn't work with a globally set proxy:

    environment:
...
      - HTTP_PROXY=http://v2raya:20171
      - HTTPS_PROXY=http://v2raya:20171

I get:

Service unavailable - try again later or consider setting this node to retry automatically (in the node settings)
503 - ""

n8n version: 1.74.3

@pemontto
Contributor

@alparo that doesn't seem like an error reaching the proxy. Confirming I'm on 1.74.3 with these in the environment

http_proxy=http://proxy.example.com:3128
https_proxy=http://proxy.example.com:3128

And my normal HTTP requests are going via the proxy successfully... albeit not using HTTP CONNECT (another Axios issue)

(screenshot)

@jasonliusz

jasonliusz commented Mar 2, 2025

v1.80.5 still reproduces this issue.
I use Charles as a proxy to inspect the payload: the Google Mail node works via the Charles proxy, but the AI Agent node doesn't.
Both nodes fetch data correctly; only the latter doesn't use the proxy.
Through Charles, I can see that the Google Mail API is called with the Axios user agent, which hints that the AI Agent node may not use the same HTTP stack.

@jasonliusz

jasonliusz commented Mar 4, 2025

I investigated the source code and can confirm that the OpenAI node does NOT respect any http_proxy setting when it calls Langchain's OpenAI client.
The root cause is that the fetch API does not respect http_proxy; a proxy agent has to be set explicitly. The other HTTP nodes work because they use Axios for their HTTP calls instead.

I have fixed this issue in a commit, which is included in a pull request.
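
For reference, the explicit proxy agent described above can be sketched with undici's `ProxyAgent`. This illustrates the general technique, not the actual patch: it assumes the `undici` package is installed, and LangChain's OpenAI client may need the dispatcher passed in a different way.

```javascript
// Pure helper: resolve the proxy URL from the environment the way most
// CLI tools do (lowercase variants take precedence here by convention).
function proxyFromEnv(env = process.env) {
  return env.https_proxy || env.HTTPS_PROXY || env.http_proxy || env.HTTP_PROXY;
}

// Route every fetch() through the proxy from the environment, including
// the CONNECT tunnel needed for HTTPS targets. Returns false if no proxy
// variable is set (in which case undici is never loaded).
function installProxyDispatcher(env = process.env) {
  const proxyUrl = proxyFromEnv(env);
  if (!proxyUrl) return false;
  const { ProxyAgent, setGlobalDispatcher } = require('undici');
  setGlobalDispatcher(new ProxyAgent(proxyUrl));
  return true;
}

// installProxyDispatcher(); // call once at startup, before any fetch()
```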
