connect ETIMEDOUT 31.13.70.33:443 #79

Closed
583175694 opened this issue Mar 26, 2023 · 10 comments
Labels
bug Something isn't working

Comments

@583175694

Describe the bug

When I run npm run dev and call the API, this error is reported, and I also can't ping 31.13.94.36. How should I solve this problem?

To Reproduce

1. npm install
2. npm run dev
3. connect ETIMEDOUT 31.13.70.33:443

OS

No response

Node version

No response

@583175694 583175694 added the bug Something isn't working label Mar 26, 2023
@jk-zhang

I had the same problem

Error with OpenAI API request: connect ETIMEDOUT 31.13.90.19:443

@Abandon99

If you are in China, this may help you:
#78

@JinliG

JinliG commented Mar 28, 2023

If you are in China and have an available proxy, you can resolve this problem like this:

openai.createCompletion({ ...params }, {
    httpsAgent: tunnel.httpsOverHttp({
        proxy: {
            host: 'xx.xx.xx',
            port: xx,
        },
    }),
});
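
For reference, a more complete sketch of this workaround, assuming the openai v3 Node SDK and the tunnel package, with a local HTTP proxy assumed to be listening on 127.0.0.1:7890 (replace the host, port, and model with your own):

const { Configuration, OpenAIApi } = require('openai');
const tunnel = require('tunnel');

// Assumption: an HTTP proxy is listening locally on 127.0.0.1:7890.
const httpsAgent = tunnel.httpsOverHttp({
    proxy: {
        host: '127.0.0.1',
        port: 7890,
    },
});

const openai = new OpenAIApi(new Configuration({ apiKey: process.env.OPENAI_API_KEY }));

async function main() {
    // The second argument is forwarded to axios, so httpsAgent routes the request through the proxy.
    const completion = await openai.createCompletion(
        { model: 'text-davinci-003', prompt: 'Hello' },
        { httpsAgent }
    );
    console.log(completion.data.choices[0].text);
}

main().catch(console.error);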

@lpbottle

lpbottle commented Apr 1, 2023

If you are in China and have an available proxy, you can resolve this problem like this:

openai.createCompletion({ ...params }, {
    httpsAgent: tunnel.httpsOverHttp({
        proxy: {
            host: 'xx.xx.xx',
            port: xx,
        },
    }),
});

I tried your solution, but it didn't work and still returned 'An error occurred during your request'.

@wang-xiaowu

wang-xiaowu commented Apr 2, 2023

If you are in China and have an available proxy, you can resolve this problem like this:

openai.createCompletion({ ...params }, {
    httpsAgent: tunnel.httpsOverHttp({
        proxy: {
            host: 'xx.xx.xx',
            port: xx,
        },
    }),
});

It worked. You can find it in my project: https://github.com/behappy-project/behappy-chatgpt-assistant/blob/main/lib/openai.js

const { Configuration, OpenAIApi } = require('openai');
const tunnel = require('tunnel');
const axios = require('axios');

exports.openai = (opts = {}) => {
    const configuration = new Configuration({
        apiKey: opts.apiKey,
    });
    // Axios client whose HTTPS traffic is tunneled through the local proxy.
    const client = axios.create({
        httpsAgent: tunnel.httpsOverHttp({
            proxy: {
                host: '127.0.0.1',
                port: 7890,
            },
        }),
    });
    // Pass the proxied axios instance as the third constructor argument.
    const openai = new OpenAIApi(configuration, configuration.basePath, client);
    // Koa-style middleware: expose the client on the request context.
    return async (ctx, next) => {
        ctx.openai = openai;
        await next();
    };
};
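
Since the exported function returns async (ctx, next) middleware, it appears to be meant for Koa. A usage sketch under that assumption, also assuming the module above is saved as ./lib/openai.js and OPENAI_API_KEY is set in the environment:

const Koa = require('koa');
const { openai } = require('./lib/openai');

const app = new Koa();

// Attach the proxied OpenAI client to every request context.
app.use(openai({ apiKey: process.env.OPENAI_API_KEY }));

app.use(async (ctx) => {
    // ctx.openai is the OpenAIApi instance built with the tunnel httpsAgent above.
    const completion = await ctx.openai.createCompletion({
        model: 'text-davinci-003',
        prompt: 'Say hello',
    });
    ctx.body = completion.data.choices[0].text;
});

app.listen(3000);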

@fredmanxu

Don't forget to npm install tunnel

@2019geguangpu

After configuring the proxy, I got an error: 'Error with OpenAI API request: tunneling socket could not be established, cause=socket hang up'

@2019geguangpu

After configuring the proxy, I got an error: 'Error with OpenAI API request: tunneling socket could not be established, cause=socket hang up'
It seems like I have to configure a username and password, but that's just my guess.
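
If the proxy really does require credentials, the tunnel package supports a proxyAuth option on the proxy config. A minimal sketch (the host, port, and credentials here are placeholders, not values from this thread):

const tunnel = require('tunnel');

// Assumption: the local HTTP proxy requires basic auth; fill in real credentials.
const httpsAgent = tunnel.httpsOverHttp({
    proxy: {
        host: '127.0.0.1',
        port: 7890,
        proxyAuth: 'username:password', // omit this field if the proxy is unauthenticated
    },
});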

@2019geguangpu

I got it. I had configured the host correctly but mistakenly configured the port: I set it to 3000, which is not the port my proxy software listens on.

@Zheap

Zheap commented Apr 19, 2023

If you are in China and have an available proxy, you can resolve this problem like this:

openai.createCompletion({ ...params }, {
    httpsAgent: tunnel.httpsOverHttp({
        proxy: {
            host: 'xx.xx.xx',
            port: xx,
        },
    }),
});

It worked. You can find it in my project: https://github.com/behappy-project/behappy-chatgpt-assistant/blob/main/lib/openai.js

const { Configuration, OpenAIApi } = require('openai');
const tunnel = require('tunnel');
const axios = require('axios');

exports.openai = (opts = {}) => {
    const configuration = new Configuration({
        apiKey: opts.apiKey,
    });
    // Axios client whose HTTPS traffic is tunneled through the local proxy.
    const client = axios.create({
        httpsAgent: tunnel.httpsOverHttp({
            proxy: {
                host: '127.0.0.1',
                port: 7890,
            },
        }),
    });
    // Pass the proxied axios instance as the third constructor argument.
    const openai = new OpenAIApi(configuration, configuration.basePath, client);
    // Koa-style middleware: expose the client on the request context.
    return async (ctx, next) => {
        ctx.openai = openai;
        await next();
    };
};

Thanks, it works.
