
use http.DefaultClient #2530

Merged: jmorganca merged 1 commit into main from mxyng/http-default-client on Feb 20, 2024

Conversation

@mxyng (Contributor) commented Feb 16, 2024

The default client already handles proxies: https://pkg.go.dev/net/http#RoundTripper
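Roughly, the change amounts to the sketch below (illustrative only, not the actual diff; the helper names are made up). The linked docs note that http.DefaultTransport already routes through http.ProxyFromEnvironment, so a custom client that exists only to wire up proxy support is redundant:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

// Before (sketch): a hand-rolled client whose only job is proxy support,
// duplicating what http.DefaultTransport already provides.
var customClient = &http.Client{
	Transport: &http.Transport{
		Proxy: http.ProxyFromEnvironment, // HTTP_PROXY / HTTPS_PROXY / NO_PROXY
	},
}

// After (sketch): http.DefaultClient uses http.DefaultTransport, which is
// documented to call http.ProxyFromEnvironment, so the same proxy
// environment variables keep working with less code.
func fetch(url string) ([]byte, error) {
	resp, err := http.DefaultClient.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	return io.ReadAll(resp.Body)
}

func main() {
	body, err := fetch("https://example.com/")
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	fmt.Printf("fetched %d bytes\n", len(body))
}
```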

Commit: default client already handles proxy
@BruceMacD (Contributor) left a comment

Any suggestions for testing this locally?

@mxyng (Contributor, Author) commented Feb 16, 2024

> Any suggestions for testing this locally?

The easiest way is to run the mitmproxy Docker image and expose port 8080, which you then set as HTTPS_PROXY. The challenge is that it uses a self-signed cert, so extracting that cert and installing it so ollama trusts it (without adding it to the system store) is kind of annoying. I haven't gotten around to testing this and am blindly trusting the docs.
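A lighter-weight sanity check (a sketch, not part of this PR; the proxy address and registry host below are placeholders) is to confirm that the default transport's proxy function picks up the variable at all, since http.DefaultTransport.Proxy is http.ProxyFromEnvironment:

```go
package main

import (
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Set the variable before the first proxy lookup: ProxyFromEnvironment
	// caches the environment values on first use.
	os.Setenv("HTTPS_PROXY", "http://127.0.0.1:8080") // e.g. a local mitmproxy

	req, err := http.NewRequest(http.MethodGet, "https://registry.ollama.ai/v2/", nil)
	if err != nil {
		panic(err)
	}

	// This is the same proxy selection http.DefaultClient would make.
	proxyURL, err := http.ProxyFromEnvironment(req)
	if err != nil {
		panic(err)
	}
	fmt.Println("requests to", req.URL.Host, "would go via", proxyURL)
}
```

For the self-signed cert annoyance, one option worth checking: on Linux, Go's crypto/x509 honors the SSL_CERT_FILE environment variable, so pointing it at the exported mitmproxy CA file may avoid installing the cert into the system store.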

@eldondevat commented

I can confirm this works with a socks5 proxy now. Thank you!

@jmorganca (Member) left a comment

Good cleanup!

@jmorganca merged commit 897b213 into main on Feb 20, 2024 (13 checks passed)
@jmorganca deleted the mxyng/http-default-client branch on February 20, 2024 at 23:34
jimscard added a commit to jimscard/ollama that referenced this pull request Feb 21, 2024
* 'main' of https://github.com/jimscard/ollama: (147 commits)
  update llama.cpp submodule to `c14f72d`
  Update big-AGI config file link (ollama#2626)
  add `dist` directory in `build_windows.ps`
  update llama.cpp submodule to `f0d1fafc029a056cd765bdae58dcaa12312e9879`
  better error message when calling `/api/generate` or `/api/chat` with embedding models
  Support for `bert` and `nomic-bert` embedding models
  Update faq.md
  replace strings buffer with hasher (ollama#2437)
  add gguf file types (ollama#2532)
  use http.DefaultClient (ollama#2530)
  update llama.cpp submodule to `66c1968f7` (ollama#2618)
  Add Page Assist to the community integrations (ollama#2447)
  docs: add Msty app in readme (ollama#1775)
  Update README.md to include Elixir LangChain Library (ollama#2180)
  [nit] Remove unused msg local var. (ollama#2511)
  docs: add tenere to terminal clients (ollama#2329)
  Update import.md
  Add ShellOracle to community terminal integrations (ollama#1767)
  Update faq.md
  feat: add Helm Chart link to Package managers list (ollama#1673)
  ...
zhewang1-intc pushed a commit to zhewang1-intc/ollama that referenced this pull request May 13, 2024
default client already handles proxy