403 OPTIONS "/api/generate" #2

Closed
Chillance opened this issue Oct 17, 2023 · 11 comments
@Chillance

Hi!

Just tested this and noticed it sends an OPTIONS "/api/generate" request. From what I can tell, this doesn't exist in the latest ollama code...

Thoughts?

@tjbck tjbck self-assigned this Oct 18, 2023
@tjbck
Contributor

tjbck commented Oct 18, 2023

Hi there,

Browsers often send an OPTIONS request to verify CORS, which might be the case here. Were there any other issues caused by the OPTIONS request?
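To illustrate what the browser is doing here, the following is a minimal sketch (not the browser's full CORS algorithm) of the preflight check: the browser sends OPTIONS first, then only proceeds if the response's `Access-Control-Allow-Origin` header matches the page's origin or is `*`. The function name and simplified headers are illustrative, not part of any real browser or Ollama API.

```python
def preflight_allows(origin, response_headers):
    """Simplified sketch of the browser's CORS preflight decision.

    Returns True only if the server's preflight response names the
    requesting origin (or the wildcard "*"); otherwise the browser
    blocks the real request before it is ever sent.
    """
    allow = response_headers.get("Access-Control-Allow-Origin")
    return allow == "*" or allow == origin

# A server that sends no header at all fails the check:
print(preflight_allows("http://192.168.1.11:3000", {}))  # False

# A server that allows all origins passes it:
print(preflight_allows(
    "http://192.168.1.11:3000",
    {"Access-Control-Allow-Origin": "*"},
))  # True
```

So an OPTIONS request on its own is harmless; problems only arise when the server's response to it lacks the expected CORS headers.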

Thanks.

@Chillance
Author

Well, I see that request when I ask it to generate something, but it seems to stop there, so nothing else happens.

@tjbck
Contributor

tjbck commented Oct 18, 2023

Could you please provide any specific error messages or logs that you encounter during the process? This information would greatly assist me in diagnosing the problem more accurately and providing you with the appropriate guidance.

Thanks.

@Chillance
Author

How do I get better logs? I typed something, pressed enter, and then nothing happened.

@tjbck
Contributor

tjbck commented Oct 19, 2023

Hi there,

To obtain more detailed logs, you can use the following command for Docker:

docker logs ollama-webui

Additionally, it would be helpful for diagnosing your issue if you could provide a screenshot of your console logs from your browser's developer tools. This will allow us to examine any client-side errors or issues that might not be visible in the server logs.

Please feel free to share the Docker logs and the browser console logs screenshot, and we'll do our best to assist you in resolving the problem.

Thanks.

@coolaj86
Contributor

@Chillance Part of the reason that browsers invented CORS Preflight Requests (the OPTIONS issue) is to prevent people from stumbling into security issues.

Do you already have the access protected with an API token or HTTP Basic Auth?

Check out ollama/ollama#849 (comment) and the CORS section at https://webinstall.dev/caddy.
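As a rough illustration of the kind of protection mentioned above, here is a minimal sketch of an HTTP Basic Auth check such as a reverse proxy would perform in front of the API. The function and credentials are hypothetical examples, not part of Ollama, Caddy, or this repo.

```python
import base64


def check_basic_auth(auth_header, user, password):
    """Illustrative HTTP Basic Auth check.

    A real deployment would let a reverse proxy (e.g. Caddy) handle
    this; shown here only to make the "protected access" idea concrete.
    Compares the Authorization header against the expected
    "Basic base64(user:password)" value.
    """
    expected = "Basic " + base64.b64encode(f"{user}:{password}".encode()).decode()
    return auth_header == expected


header = "Basic " + base64.b64encode(b"admin:secret").decode()
print(check_basic_auth(header, "admin", "secret"))   # True
print(check_basic_auth("Basic bogus", "admin", "secret"))  # False
```

The point is that an endpoint reachable from a browser page is reachable by any page unless something (CORS policy, auth, or both) restricts it.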

@coolaj86
Contributor

coolaj86 commented Oct 21, 2023

Tested, Working Example

See #10

@Chillance
Author

I don't do anything in particular, just start ollama serve. And on the same machine I run this in Docker:

docker run --network=host -p 3000:3000 --name ollama-webui --restart always ollama-webui

I actually got chatbot-ollama (another repo) working fine. But here I can see this in the console log:

e87e0c1f-4d67-4015-959a-0e2b59659483
2.fb1b6367.js:52 submitPrompt
192.168.1.11/:1 Access to fetch at 'http://192.168.1.11:11434/api/generate' from origin 'http://192.168.1.11:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
start.93b882e2.js:1

   POST http://192.168.1.11:11434/api/generate net::ERR_FAILED

window.fetch @ start.93b882e2.js:1
R @ 2.fb1b6367.js:52
await in R (async)
re @ 2.fb1b6367.js:58
start.93b882e2.js:1

   Uncaught (in promise) TypeError: Failed to fetch
at window.fetch (start.93b882e2.js:1:1402)
at R (2.fb1b6367.js:52:108120)

@Chillance
Author

And
docker logs ollama-webui
only returns:
Listening on 0.0.0.0:3000

@tjbck
Copy link
Contributor

tjbck commented Oct 22, 2023

Your logs confirm the CORS error as expected. You should pull the latest commit of the main branch and rebuild the Docker container. It introduces breaking changes, so your command should be replaced with the following instead:

docker build --build-arg OLLAMA_API_BASE_URL='' -t ollama-webui .
docker run -d -p 3000:8080 --name ollama-webui --restart always ollama-webui

Also make sure to run the following command to serve Ollama, as mentioned here, which should solve your issue:

OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
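To make the effect of `OLLAMA_ORIGINS=*` concrete, here is a self-contained sketch that stands up a tiny local stub server answering OPTIONS with `Access-Control-Allow-Origin: *` (a stand-in for Ollama configured as above, not the real server), then issues a manual preflight and reads the header back. The port is chosen automatically; everything here is illustrative.

```python
import http.server
import threading
import urllib.request


class PreflightStub(http.server.BaseHTTPRequestHandler):
    """Stand-in for a server with allow-all CORS configured."""

    def do_OPTIONS(self):
        # Respond to the preflight with the wildcard origin header,
        # which is what resolves the error seen in this thread.
        self.send_response(204)
        self.send_header("Access-Control-Allow-Origin", "*")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet


server = http.server.HTTPServer(("127.0.0.1", 0), PreflightStub)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Replay the preflight the browser would send from the WebUI origin.
req = urllib.request.Request(
    f"http://127.0.0.1:{port}/api/generate", method="OPTIONS"
)
req.add_header("Origin", "http://192.168.1.11:3000")
with urllib.request.urlopen(req) as resp:
    allow = resp.headers.get("Access-Control-Allow-Origin")

print(allow)  # *
server.shutdown()
```

If the real preflight response against your Ollama instance shows this header, the browser will let the subsequent POST through.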

Thanks.

@Chillance
Author

Thanks! Works now!

tjbck pushed a commit that referenced this issue Jan 1, 2024
Added nodeSelectors for allocating GPU nodePools in the cloud and configured volumes for WebUI
explorigin pushed a commit to explorigin/open-webui that referenced this issue Feb 2, 2024
Added nodeSelectors for allocating GPU nodePools in the cloud and configured volumes for WebUI
tjbck pushed a commit that referenced this issue Apr 29, 2024
German locale -  ran i18n parser
tjbck pushed a commit that referenced this issue May 6, 2024