
Problem with (and possibly other search engines): timeout #1738

x-0n opened this issue Nov 5, 2019 · 0 comments



@x-0n x-0n commented Nov 5, 2019


When running a search, the engine fails for me. On the results page, I am notified that

"Engines cannot retrieve results: (timeout)".

In /var/log/uwsgi/uwsgi.log, the failure looks like this:

DEBUG:urllib3.connectionpool:Starting new HTTPS connection (2): timeout: : HTTP requests timeout(search duration : 8.756943225860596 s, timeout: 8.0 s) : ReadTimeout
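The ReadTimeout in that log line is just the client side of a server answering too slowly. A minimal, self-contained sketch of the same failure mode, using only the standard library and a hypothetical slow local server (nothing searx-specific):

```python
import http.server
import socket
import threading
import time
import urllib.error
import urllib.request

class SlowHandler(http.server.BaseHTTPRequestHandler):
    """Answers every GET, but only after a delay longer than the client timeout."""
    def do_GET(self):
        time.sleep(1.0)           # longer than the client's 0.3 s timeout below
        try:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"too late")
        except OSError:
            pass                  # client already gave up and closed the socket
    def log_message(self, *args):
        pass                      # keep the demo output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

try:
    urllib.request.urlopen(url, timeout=0.3)
    outcome = "response received"
except (socket.timeout, urllib.error.URLError):
    outcome = "timeout"          # same failure mode as the log line above

server.shutdown()
print(outcome)
```

Running this prints `timeout`, because the server's 1.0 s delay exceeds the client's 0.3 s read timeout, just as the engine's response exceeded the 8.0 s budget in my log.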

In my settings.yml, I have the following settings:

  - name :
    engine : xpath
    search_url :{query}&what=stories&order=relevance
    results_xpath : //li[contains(@class, "story")]
    url_xpath : .//span[@class="link"]/a/@href
    title_xpath : .//span[@class="link"]/a
    content_xpath : .//a[@class="domain"]
    categories : it
    shortcut : lo
    timeout : 10.0

But checking the engine's entry on the preferences page of the web UI, it looks like this: [screenshot]

Expected result

  • searx should respect the timeout set in settings.yml and thereby not overrun the configured maximum timeout
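What puzzles me: settings.yml says 10.0 s, but the log enforces 8.0 s. One possible explanation (an assumption on my part; I have not read the searx code, and the constant names below are made up) is that per-engine timeouts are silently clamped to a server-wide maximum. A sketch of that model:

```python
# Hypothetical sketch, not searx's actual code: a global cap would
# silently override a larger per-engine timeout.
ASSUMED_DEFAULT_TIMEOUT = 3.0   # assumed server-wide default (seconds)
ASSUMED_MAX_TIMEOUT = 8.0       # assumed server-wide cap (seconds)

def effective_timeout(engine_timeout=None):
    """Timeout actually applied to an engine request under this model."""
    wanted = ASSUMED_DEFAULT_TIMEOUT if engine_timeout is None else engine_timeout
    return min(wanted, ASSUMED_MAX_TIMEOUT)

# Under this model, the 10.0 s from my settings.yml would be reduced
# to the 8.0 s seen in the log:
print(effective_timeout(10.0))  # → 8.0
```

If something like this is what happens, either the cap should be raised alongside the engine timeout, or the web UI should warn that the configured value is being clamped.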

Steps to reproduce

  • Activate the engine in the preferences
  • Run any search query


  • docker image searx/searx:latest (0.15.0-186-42d5e2c0)