Display which engines don't use HTTPS #302
Comments
Probably we can improve that idea: we could add an information node to every search engine, which can contain some information, like:
Themes like oscar could display that information using icons, or hover texts beside the allow/block button. Based on that information we could also create a plugin which automatically deactivates all non-HTTPS engines. |
I like that idea a lot! |
One way to solve this privacy issue is to disable HTTP requests. This behavior could be enabled with a user setting. |
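A minimal sketch of that setting, assuming a hypothetical engine registry and a made-up `block_http` option (neither is the real searx API): filter the active engines by the scheme of their search URL.

```python
from urllib.parse import urlparse

# Hypothetical engine records; in searx each engine module defines its own URLs.
ENGINES = [
    {'name': 'Bing', 'search_url': 'https://www.bing.com/search'},
    {'name': 'Legacy', 'search_url': 'http://example.org/search'},
]

def active_engines(engines, block_http=True):
    """Return the engines allowed under the (hypothetical) block_http setting."""
    if not block_http:
        return list(engines)
    return [e for e in engines
            if urlparse(e['search_url']).scheme == 'https']

print([e['name'] for e in active_engines(ENGINES)])  # only HTTPS engines remain
```

With `block_http` enabled, plain-HTTP engines are simply never queried, so nothing leaves the server in clear text.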
To bring back this issue from the dead, one idea could be:

# to import
# results can be harmful, filters are not implemented
HARMFUL_CONTENT = 'harmful'
# results can be harmful, there is support for content filtering
SAFESEARCH_SUPPORT = 'support'
# results are for sure harmless
HARMLESS_CONTENT = 'harmless'

# for each engine
# description .. describes the engine
description = {
    'categories': ['general'],
    'language_support': True,
    # should be safesearch_support ?
    'safesearch': SAFESEARCH_SUPPORT,
    # should be time_range_support ?
    'time_range': True,
    # should be paging_support ?
    'paging': True,
    # 'language': 'en',
    # informative
    'use_api': False,
    # forbid HTTP connection for this engine
    'allow_http': False,
    # how the user can visit the engine without using searx
    'url': 'https://www.bing.com/',
    'name': 'Bing',
} |
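Building on that description node, the plugin mentioned earlier (deactivate all non-HTTPS engines) could be sketched like this; the registry shape and function name are assumptions for illustration, not the real searx plugin API:

```python
# Hypothetical engine descriptions keyed by engine name.
descriptions = {
    'Bing': {'url': 'https://www.bing.com/', 'allow_http': False},
    'OldEngine': {'url': 'http://old.example/', 'allow_http': True},
}

def https_only(descriptions):
    """Keep only engines whose public URL uses HTTPS."""
    return {name: d for name, d in descriptions.items()
            if d['url'].startswith('https://')}

print(list(https_only(descriptions)))  # non-HTTPS engines are dropped
```

The same filter could also consult `allow_http` directly, once every engine declares it.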
@dalf I really like this idea. It would be nice to add these fields too:
|
How about settings.yml?

bing:
    description.use_api: True

? |
This overly complicated, over-engineered script may help with this issue: https://gist.github.com/dalf/3c3904699153a741f27842d8ea30b449 It still requires a manual check of these fields (all False by default):
Sample output:

"""
OpenStreetMap (Map)

@website     https://openstreetmap.org/
@provide-api yes (http://wiki.openstreetmap.org/wiki/Nominatim)
@using-api   yes
@results     JSON
@stable      yes
@parse       url, title
"""

from json import loads

# features
features = {
    "categories": ['map'],
    "paging": False,
    "language": False,
    "time_range": False,
    "safesearch": False,
    "allow_http": False,
    "session_required": False,
    "multiple_requests": False,
}

# metadata
metadata = {
    "website": "https://openstreetmap.org/",
    "use_api": True,
    "require_api_key": False,
}

# search-url
base_url = 'https://nominatim.openstreetmap.org/'
search_string = 'search/{query}?format=json&polygon_geojson=1&addressdetails=1'
result_base_url = 'https://openstreetmap.org/{osm_type}/{osm_id}'
... |
Correct me if I am wrong .. since f407dd8 we no longer have any HTTP engines. I am closing the issue; if I am wrong or have overlooked something, ask to reopen / thanks! |
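That claim can be checked mechanically. A rough sketch, assuming engine modules live in a directory and declare a `base_url` or `search_url` string (the directory layout and variable names are assumptions about the searx source tree):

```python
import re
from pathlib import Path

def http_engines(engines_dir):
    """Return engine files that still define a plain-HTTP base/search URL."""
    offenders = []
    for path in sorted(Path(engines_dir).glob('*.py')):
        text = path.read_text(encoding='utf-8')
        if re.search(r"""(base_url|search_url)\s*=\s*['"]http://""", text):
            offenders.append(path.name)
    return offenders

# http_engines('searx/engines') should return [] if all engines use HTTPS
```

A non-empty result would mean the issue deserves reopening.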
The connection between the browser and searx can be encrypted; this is visible in the browser.
But the connection between searx and a search engine can be in clear text, without encryption. When it's possible searx uses HTTPS, but some search engines don't support HTTPS:
In other words, anyone on the wire can intercept the request.
Ideas: