
What to do when a website is broken

Occasionally, CleanLinks will break websites, because the embedded URL detection is automatic, and a legitimate use case might be missing from the rules.

⚠️ Please keep in mind that CleanLinks always has been (and will remain for the foreseeable future) for advanced users − that can tolerate sites breaking occasionally, and fixing them. Think uMatrix more than uBlock Origin. Making good default rules has to be a community effort: even if I wanted, I couldn’t possibly keep up with all the websites. Thus CleanLinks allows rules to be edited directly, without needing to wait for a rules or add-on update.

Is there a website that is not working anymore? Maybe reloading infinitely? Here’s what you can do:

1. Whitelist the URL

  1. Open the CleanLinks menu by clicking on the toolbar button,
  2. Select the problematic link in the history,

    💡 To reduce the potential candidates, click the trash icon to empty the list, then the reload button to reload the page.

  3. Decide what should happen to this embedded URL every time this link is loaded from now on (a conceptual sketch of the difference follows this section):
    • click the “Whitelist Embedded URL” button to always allow it, or
    • click the “Remove Embedded URL” button to always strip it.

💡 Alternatively, you can also

  • manually edit the rule in the preferences: the edit rule button opens the preferences with the editor pre-populated with the selected link,
  • allow the link to be loaded once without any cleaning, by using the Whitelist once button, or
  • allow all the requests on the page to be loaded without cleaning, by toggling the add-on off, refreshing the page, and toggling the add-on back on after the page has loaded.
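
To make the whitelist/remove distinction concrete, here is a minimal Python sketch of what the two options mean for a redirect-style link. This is only a conceptual illustration, not the add-on’s actual code; the site, the `dest` parameter and the `ref` parameter are invented for the example.

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# A made-up redirect-style link: the real destination is embedded in the
# "dest" query parameter (the site, parameter names and URLs are invented).
link = "https://example.com/out?dest=https%3A%2F%2Fnews.example.org%2Farticle&ref=feed"

parts = urlsplit(link)
params = dict(parse_qsl(parts.query))   # parse_qsl percent-decodes the values

# Default behaviour, conceptually: the embedded URL is detected and the
# request is redirected straight to it, dropping the wrapper link.
cleaned = params["dest"]

# "Whitelist Embedded URL": the parameter is allowed to carry a URL,
# so the original link is loaded unchanged.
whitelisted = link

# "Remove Embedded URL": the parameter carrying the URL is stripped,
# and what remains of the link is loaded.
remaining = {name: value for name, value in params.items() if name != "dest"}
removed = urlunsplit(parts._replace(query=urlencode(remaining)))

print("cleaned:    ", cleaned)      # https://news.example.org/article
print("whitelisted:", whitelisted)  # the original link, untouched
print("removed:    ", removed)      # https://example.com/out?ref=feed
```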

2. Consider contributing the cleaning/whitelisting rule

⚠️ This is important because CleanLinks has no telemetry at all, not even anonymous: issue reports are the only way broken websites and missing rules get noticed.

Do you think the website is used by many people, or could be useful to the wider community as a default rule? Please open an issue and I’ll try to integrate it. What I need to know is:

  • on which pages does the problem happen?
  • which parameters should be removed or whitelisted for the website to work?

💡 You can search for the rule by filtering by website in CleanLinks’ configuration page, or in the rules file that you can export from that page.
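
If you prefer to search the exported file outside the browser, a small script can do the filtering. This is only a minimal sketch: the file name is a placeholder for whatever you chose when exporting, and nothing is assumed about the export beyond it being JSON.

```python
import json

# Minimal sketch for searching an exported rules file outside the browser.
# No particular structure is assumed beyond the export being JSON: any
# top-level entry whose serialised form mentions the domain is printed.
RULES_FILE = "cleanlinks-rules.json"   # hypothetical file name
DOMAIN = "example.com"                 # the site you are debugging

with open(RULES_FILE, encoding="utf-8") as f:
    rules = json.load(f)

entries = rules.items() if isinstance(rules, dict) else enumerate(rules)

for key, value in entries:
    entry = {str(key): value}
    if DOMAIN in json.dumps(entry):
        print(json.dumps(entry, indent=2))
```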

💡 You can copy a dirty/clean link pair from the CleanLinks popup by selecting it and pressing Ctrl + C (Cmd + C on macOS).

How do I completely whitelist a website?

This is not recommended: it is better to identify which parameters may contain tracking values or embedded URLs, and to selectively remove and/or whitelist those.

Nevertheless, if you want to whitelist a domain fully, edit the rule as follows (a short sketch of what this amounts to follows the list):

  • matching URLs:
    • domain: the desired domain
    • path: leave empty
  • Cleaning Actions:
    • Whitelist query parameters: add .*
    • Allow URL inside path: check the box
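
For reference, here is a minimal Python sketch of what the “Whitelist query parameters: `.*`” part of such a rule amounts to; checking “Allow URL inside path” likewise means an embedded URL detected in the path is left alone. The URL and its parameters are invented, and this is not the add-on’s actual matching code.

```python
import re
from urllib.parse import parse_qsl, urlsplit

# Catch-all whitelist pattern, as entered in the rule editor.
whitelist = re.compile(r".*")

# Invented example URL with a mix of tracking-looking and useful parameters.
url = "https://example.com/page?utm_source=feed&id=42&redirect=https%3A%2F%2Fother.example"
params = parse_qsl(urlsplit(url).query)

# Every parameter name matches ".*", so every parameter is kept.
kept = [(name, value) for name, value in params if whitelist.fullmatch(name)]
assert kept == params
print(kept)
```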