This repository has been archived by the owner on Sep 9, 2022. It is now read-only.

Extremely generic element hiding selectors are current performance bottleneck #41

Closed
gorhill opened this issue Jul 3, 2014 · 2 comments

Comments

@gorhill
Contributor

gorhill commented Jul 3, 2014

Talking about cosmetic filters here. "Extremely generic" means those selectors which are not id- or class-based (those are handled very well, with low overhead), like a[alt="Follow on Facebook"].

When EasyList, EasyPrivacy, Fanboy's Annoyance and Fanboy's Enhanced Tracking are enabled, I count nearly 400 such selectors.

See what, if anything, can be done to reduce the overhead caused by these filters. It's definitely a current performance hot spot.

@gorhill
Contributor Author

gorhill commented Jul 4, 2014

Idea: test only against nodes which have not been tested yet; this means the untested nodes need to be kept around until they are tested. The first untested node is document.body. Sounds very promising.
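
A minimal sketch of how that could look in a content-script context (the MutationObserver wiring and names here are illustrative assumptions, not the actual uBlock code):

// Illustrative sketch only: maintain a queue of nodes not yet tested
// against the generic selectors; the first untested node is document.body.
var untestedNodes = [ document.body ];

// Queue newly inserted element nodes instead of re-scanning the document.
var observer = new MutationObserver(function(mutations) {
    for ( var i = 0; i < mutations.length; i++ ) {
        var added = mutations[i].addedNodes;
        for ( var j = 0; j < added.length; j++ ) {
            if ( added[j].nodeType === 1 ) {
                untestedNodes.push(added[j]);
            }
        }
    }
});
observer.observe(document.body, { childList: true, subtree: true });

// Drain the queue, testing only nodes never seen before.
function processUntestedNodes(genericSelectors) {
    var selector = genericSelectors.join(',');
    while ( untestedNodes.length !== 0 ) {
        var node = untestedNodes.pop();
        if ( node.matches(selector) ) {
            node.style.setProperty('display', 'none', 'important');
        }
        // A newly inserted subtree is untested as a whole.
        var hits = node.querySelectorAll(selector);
        for ( var i = 0; i < hits.length; i++ ) {
            hits[i].style.setProperty('display', 'none', 'important');
        }
    }
}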

@gorhill
Contributor Author

gorhill commented Jul 4, 2014

Testing only changed nodes didn't work too well. It did work well for the test site I used, the quite heavy si.com, but it didn't work well with a lighter site, jshint.com, because a whole lot of nodes are created for that one, and I got the reverse result in that case. So I tried something else, which is to split the generic filters into two groups: the highly generic and the lowly generic. The lowly generic filters are the ones of the form (regex):

/^(([a-z]*)\[(alt|title)="([^"]+)"\])$/

For example:

[title="Get Posts On Facebook Wall"]
a[alt="Follow Us on Twitter"]
a[title="Subscribe on FriendFeed"]
img[alt="Share on Twitter"]
img[title="Share at Google+"]

The highly generic filters are anything other than the above, for example:

[class^="social_button_"]
a[href^="http://www.adxpansion.com"]
div[id^="div-gpt-ad-"]

A fast dictionary approach can be used for the lowly generic filters, while the same approach as before is used for the highly generic ones. A nice side effect, aside from the ~30% performance gain, is that fewer irrelevant rules are likely to be injected (not that this was an issue given the already low likelihood).
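
A minimal sketch of the dictionary idea, using the regex above to index the lowly generic selectors by their attribute value (the names here are illustrative assumptions, not the actual uBlock code):

// Illustrative sketch only: index lowly generic selectors by attribute value.
var lowlyGenericRe = /^(([a-z]*)\[(alt|title)="([^"]+)"\])$/;

// Map of attribute value -> array of { selector, tag, attr } entries.
var lowlyGenericDict = Object.create(null);

function indexLowlyGeneric(selector) {
    var matches = lowlyGenericRe.exec(selector);
    if ( matches === null ) {
        return false;  // not a lowly generic: handle as highly generic
    }
    var entry = { selector: matches[1], tag: matches[2], attr: matches[3] };
    var value = matches[4];
    if ( lowlyGenericDict[value] === undefined ) {
        lowlyGenericDict[value] = [];
    }
    lowlyGenericDict[value].push(entry);
    return true;
}

// For a candidate element, a single dictionary lookup replaces testing
// the element against hundreds of individual selectors.
function matchLowlyGeneric(element) {
    var results = [];
    var attrs = ['alt', 'title'];
    for ( var i = 0; i < attrs.length; i++ ) {
        var value = element.getAttribute(attrs[i]);
        if ( value === null ) { continue; }
        var entries = lowlyGenericDict[value] || [];
        for ( var j = 0; j < entries.length; j++ ) {
            var entry = entries[j];
            if ( entry.attr !== attrs[i] ) { continue; }
            if ( entry.tag !== '' && entry.tag !== element.localName ) { continue; }
            results.push(entry.selector);
        }
    }
    return results;
}

With something like this, only elements that actually carry an alt or title attribute need to be looked up, and the per-element cost stays close to a single hash lookup regardless of how many lowly generic selectors are loaded.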

[screenshot: queryselector profiling results]

Above is si.com, reloaded 8 times -- so divide the overhead introduced by uBlock seen in the pic by 8 to get the average added overhead for the top culprit calls on the rather demanding si.com front page. Before the changes, the overhead per page load was 92ms; after, it was less than 60ms.

@gorhill gorhill closed this as completed in d043f54 Jul 4, 2014