Add a blacklist to deal with websites that already have a preview for inner links #120
Comments
I'll give this one a go
@mdolr I'm not sure how to work with local storage; how should I approach it?
I'd suggest reading this and copying what's already in the extension. Also, I'm not really sure how this should be implemented, as currently we're blocking inner-link previews for links that already have a custom implementation (i.e. the YouTube integration blocks previews on YouTube). How do you plan to do it? Note the extension shouldn't be disabled completely, only inner-link previews.
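For the local-storage question above, here is a minimal sketch of the `chrome.storage.local` get/set pattern an "add to blacklist" feature could use. The `blacklist` key and the `addToBlacklist` helper are hypothetical names, not anything Survol already defines, and a tiny in-memory stub stands in for `chrome.storage.local` so the snippet runs outside a browser:

```javascript
// In-memory stub of chrome.storage.local, used only when the real API is absent,
// so this sketch can run outside an extension context.
const chrome = globalThis.chrome || {
    storage: {
        local: {
            _data: {},
            get(keys, cb) { cb({ ...this._data }); },
            set(obj, cb) { Object.assign(this._data, obj); if (cb) cb(); }
        }
    }
};

// Hypothetical helper: append a domain to the stored blacklist (deduplicated).
function addToBlacklist(domain, done) {
    chrome.storage.local.get(['blacklist'], (res) => {
        const list = res.blacklist || [];
        if (!list.includes(domain)) {
            list.push(domain);
        }
        chrome.storage.local.set({ blacklist: list }, () => done(list));
    });
}

addToBlacklist('github.com', (list) => console.log(list));
```

In a real build, a popup button's click handler would call something like `addToBlacklist`, and the content script would read the same key on startup.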
A follow-up on my question: is the Chrome API identical to the Firefox one? My plan was to load an object on startup (I don't know how to load on change) and check if the domain is in the object. Additionally, there would be a button in the popup to "add to blacklist". Removing from the blacklist would be more difficult; some way to access the object would be needed, so I'm thinking of keeping that for later. Let's make a new branch for this feature on the main repo (we might need some sort of scrollable list on popup.html, a new page, or a page for extension options).
I mean, there is already a setting to disable the extension on a list of pages (similar to uBlock / AdBlock). Maybe we should hardcode this one? I think it makes sense to hardcode it, but if we do that, this is equivalent to making a custom implementation for each of those sites. Also, yeah, Chrome and Firefox use the same API: normally in Firefox you would have to use the `browser` keyword instead of `chrome`, but they imported the `chrome` keyword and all of its functions to make it easier to develop extensions, so just use the `chrome` keyword and it should be fine.
Ahh, I understand now. If I get you correctly, hardcoding would mean writing specific code for each website where we require Survol not to work. For blacklisting, I think it's a bad design choice; the goal should be to move towards lists like uBlock/AdBlock. This discussion makes me confused about the requirements of the OP; I think the functionality for this feature is already implemented.
It's already kind of implemented. I'm not sure if you're using the extension right now, but it previews GitHub links using the meta-data preview, so when you hover a GitHub issue or anything it shows the native GitHub preview plus the Survol meta-data preview. The goal of this issue, I believe, is to avoid such things: you might not want to disable the extension completely on GitHub, but just stop it from previewing GitHub links specifically when browsing GitHub.
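The "inner link" behaviour described here can be sketched as a small helper. `isInnerLink` is a hypothetical name, not a function in Survol; it simply compares the hostname of a link against the hostname of the page being browsed:

```javascript
// Hypothetical helper (not part of Survol) illustrating the behaviour discussed here:
// a link is "inner" when it points at the same host as the page being browsed.
function isInnerLink(pageUrl, linkHref) {
    try {
        // Relative hrefs resolve against the current page, so they count as inner too.
        return new URL(pageUrl).hostname === new URL(linkHref, pageUrl).hostname;
    } catch (e) {
        return false; // Treat malformed URLs as external.
    }
}

console.log(isInnerLink('https://github.com/mdolr/survol', 'https://github.com/issues')); // true
console.log(isInnerLink('https://github.com/mdolr/survol', 'https://youtube.com/watch')); // false
```

With something like this, a content script could skip building a preview whenever `isInnerLink(window.location.href, node.href)` is true on a site that renders its own previews.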
Oh! That's the requirement! PS: I think it's safe to say that when a website does this, it would be only for self-referenced links, at least for the time being.
@mdolr, got an idea on the implementation? I was thinking of using filter in . I've added a button as of now; tested and it works. Check 9622e45
As of 0.5.4 I have changed the link detection system to improve performance; the extension isn't using gatherHrefs and capturing functions any more. I do get what you're trying to do, and it could easily be implemented with the new system. I'm just not sure it should be left to the user: I feel like the meta-data setting is already confusing enough, and adding yet another setting for inner links will confuse people even more. We would also have to account for sub-domains, and on some sites it makes sense to hover some of the inner links and block others. Maybe we should list all of the websites that do not have a custom implementation and write one for each of them, even if it consists of using meta-data previewing. That seems like the best thing to do right now. Also, I've never seen a self-referenced commit like that before, how do you do that?
I see what you're saying. I think we can add an "Advanced Settings" expand/collapse and ship the extension with some default values for . If you're satisfied with the above, do tell what implementation you have in mind. For referencing commits like above, simply type the commit hash; for example, this one I picked randomly: a4afdd0. The underlying markdown is below:
I'm fine with this honestly; if you feel like submitting a PR you can go for it. (Also, I knew about the commit hash, I'm just confused because the commit appears on this repository even though you don't have write permissions. I thought you had bypassed something lol)
Although I'm still stuck. When doing the above, would
No, you'd have to filter links in the
It took me a while to make the popup's collapsible. Some ugly force-pushes later, the popup works as intended. However, I can't seem to solve the "backend"-ish code; I might be missing something obvious here. I have tried 3 approaches but failed to make the feature work:
I'm a little confused as to how to check in the . Again, I implore you to make the workflow clearer and add it to the CONTRIBUTING.MD. Thanks!
Your fork of the repository is not up to date with the latest changes, which is why you can't follow. I think you should start by deleting your current fork and re-forking the repository, or getting even with master some other way. This is where nodes are detected now: you will have a node object which you can work with; the node object contains a . I think all the logic should be handled by the

```js
/*
 * node - {HTMLNode} - Anchor link HTML node element
 * link - {String} - Anchor link href, = node.href
 */
function dispatcher(node, link) {
    let domain = getDomain(link); // Get the domain + TLD for a given link, without subdomains.

    // This is where you should put your filtering system,
    // before the potentialHover is returned.
    /* if (!filtered) { */
    let potentialHover = getPotentialHover(node, domain);

    /* If we do not support the domain we might not get anything in return from getPotentialHover */
    if (potentialHover && potentialHover.bindToContainer != null && node.href && node.href.startsWith('http')) {
        potentialHover.bindToContainer(node, domain, container);
        container.className = `survol-container ${darkTheme ? 'dark-theme' : ''}`;
        window.lastHovered = node;
    } else {
        // In case the node has no href, feed it to the garbage collector.
        potentialHover = null;
    }
    /* } */
}
```

Maybe you should also account for subdomains now that I think about it, so the filter should even be above the domain part, or you should refactor the getDomain function to provide subdomains when asked. I'll try to update the comments in the code with the latest changes, and update CONTRIBUTING.MD, when I have the time.
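To make the suggested filtering step concrete, here is a sketch of what the check before `getPotentialHover` could look like. Both the naive `getDomain` implementation (hostname minus subdomains) and the hardcoded `blacklist` array are assumptions for illustration; the real extension would load the list from storage and may derive domains differently:

```javascript
const blacklist = ['github.com', 'youtube.com']; // hypothetical user-managed list

// Naive domain extraction for illustration: keep only the last two hostname labels.
// (This breaks on multi-part TLDs like .co.uk; the real getDomain may differ.)
function getDomain(link) {
    try {
        return new URL(link).hostname.split('.').slice(-2).join('.');
    } catch (e) {
        return '';
    }
}

// Returns true when the dispatcher should bail out before building a hover.
function isFiltered(link) {
    return blacklist.includes(getDomain(link));
}

console.log(isFiltered('https://gist.github.com/foo')); // true
console.log(isFiltered('https://example.org/page'));    // false
```

In the dispatcher above, the `if (!filtered)` placeholder would become `if (!isFiltered(link))` wrapped around the `getPotentialHover` call.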
I'll get right to it. That's exactly where I had placed my filter in the very beginning. Thanks! PS: I'll think about tackling the subdomain problem later, I think it works pretty well as is. |
No need to worry about subdomains. I was kinda tired yesterday and that was some total nonsense, as subdomains are already accounted for: they're removed by the getDomain function. I'll take a look at your PR later.
When using Survol on a site that already offers an equivalent system, it should be possible to disable one of the two so that it's not awkward.
I had thought of a global blacklist system, but if you have a better idea, do not hesitate to suggest it!