More annoying URL parameters cleanup + loop cleanup #133
Copying comments from sebsauvage#168:
TL;DR: I was proposing to remove the URL cleanup features from Shaarli and move them to a userscript. It is true that this is less useful for the average user. @alexisju has more (tested) URL filters to propose.
Not sure I understand exactly what you mean by this, @nodiscc. To make that code more manageable, it would make sense to put the strings in an array and loop over it. Note that the current code drops everything that comes after the matched pattern, which seems fine in most cases (the pattern is added at the end of the URL) but is definitely not the ideal solution. So as long as we're discussing these patterns, we should look into stripping only the keyword and its associated value: something like searching for the pattern between a separator (`?`, `&` or `#`) and `=`, then removing the value between the `=` and the next `&` (or the end of the string, if no `&` is found). But this could be done in two steps/PRs: first loop over an array and include the new strings, then later refactor the code to strip only the unwanted additions while leaving arguments that are part of the URL itself. All this (I should not have turned the …
OK, this is what I came to think too (hence "this is less useful for the average user"). Let's refactor the URL cleaning code to loop over an array, and add @alexisju's suggestions. Cleaning only the relevant part of the URL can be done in a later PR.
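A rough sketch of the "loop over an array, strip only the keyword and its value" approach discussed above. This is illustrative Python rather than Shaarli's actual PHP, and the parameter list is only an example, not the real filter list:

```python
import re

# Example list of unwanted tracking parameters (illustrative only).
ANNOYING_PARAMS = ['utm_source', 'utm_medium', 'utm_campaign', 'fb_ref']

def clean_url(url: str) -> str:
    """Remove only the unwanted key=value pairs, keeping other query arguments."""
    for param in ANNOYING_PARAMS:
        # Match the parameter after ?, & or #, and its value up to the
        # next & (or the end of the string), keeping the separator.
        url = re.sub(r'([?&#])' + re.escape(param) + r'=[^&]*&?', r'\1', url)
    # Tidy up a dangling separator left over when the last parameter was removed.
    return url.rstrip('?&#')
```

With this, `clean_url('https://example.com/page?id=42&utm_source=feed')` keeps the legitimate `id=42` argument instead of truncating the whole query string, which is the improvement over the current "drop everything after the pattern" behavior.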
Created separate issue #136 for the selective clean-up (after describing this further, I'm not 100% sure we need to do that; please discuss there).
https://github.com/mro/Shaarli/commit/0e0771ff9d9ab2cbe0939bb20c0104cee8f49839 is how I do it in my installation.
Relates to #141
Relates to #133

Modifications:
- move URL cleanup to `application/Url.php`
- rework the cleanup function to strip:
  - fragments: `#stuff`
  - GET parameters: `?var1=val1&var2=val2`
- add documentation (which APIs the parameters belong to)
- add test coverage

References:
- http://php.net/parse_url
- http://php.net/manual/en/language.oop5.magic.php#language.oop5.magic.tostring

Signed-off-by: VirtualTam <virtualtam@flibidi.net>
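The PR's approach of parsing the URL, filtering out unwanted query parameters, and dropping the fragment can be sketched as follows. This is an illustrative Python version (with `urllib.parse` standing in for PHP's `parse_url`); the prefix list is an example, not the PR's actual list:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Example prefixes of parameters to drop (illustrative only).
ANNOYING_PREFIXES = ('utm_', 'fb_', '__scoop')

def clean_url(url: str) -> str:
    """Rebuild the URL without tracking parameters and without the fragment."""
    parts = urlparse(url)
    # Keep only query arguments whose key does not start with an unwanted prefix.
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if not k.startswith(ANNOYING_PREFIXES)]
    # Re-assemble the URL with the filtered query string and no fragment.
    return urlunparse(parts._replace(query=urlencode(kept), fragment=''))
```

Parsing first and rebuilding afterwards avoids the substring-position pitfalls of the old approach: legitimate arguments survive regardless of where the tracking parameter appears in the query string.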
In index.php, I added the following to clean up various URLs (additions from Facebook, Scoop.it, and various utm parameters): it works well, even if this list is not exhaustive...
```php
// Truncate the URL at the first occurrence of each unwanted pattern.
$i = strpos($url, '?fb_');
if ($i !== false) $url = substr($url, 0, $i);
$i = strpos($url, '?__scoop');
if ($i !== false) $url = substr($url, 0, $i);
$i = strpos($url, '#tk.rss_all?');
if ($i !== false) $url = substr($url, 0, $i);
$i = strpos($url, '?utm_campaign=');
if ($i !== false) $url = substr($url, 0, $i);
$i = strpos($url, '?utm_medium=');
if ($i !== false) $url = substr($url, 0, $i);
```