Remove tracking parameters and redirect to original URL. This Userscript uses the URL Interface instead of RegEx.
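As a rough sketch of what "uses the URL Interface instead of RegEx" means in practice (the names and blacklist entries below are illustrative, not the script's actual code):

```javascript
// Illustrative sketch of URL-interface cleaning: parse the address once,
// delete blacklisted query parameters, re-serialize. No regular expressions.
const blacklist = ['utm_source', 'utm_medium', 'ref'];  // example entries

function cleanURL(href, params) {
  const url = new URL(href);        // throws on invalid input
  for (const p of params) {
    url.searchParams.delete(p);     // no-op if the parameter is absent
  }
  return url.toString();
}
```

For example, `cleanURL('https://example.com/?utm_source=mail&q=test', blacklist)` yields `https://example.com/?q=test`.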
Feedback on CleanURLs (Clean URL Improved)
Also 'dc' and 'ds'.
Thank you for the kind words Dimitri!
> First, instead of @run-at document-end, @run-at document-start.

Please explain why. I have deliberately chosen @run-at document-end in order to have most of the instances of links (i.e. all links that appear at page load) subjected to the functionality of this script. I can use @run-at document-start and then add an event listener at page load (i.e. DOMContentLoaded), as I did with Proxy Redirect.

> instead of "if (url.searchParams.get(blacklist[i]))", I suggest "if (url.searchParams.has(blacklist[i]))" so that if the parameter is empty in the URL, it is still removed.
Thank you. I will test it. Did you test it, yet?
> Then again, instead of "window.history.pushState(null, null, newURL)", I suggest "window.location.replace(newURL)" so that the uncleaned link no longer appears in the history.
Good idea!
> Also 'dc' and 'ds'.
Are these for "blacklist" or "hash"?
First of all, thank you for your reply.
> - Please explain why.
> - I have deliberately chosen @run-at document-end in order to have most of the instances of links (i.e. all links that appear at page load) subjected to the functionality of this script.
> - I can use @run-at document-start and then add an event listener at page load (i.e. DOMContentLoaded), as I did with Proxy Redirect.
I was suggesting @run-at document-start to make sure that the URL was cleaned up before the page was loaded; beyond that I don't have any other reason and, in the end, this request was the most minor. The thought came to me after looking at other scripts that have more or less the same function, but whose authors preferred @run-at document-start.
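For reference, the two approaches discussed above look like this; the metablock follows the standard userscript convention, while the deferred-work pattern is a sketch with the handler body omitted, not the script's actual code:

```javascript
// ==UserScript==
// @run-at document-start
// ==/UserScript==

// With document-start the script runs before the DOM is built, so any work
// that needs the page's links is deferred until the document has loaded:
if (typeof document !== 'undefined') {  // guard so the sketch is inert outside a browser
  document.addEventListener('DOMContentLoaded', () => {
    // clean window.location and/or rewrite link parameters here
  });
}
```

With @run-at document-end, by contrast, the script simply runs once the DOM is ready, at the cost of the dirty URL being visible first.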
> instead of "if (url.searchParams.get(blacklist[i]))", I suggest "if (url.searchParams.has(blacklist[i]))" so that if the parameter is empty in the URL, it is still removed.

> Thank you. I will test it. Did you test it, yet?
Yes, I was able to test it. First in the URL where I detected the problem (initially the empty parameter didn't clear, but if I added characters, it cleared), then by clearing parameters in other URLs. Now the empty parameters disappear. To be sure that the change only affects URL parameters, I tested an Internet search for a parameter name, e.g. "https://www.google.com/search?q=campaign" (so that the string of characters appears in the URL but is not deleted). So far, no problem.
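The difference between the two checks can be seen directly on an empty parameter (a small sketch, not the script's code):

```javascript
// '?utm_source=&id=7' carries an empty utm_source parameter.
const url = new URL('https://example.com/page?utm_source=&id=7');

// .get() returns '' for an empty parameter; '' is falsy, so an
// `if (url.searchParams.get(...))` check never fires and the parameter stays.
const byGet = url.searchParams.get('utm_source');   // ''

// .has() only asks whether the parameter is present, so the check fires.
const byHas = url.searchParams.has('utm_source');   // true
```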
> Then again, instead of "window.history.pushState(null, null, newURL)", I suggest "window.location.replace(newURL)" so that the uncleaned link no longer appears in the history.
This change helped prevent pages from appearing twice in the history: once with the uncleaned URL and once with the cleaned one.
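A minimal sketch of the swap (the blacklist entry and URL are illustrative; the two browser calls are shown as comments since they only run in a page):

```javascript
// Build the cleaned URL as a cleaner script would, then compare the two
// ways of applying it (illustrative example, not the script's actual code).
const dirty = 'https://example.com/page?utm_source=news&id=42';
const u = new URL(dirty);
u.searchParams.delete('utm_source');  // assumed blacklist entry
const clean = u.toString();

// window.history.pushState(null, null, clean); // dirty entry stays in history
// window.location.replace(clean);              // dirty entry is overwritten
```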
> Are these for "blacklist" or "hash"?
These are for "blacklist"; these params appear once in Amazon search URLs.
Thank you for the elaboration. I will try.
How do you want to be mentioned under the metablock "collaborator"? Would アンジルベールディミトリ be preferable, or is there an ASCII string you would want instead?
I am considering making a new list, which would be a strict list for shopping sites, as they have recently begun to utilize the parameter "tag" instead of "ref".
P.S. Please refrain from using domains of Faceboogle et al. I do not like to advertise them.
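Such a strict list could be keyed by hostname; the shape below is a sketch with placeholder domains and entries, not the script's actual structure:

```javascript
// Hypothetical per-site strict lists: parameters such as 'tag' or 'ref'
// are only safe to strip on the shops that use them for tracking, so they
// are looked up by hostname instead of living in the global blacklist.
const strictLists = {
  'shop.example.com': ['tag', 'ref', 'dc', 'ds'],  // placeholder hostname
};

function cleanStrict(href) {
  const url = new URL(href);
  for (const p of strictLists[url.hostname] || []) {
    url.searchParams.delete(p);
  }
  return url.toString();
}
```

URLs on hosts without a strict list pass through unchanged, which keeps the stricter entries from breaking unrelated sites.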
> How do you want to be mentioned under the metablock "collaborator"?
You don't have to mention me, but thanks for the suggestion.
Very good script. However, I'd like to suggest a few changes. First, instead of @run-at document-end, @run-at document-start. Second, instead of "if (url.searchParams.get(blacklist[i]))", I suggest "if (url.searchParams.has(blacklist[i]))" so that if the parameter is empty in the URL, it is still removed.
Then again, instead of "window.history.pushState(null, null, newURL)", I suggest "window.location.replace(newURL)" so that the uncleaned link no longer appears in the history.
Finally, I suggest adding a few parameters: hash: '!psicash' and 'xtor'; blacklist: 'gadid', 'landed', 'Platform' and 'ru'.
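Since entries like 'xtor' and '!psicash' travel in the URL fragment rather than the query string, a "hash" entry would need its own stripping step. The function below is a sketch under that assumption (the name and matching rule are illustrative, not the script's code):

```javascript
// Hypothetical fragment cleaner: drop the hash when it is, or starts with,
// a listed tracking key (e.g. '#xtor=RSS-1' or '#!psicash').
const hashList = ['!psicash', 'xtor'];

function cleanHash(href, list) {
  const url = new URL(href);
  const frag = url.hash.replace(/^#/, '');
  if (list.some((k) => frag === k || frag.startsWith(k + '='))) {
    url.hash = '';                 // removes the '#' as well
  }
  return url.toString();
}
```

Legitimate anchors such as '#section-2' fall through untouched, since only listed keys match.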