Discussions » Opinion about Greasy Fork
There should be a universal domain for library scripts and for downloadURL/updateURL,
like GitHub.com -> raw.githubusercontent.com.
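For comparison, scripts hosted on GitHub already pull libraries through that separate raw domain today, something like this (the repository and file name are made up, just to show the pattern):

// @require https://raw.githubusercontent.com/some-user/some-repo/main/some-library.js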
And what happens when China decides to block this new universal domain?
China has blocked github.com but not raw.githubusercontent.com
Static sites would not get blocked.
update.greasyfork.org is static and got blocked. There's no rule book for what the CCP does.
I would recommend that authors not use Greasy Fork as a CDN for their @requires, but instead use actual CDNs.
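For example, a well-known library can be required straight from a public CDN such as jsDelivr (the package and version here are only an illustration):

// @require https://cdn.jsdelivr.net/npm/jquery@3.7.1/dist/jquery.min.js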
The CCP blocks the top-level domain "greasyfork.org", so "update.greasyfork.org" also gets blocked.
Now we have 3 domains - greasyfork.org, sleazyfork.org, cn-greasyfork.org
The sleazyfork vs. greasyfork split is there to separate users who are over 18 from those who are not.
The cn-greasyfork vs. greasyfork split is there to separate users in Mainland China from everyone else.
However, since install hits are already counted per IP (one hit per user per 24 hours), you could also set up a universal domain. It would have no discussion or login features; it would purely serve CDN-like content for scripts.
Scripts could then all reference that one domain without worrying about blocks on greasyfork.org / sleazyfork.org / cn-greasyfork.org.
This domain would be used for
// @require https://{{universalDomain}}/scripts/xxxxxx/yyyyyyy/{{LibraryScript}}.js
// @downloadURL https://{{universalDomain}}/scripts/xxxxxx/{{UserJS}}.user.js
// @updateURL https://{{universalDomain}}/scripts/xxxxxx/{{UserJS}}.user.js
https://{{universalDomain}}/
https://{{universalDomain}}/scripts/
https://{{universalDomain}}/scripts/xxxx/
would each be just a plain blank HTML page. robots.txt should be set to disallow all crawlers. The same might also apply to the XXXX.json endpoints.
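For the crawler part, a minimal robots.txt that blocks all search-engine indexing on such a domain would just be:

User-agent: *
Disallow: /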