
Discussions » Greasy Fork Feedback

Whether to allow compressed code from open-source projects on GitHub

§
Posted: 23.10.2020

I've created a project on GitHub that is bundled with tools like webpack for use in Tampermonkey. I'd like to publish the script produced by npm run build on Greasy Fork, but the community rules don't allow publishing obfuscated code. I wonder:

  • Is this permitted?
  • If so, are there technical means to ensure that the published code is identical to the code built from the GitHub project, so that it can be verified as safe?

wOxxOm (Mod)
§
Posted: 23.10.2020

This rule against minified/obfuscated code is for the users of the script so they can read and inspect the code when they install it. They won't be compiling it. You can produce the build in dev mode or otherwise customize it to disable minification/mangling.
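
For reference, a minimal webpack sketch of the second option — keeping production optimizations but turning the minifier off. The entry and output names below are placeholders:

    // webpack.config.js — minimal sketch; entry and output names are placeholders
    const path = require('path');

    module.exports = {
      mode: 'production',            // keep tree shaking and other production optimizations
      entry: './src/index.js',
      output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'example.user.js',
      },
      optimization: {
        minimize: false,             // no Terser pass: identifiers and formatting stay readable
      },
      devtool: false,                // avoid eval()-wrapped modules in the output
    };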

§
Posted: 27.07.2021

Tree shaking cannot be used in development mode. Moreover, if I use React, webpack generates more than 29,000 lines of code to bundle React in development mode, because in that mode React uses react.development.js instead of react.production.min.js.
> "By default, React includes many helpful warnings. These warnings are very useful in development. However, they make React larger and slower so you should make sure to use the production version when you deploy the app."

By configuring webpack I can use react.production.min.js without compressing the rest of my code, which reduces the line count from more than 29,000 to about 500, but that puts compressed code into the script, which violates the rule. By the way, because I need async import, there is no way to simply @require it.

I don't think any user will read and check more than 29,000 lines of code when installing a script, and this isn't just React's problem: some large npm packages also need tree shaking to remove redundant code and improve performance. Simple scripts only need a single JS file for users to understand them, but once a script becomes complex, I'm sure users would rather go straight to GitHub to read the code.

So I was wondering whether an @antifeature value could be added to tell users that the script code is compressed and that the source should be viewed on GitHub?
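
For illustration, this is roughly what such a metadata entry could look like. Note that "minified-code" is a hypothetical value sketching this proposal, not an @antifeature type Greasy Fork currently recognizes:

    // ==UserScript==
    // @name         Example Script
    // @version      1.0.0
    // @match        https://example.com/*
    // "minified-code" is a hypothetical value illustrating the proposal above,
    // not an existing @antifeature type.
    // @antifeature  minified-code The published code is bundled/minified; readable source is on GitHub
    // ==/UserScript==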

§
Posted: 27.07.2021

Regarding security: if an ordinary script is updated with harmful code, you still have to wait until a victim reports it, and it is only dealt with after the report is verified. For complex scripts, nobody is going to spot the problem by reading the code directly; in terms of readability there is no difference between complex code and compressed code.

So I think it is only necessary to require that the published code be the same as the code posted on GitHub. If someone finds that the code differs, they can simply report it. Not having the corresponding code on GitHub would also count as a violation.
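
As a rough illustration of how such a check could work — assuming the project's npm run build is deterministic, and with placeholder paths and URLs:

    // verify-build.js — a sketch, not a Greasy Fork feature.
    const crypto = require('crypto');
    const fs = require('fs');
    const https = require('https');

    const LOCAL_BUILD = 'dist/example.user.js';                     // placeholder: your build output
    const PUBLISHED_URL = 'https://example.com/published.user.js';  // placeholder: the published script

    const sha256 = (buf) => crypto.createHash('sha256').update(buf).digest('hex');

    https.get(PUBLISHED_URL, (res) => {
      const chunks = [];
      res.on('data', (chunk) => chunks.push(chunk));
      res.on('end', () => {
        const published = sha256(Buffer.concat(chunks));
        const local = sha256(fs.readFileSync(LOCAL_BUILD));
        console.log(published === local ? 'Builds match' : 'Builds differ');
      });
    });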

§
Posted: 30.07.2021

Could Greasy Fork provide its own compression tool? If so, it could offer a "multi-version installation" feature like OpenUserJS: developers would have to submit uncompressed, unobfuscated original code, and GF would automatically generate a minimized version for users; at install time, users could choose either the minimized version or the original.
In that case the 2MB size limit could also be kept, but applied to the GF-minimized version rather than the original code.
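
A rough sketch of what such a server-side step could look like if GF (hypothetically) ran terser over the submitted original, keeping the metadata block untouched:

    // Hypothetical GF-side step: minify the submitted original with terser,
    // leaving the // ==UserScript== metadata block intact.
    const { minify } = require('terser');
    const fs = require('fs');

    async function buildMinimizedVersion(originalPath, minifiedPath) {
      const original = fs.readFileSync(originalPath, 'utf8');
      // Assumes the file has a metadata block ending in ==/UserScript==
      const end = original.indexOf('==/UserScript==') + '==/UserScript=='.length;
      const header = original.slice(0, end);
      const body = original.slice(end);

      const result = await minify(body, { compress: true, mangle: true });
      fs.writeFileSync(minifiedPath, header + '\n' + result.code);
    }

    buildMinimizedVersion('original.user.js', 'original.min.user.js');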

§
Posted: 30.07.2021
Edited: 30.07.2021

@PYUDNG

Great idea! I totally agree.

2MB isn't enough for scripts that are really good, popular and useful!

We live in 2021; websites shouldn't have such low limits nowadays. If this were 1990 I would understand it, but I can't see why Greasy Fork has such a low limit.

JAG
§
Posted: 31.07.2021

You can "require" github scripts directly.

https://www.jsdelivr.com/?docs=gh
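
For example, a library hosted in a GitHub repo can be pulled in like this; the user/repo, tag and file path are placeholders:

    // ==UserScript==
    // @name     Example
    // @version  1.0
    // @match    https://example.com/*
    // user/repo@1.2.3 and the file path below are placeholders.
    // @require  https://cdn.jsdelivr.net/gh/user/repo@1.2.3/dist/library.min.js
    // ==/UserScript==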

What I wanted to ask is whether the "no-minify" rule applies to required libs. I don't think so, as they're not hosted on GF directly. So how much of the functionality must be in the readable script for it to be allowed on GF?

> once the script becomes complex, I'm sure users would prefer to go directly to github when reading the code.

I could agree with that, but user-scripts are by design single-file and not all scripts are on GH. It's certainly possible to install a user-script directly from GH. Why publish the file on GF if already on GH, one may ask. What's the value for GF? Extending antifeature is an interesting approach if there is an interest to meet in the middle.

That said, I personally think very large user-scripts may be a mistake.

> users can choose to install the minimized version or the original version.

Publish the min-version on GH and let users decide which to install. The local script manager doesn't care much. You can safely set downloadURL to GitHub.
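
In the metadata block that could look like this; the repository path is a placeholder:

    // ==UserScript==
    // @name         Example
    // @version      1.0.0
    // @match        https://example.com/*
    // The repository path below is a placeholder.
    // @downloadURL  https://github.com/user/repo/releases/latest/download/example.min.user.js
    // @updateURL    https://github.com/user/repo/releases/latest/download/example.min.user.js
    // ==/UserScript==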

> I need to async import, there is no way to @require directly.

I'm trying to understand that need. What is the GitHub link to your project, @chjjsjz3? If you can split the needed React bits into separate files, they could be loaded from your GH.

> Can GreasyFork provide its own compression tool?

jsDelivr seems to have that, but I doubt it can do additional tree shaking, so it would not solve that problem.

> 2MB isn't enough for scripts that are really good, popular and useful!

I think there are popular and useful scripts on GF already, so that argument is somewhat weak.

> We live in 2021

There are other techniques than user-scripts; you might consider alternatives if your goal is to be modern and not limited by the rules of GF.

§
Posted: 31.07.2021

@JAG

I have a script that uses async import — github. It's a script for reading comics, and I can explain why I need async import. Also, it was written a long time ago, so the code is very messy — please don't look at the code too closely; I'm already refactoring it.

My script creates a node on the page and injects a Vue component, then collects the page's image data and hands it to the Vue component for rendering. If a comic site displays all of its images directly on the current page, my script can easily collect all the image data; otherwise I have to write site-specific code so the script can get all of it.

So my script runs on all pages: on supported pages it runs the corresponding code, while on pages outside the supported range it only adds an "enable reading mode" option to the menu. The script creates the node and injects the Vue component only after the user clicks that option.

This implementation lets the script be used not only on comic sites but also directly on most web forums, blogs, SNS and image sites, without me having to adapt each one individually. However, since it runs on all pages, async import has to be used to reduce the runtime cost on unrelated pages.

I currently fetch the external library code from a CDN via @resource and then run it with eval(GM_getResourceText('name')). On some websites this errors because of the CSP, but I haven't come across such a site yet :)
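
A minimal sketch of that pattern — the Vue URL is only an example of a CDN-hosted library, and the actual mount logic is omitted:

    // ==UserScript==
    // @name      Comic reader (sketch)
    // @match     *://*/*
    // @grant     GM_getResourceText
    // @grant     GM_registerMenuCommand
    // The Vue URL below is just an example of a CDN-hosted library.
    // @resource  vue https://cdn.jsdelivr.net/npm/vue@2.6.14/dist/vue.min.js
    // ==/UserScript==

    // Evaluate the library only when the user asks for reading mode, so
    // unrelated pages pay almost no runtime cost. Note that eval() can
    // fail on pages with a strict CSP, as described above.
    GM_registerMenuCommand('Enable reading mode', () => {
      eval(GM_getResourceText('vue'));
      // ...create the mount node and inject the Vue component here...
    });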


> Why publish the file on GF if already on GH, one may ask. What's the value for GF?

For me, GF counts installs and update checks for my scripts, which is a feature GitHub doesn't have. Knowing how many people are using my scripts gives me more motivation to maintain and update them.


> That said, I personally think very large user-scripts may be a mistake.

I very much agree with you, provided external libraries aren't counted. Once you pull in front-end frameworks, third-party components, JSS, state-management libraries and so on, the project size balloons immediately. That's why I strongly support extending @antifeature: it doesn't hurt GF, but it makes it easier for developers to build new scripts and gives users more scripts to choose from.

§
Posted: 31.07.2021
Edited: 31.07.2021
> I could agree with that, but user-scripts are by design single-file and not all scripts are on GH. It's certainly possible to install a user-script directly from GH. Why publish the file on GF if already on GH, one may ask. What's the value for GF?

That's the point we should focus on. On this I agree with Tampermonkey's summary, which puts it well:

> GreasyFork is maybe the most popular userscript hoster. It has many scripts in its inventory and is created by Jason Barnabe, the author of Stylish.

That means the value of GF is:

  1. For users, it's easy to find a script that meets their needs and to inspect its code [Value #1]
  2. For developers, it's easy to learn, manage (develop, publish, upgrade, delete) and discuss userscripts [Value #2]

In short, Greasy Fork is a place to conveniently use and develop good userscripts. All we need to do is advance it in every respect: all of GF's features and rules should work toward these values, or at least not block them.

With these values in mind, I suggest:

  • The rule against code compression serves [Value #1], so it should be kept
  • The 2MB size limit constrains [Value #2], so it should at least be adjusted to fit the rapidly growing need for code size

What do you think?

§
Posted: 01.08.2021

> For users, it's easy to find a script that meets their needs and to inspect its code [Value #1]

> The rule against code compression serves [Value #1], so it should be kept

I don't want to repeat the problems I ran into: URL

TL;DR:

  • I must use async import
  • I must use React
  • I have to comply with the rules of Greasy Fork

On top of this:

  • I want users to get the production version of React instead of the development version, so they don't pay a runtime cost for development-only features they can't use, and aren't burdened with extra review effort from redundant code.
  • I wish developers could use tree shaking to remove redundant code, for the same reason as above
  • I don't want users to have to review tens of thousands of lines of code just because the developer used React or some other external library

The three points above limit [Value #1].

§
Posted: 01.08.2021

I am not against the rule prohibiting compressed code; on the contrary, I agree with it. It's just that in some cases this rule obviously cannot achieve the purpose it is meant to achieve.

§
Posted: 01.08.2021

As I understand it, the main problem is that some scripts are built with a bunch of npm packages, and it's the inclusion of these packages, rather than the "original content" of the script, that takes it over the limit.

I'm not an expert on JS packaging, but I wonder:

1. Are there build settings that would have the packages included from a CDN rather than included in the script?
2. Does Tampermonkey or others have any special logic to deal with included packages?
3. Is the output of the build process such that a block of code could be read by Greasy Fork and determined to be a specific package? For example, if you included react in your script, could Greasy Fork detect that and hide it by default when viewing the source?

§
Posted: 01.08.2021
Edited: 01.08.2021

At least for me the problem isn't npm packages; the problem is the script itself. Once it has about 30,000 lines of code it is effectively "banned" from Greasy Fork, since it can't be updated with any more lines of code without ending up above 2MB.

Amazing, helpful and useful scripts — the most popular ones — are usually huge. Recently MAL-Sync was deleted from GF because it had to use minification to keep being updated: with 31,233 lines of code it is bigger than 2MB without minification. Now its devs are having a hard time figuring out how to shrink the 30,000+ lines of code this script has without minifying.
https://github.com/MALSync/MALSync/releases/latest/download/malsync.user.js

Another popular script that looks like it will soon reach the needlessly low 2MB limit is this one: https://greasyfork.org/en/scripts/33522-super-preloaderplus-one-new/code
This is just an example of how scripts can grow past 30,000 lines and likely exceed 2MB.

The limit should be increased.

§
Veröffentlicht: 01.08.2021

I find it hard to believe that there's any script out there that has 2MB of original JavaScript. Looking through your examples:

https://github.com/MALSync/MALSync/releases/latest/download/malsync.user.js

This seems to include Vue.js and a bunch of CSS.

https://greasyfork.org/en/scripts/33522-super-preloaderplus-one-new/code

Multiple included libraries.

It's not wrong to do these things but it's important to understand the reason why scripts are hitting the limit so we can come up with the best solution.

§
Posted: 02.08.2021
Edited: 02.08.2021

@JasonBarnabe

Thanks for the reply!
Yes I agree with that too.

If it's possible to increase the limit, it could help the devs of MAL-Sync and probably everyone else who commented on this topic, at least.
I think the only thing that would make it possible for MAL-Sync and other scripts heavier than 2MB to be on Greasy Fork is raising the limit on how many MB a script can have.
But I'm fine with being totally wrong and there being better ways to do it; I'd be happy with anything implemented that allows MAL-Sync and more scripts to be on GF.

JAG
§
Posted: 06.08.2021

My engagement in this topic is purely out of interest. I have no specific agenda, but basically I agree that any 2MB script would already be hard to review, and if the problem is bloated libs, they should be kept separate.

@JasonBarnabe I'm not an expert on JS packaging, but this is my take (referring to webpack, but there are probably similar options in the alternatives):

> 1. Are there build settings that would have the packages included from a CDN rather than included in the script?

The basic way is to just include them as externals loaded through "require". https://webpack.js.org/configuration/externals/
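
A minimal sketch of that setup — the bundle reads the library from a global that an @require line provides; the React URLs are just examples:

    // webpack.config.js — sketch of the externals approach
    module.exports = {
      // ...entry/output/mode as usual...
      externals: {
        react: 'React',          // `import React from 'react'` resolves to window.React
        'react-dom': 'ReactDOM',
      },
    };

    // The userscript header then loads the official UMD builds, e.g.:
    //   // @require https://cdn.jsdelivr.net/npm/react@17/umd/react.production.min.js
    //   // @require https://cdn.jsdelivr.net/npm/react-dom@17/umd/react-dom.production.min.js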

One problem people report with that approach is that they don't want to use the big "official dist" for libs (e.g. react), but tree-shake it to load and parse only the relevant parts. I think this can be achieved by code-splitting and then publishing the "shaken-for-my-project" lib for download (through require) on e.g. github.

https://webpack.js.org/guides/code-splitting/
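
In webpack terms that boils down to a dynamic import(), which is emitted as a separate chunk whose hosting location is configured via output.publicPath. In this sketch, './reader' and mountReader are hypothetical names:

    // Load the heavy UI only on demand; webpack emits it as a separate chunk.
    // './reader' and mountReader are hypothetical names for this sketch.
    GM_registerMenuCommand('Enable reading mode', async () => {
      const { mountReader } = await import(/* webpackChunkName: "reader" */ './reader');
      mountReader(document.body);
    });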

I'm thinking the rules might even be ok with minifying (compressing) those libs as long as they are version controlled and referring to a fairly stable version. Is it? Would it help if the non-minified version is published and jsDelivr does the minifying?

Another complaint seems to be that they don't want to use "require" but dynamic loading (async). I haven't seen this explained fully, but for me that would still mean that you want the libs separate from the main script. This is a slightly separate question, but the guidelines around what is allowed and how to do it best could be improved through the same discussion. It would be nice if there was a standard way to mark a "require" as async, but that would be a matter for the loader more than for GF.

> 2. Does Tampermonkey or others have any special logic to deal with included packages?

I'm not sure what you're looking for? Could you explain how that would relate to the problem? TM is not open source, but perhaps the maintainer could be interested in this topic if we have ideas.

Reading the docs I'd say there is some logic to consider. https://www.tampermonkey.net/documentation.php?ext=dhdg#_require

  1. Required libs are prepended to the user-script and the whole thing runs in a sandbox of sorts.
  2. A resource is cached locally and updated separately from running the script.
  3. Instructions to TM can be appended to the URL, e.g. #sha256=23456 (see the sketch below).
  4. tampermonkey: — some libs are already loaded and can be reused by scripts (not so sure about this part).

Since the resource is already local it should not be that bad for performance, but it is still loaded and parsed each time. Some seem to indicate that the biggest problem with a static require is needlessly loading on (sub)pages where the script would not be useful at all. Perhaps these hints could help reduce that problem: https://www.tampermonkey.net/faq.php?ext=dhdg#Q401 There might be possible extensions to the "exclude" mechanism, but that would be another discussion.
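
Regarding point 3, a subresource-integrity hint could look like this; the hash value is a placeholder, not the real digest of that file:

    // ==UserScript==
    // The sha256 value below is a placeholder, not the real digest of the file.
    // @require https://cdn.jsdelivr.net/npm/jquery@3.6.0/dist/jquery.min.js#sha256=<hash-of-the-file>
    // ==/UserScript==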

> 3. Is the output of the build process such that a block of code could be read by Greasy Fork and determined to be a specific package? For example, if you included react in your script, could Greasy Fork detect that and hide it by default when viewing the source?

It probably is possible, and it might help review of the code, but I don't think it would be a reason to question the current limit. And it would be hard for GF to verify that the hidden code was indeed safe (and equivalent to the official code) so it would possibly be worse than not guiding the user.

Existing bloated scripts that are close to deletion have probably not been actively reviewed as it is, so giving them a case-by-case exemption would, I think, be a simpler way to deal with it. If they have a GitHub repo with well-maintained source code, that goes a long way for me. Getting in touch with such a script's authors and discussing why it's not possible to "require" the libs would be a better start, though.

§
Posted: 13.08.2021
> The basic way is to just include them as externals loaded through "require". https://webpack.js.org/configuration/externals/

Thanks for the info, I've added this as a suggestion in the rules.

> I'm thinking the rules might even be ok with minifying (compressing) those libs as long as they are version controlled and referring to a fairly stable version. Is it? Would it help if the non-minified version is published and jsDelivr does the minifying?

A minified version on a CDN is acceptable.

> I'm not sure what you're looking for? Could you explain how that would relate to the problem? TM is not open source, but perhaps the maintainer could be interested in this topic if we have ideas.

Just whether TM "folds" known libraries. Sounds like no.
