You need to install an extension such as Tampermonkey, Greasemonkey, or Violentmonkey before you can install this script.
Collect and filter links, then copy them to your clipboard with a single button click. The script keeps links containing "(USA)" and excludes "(demo)" and "(kiosk)", and is fairly easy to modify.
I like batch downloading things rather than doing them manually. Who doesn't like seeing their storage fill with the things they want (even if they'll never use them, lol)? So I "made" this.
The code is fairly easy to modify. Press the button that appears at the bottom left and the script will copy all matching links to your clipboard: it keeps links containing "(USA)" and excludes "(demo)" and "(kiosk)". Paste the result into a .txt file and use your favorite downloader to fetch the files.
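The filtering described above boils down to a few string checks. Here is a minimal sketch of that logic (the function name and link list are my own illustration, not the actual userscript code):

```javascript
// Keep only links whose text contains "(USA)" while excluding
// anything marked "(demo)" or "(kiosk)". To change what gets
// collected, edit these include/exclude substrings.
function filterLinks(links) {
  return links.filter((text) =>
    text.includes("(USA)") &&
    !text.includes("(demo)") &&
    !text.includes("(kiosk)")
  );
}
```

Swapping "(USA)" for "(Europe)", or adding more exclusions, is a one-line change.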
For those who don't want to use a downloader with that functionality, here is a .bat script.
Put this in the same location as a urls.txt file and run the .bat file. It will download all URLs in urls.txt into a folder it creates, called "urls.txt downloads", within the same directory.

@echo off
:: Define the input file and download folder
set "inputFile=urls.txt"
set "outputDir=%~dp0urls.txt downloads"

:: Check if urls.txt exists
if not exist "%inputFile%" (
    echo Error: "%inputFile%" not found in the current directory.
    pause
    exit /b
)

:: Create the download folder if it doesn't exist
if not exist "%outputDir%" mkdir "%outputDir%"

:: Download each URL from urls.txt (-L follows redirects,
:: %%~nxu takes the filename from the last URL segment)
echo Starting downloads...
for /f "usebackq delims=" %%u in ("%inputFile%") do (
    echo Downloading: %%u
    curl -L -o "%outputDir%\%%~nxu" "%%u"
)

echo All downloads completed. Files saved in "%outputDir%".
pause
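If you're on macOS or Linux rather than Windows, the same download loop can be sketched in shell (the function name and folder layout below are my own assumptions, not part of the script):

```shell
#!/bin/sh
# download_list FILE DIR: download every URL listed in FILE into DIR,
# one URL per line, mirroring what the .bat script above does.
download_list() {
  list=$1
  dir=$2
  mkdir -p "$dir"
  while IFS= read -r url; do
    # skip blank lines; -L follows redirects, -f fails on HTTP errors,
    # and the output filename is taken from the last URL segment
    [ -n "$url" ] && curl -fsSL -o "$dir/$(basename "$url")" "$url"
  done < "$list"
}
```

For example, `download_list urls.txt downloads` reads urls.txt and saves everything into a `downloads` folder next to it.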