Myrient Collect Links with Filters

Collects and filters links, then copies them to your clipboard with a single button click. Filters for links containing "(USA)" and excludes "(demo)" and "(kiosk)". The code is fairly easy to modify.

Author
Dethkiller15
Daily installs
0
Total installs
7
Ratings
0 0 0
Version
1.0
Created
18/12/2024
Updated
18/12/2024
Size
2.24 KB
License
MIT
Applies to

I like batch downloading things instead of doing them manually. Who doesn't like seeing their storage fill up with the things they want? (Even if they'll never use them, lol.) So I "made" this.

The code is fairly easy to modify. Press the button that appears in the bottom left and it will copy all the links to your clipboard. It filters for links containing "(USA)" and excludes "(demo)" and "(kiosk)". Paste that into a .txt file and use your favorite downloader to get the files.
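The filtering step can be sketched like this (a minimal sketch, not the script's actual code; the `collectLinks` helper name and case-insensitive matching are assumptions):

```javascript
// Sketch of the link-filtering step: keep hrefs containing "(USA)",
// drop any that also contain "(demo)" or "(kiosk)". Matching is
// case-insensitive here; the real script may differ.
const INCLUDE = "(USA)";
const EXCLUDE = ["(demo)", "(kiosk)"];

function collectLinks(hrefs) {
  return hrefs.filter((h) => {
    const lower = h.toLowerCase();
    return (
      lower.includes(INCLUDE.toLowerCase()) &&
      !EXCLUDE.some((x) => lower.includes(x))
    );
  });
}

// In the userscript itself, the hrefs would come from the page, e.g.:
//   const hrefs = [...document.querySelectorAll("a")].map((a) => a.href);
//   navigator.clipboard.writeText(collectLinks(hrefs).join("\n"));
```

To grab a different region, change `INCLUDE` to, say, "(Europe)", or add more tokens to `EXCLUDE`.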

For those who don't want to use a downloader with that functionality, here is a .bat script.

Put this .bat file in the same directory as a urls.txt file and run it. It will download every URL listed in urls.txt into a folder it creates, called "urls.txt downloads", in that same directory.
@echo off
setlocal

:: Define the input file and download folder
set "inputFile=urls.txt"
set "outputDir=%~dp0urls.txt downloads"

:: Check if urls.txt exists
if not exist "%inputFile%" (
    echo Error: "%inputFile%" not found in the current directory.
    pause
    exit /b 1
)

:: Create the download folder if it doesn't exist
if not exist "%outputDir%" (
    mkdir "%outputDir%"
)

:: Download each URL from urls.txt, naming each file after the
:: last path segment of its URL
echo Starting downloads...
for /f "usebackq delims=" %%u in ("%inputFile%") do (
    echo Downloading: %%u
    curl -L -o "%outputDir%\%%~nxu" "%%u"
)

echo All downloads completed. Files saved in "%outputDir%".
pause