cURL - Download List Of URLs From File - In Windows!


cURL does not support reading a list of URLs straight from a file,
but you can read the file's content into one line,
then inject the values into the cURL command line,
effectively downloading the entire list of URLs.
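
For example, say urls.txt holds one URL per line (these addresses are made up, purely for illustration):

https://example.com/one.txt
https://example.com/two.txt
https://example.com/three.txt

The script below effectively turns that into one single command, along the lines of:

curl.exe [options] "https://example.com/one.txt" "https://example.com/two.txt" "https://example.com/three.txt"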

The example below will actually append the entire text content of the URLs into a single file.
If you don't want that, and you simply want to download all of the links,
simply remove the --output "%OUTPUT%" ^ line from the script.
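
If you would rather have every URL saved to its own file, named after the remote file name, one variation worth sketching is curl's --remote-name-all option, which treats every URL on the command line as if -O / --remote-name had been given for it (note that with no output option at all, curl simply prints the downloaded bodies to the console). Reusing the %URLS_REFORMAT% list that the script below builds, the call would look roughly like this:

call curl.exe --location-trusted ^
--remote-name-all ^
--user-agent "Mozilla/5.0 Chrome" ^
--verbose ^
%URLS_REFORMAT%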


@echo off
setlocal EnableDelayedExpansion

set URLS=urls.txt
set OUTPUT=output.txt

:: build fully qualified (8.3 short) paths relative to the script's folder,
:: so spaces in the path can't break anything and the script runs from anywhere
set URLS=%~dp0%URLS%
set OUTPUT=%~dp0%OUTPUT%
for /f "delims=" %%a in ("%URLS%") do ( set "URLS=%%~fsa" )
for /f "delims=" %%a in ("%OUTPUT%") do ( set "OUTPUT=%%~fsa" )

:: read the file's content (one URL per line) and reformat it into a single
:: space-separated, "-wrapped list; delayed expansion (!VAR!) is needed so
:: the list keeps growing on every pass of the loop
set URLS_REFORMAT=
for /f "usebackq delims=" %%i in ("%URLS%") do ( set URLS_REFORMAT=!URLS_REFORMAT! "%%i" )

call curl.exe --anyauth ^
--http2 ^
--insecure ^
--ipv4 ^
--location-trusted ^
--output "%OUTPUT%" ^
--ssl-allow-beast ^
--ssl-no-revoke ^
--user-agent "Mozilla/5.0 Chrome" ^
--verbose ^
%URLS_REFORMAT%
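
To try it out, save the script as, say, download-urls.cmd (the name is just an example) in the same folder as urls.txt and run it from any command prompt:

download-urls.cmd

Thanks to %~dp0, the script locates urls.txt and output.txt next to itself no matter which directory you start it from, and because of --verbose curl prints the request and response headers for every URL as it works through the list, which makes it easy to spot a link that failed.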