Download files from flaky websites using wget

less than 1 minute read

I hate slow downloads. Even more so when I’m downloading big files (e.g. copying a huge database over from AWS S3). It hadn’t occurred to me until recently that I could use wget for that purpose.

wget -c -O name_of_big_file.format url_of_big_file

The -c (--continue) flag is what makes wget handy here: if the connection drops partway through, re-running the same command resumes the download from where it left off instead of starting over.
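For a truly flaky server you can go further and have wget retry on its own. A sketch of a more defensive invocation (the URL and filename are placeholders):

```shell
# Resume partial downloads (-c) and keep retrying when the
# connection drops; the URL below is a placeholder.
wget -c \
     --tries=20 \
     --waitretry=10 \
     --retry-connrefused \
     -O name_of_big_file.format \
     "https://example.com/name_of_big_file.format"
```

--tries sets how many attempts wget makes, --waitretry adds a pause between retries, and --retry-connrefused retries even when the server actively refuses the connection.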

Also, if you can’t be bothered constructing a wget command for each file you download, you can use the CurlWget plugin for Google Chrome.