Depending on your server/workstation OSes, rsync, sftp, and lftp can all resume an interrupted download.
rsync --partial -e ssh <myUser>@<myServer>:/<pathToRemoteFile>/<fileName> /<pathToLocalFile>/<fileName>
(--partial keeps the half-transferred file instead of deleting it, so re-running the same command picks up where it left off; -P gives you --partial plus a progress meter.)
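Since a flaky link may drop more than once, it can help to wrap the rsync in a retry loop - a minimal sketch, assuming a POSIX shell (the host and paths in the comment are placeholders, same as above):

```shell
#!/bin/sh
# retry N CMD...  -- run CMD up to N times, pausing between attempts,
# until it exits 0. Returns 1 if every attempt fails.
retry() {
  n=$1; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    if [ "$i" -ge "$n" ]; then
      return 1
    fi
    sleep 1   # brief back-off before the next attempt
  done
}

# Because of --partial, each retry resumes the same file rather than
# starting over (placeholder user/host/paths):
# retry 5 rsync --partial -e ssh myUser@myServer:/path/file /local/path/file
```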
sftp user@server.ip
reget yourfile.name
quit
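The same sftp session can run non-interactively with a batch file (sftp -b) - a sketch, assuming key-based auth is set up and using the placeholder names from above:

```shell
#!/bin/sh
# Write the sftp commands to a batch file; reget resumes the
# partially-downloaded file instead of starting from zero.
cat > resume.batch <<'EOF'
reget yourfile.name
quit
EOF

# -b runs the batch file; it needs non-interactive auth (ssh keys),
# since batch mode won't prompt for a password.
# sftp -b resume.batch user@server.ip
```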
lftp can also continue a partial transfer (get -c / pget -c); see the man page: http://lftp.yar.ru/lftp-man.html
I have a Mac workstation and Linux servers. I sometimes create a symlink from the web root to a directory containing the files I want to transfer to my Mac or to another server.
On the machine where I want to acquire the files, create a text file consisting of the URLs to those files - one line per file. It looks like:
https://server/mydownloaddirectory/somelargefile1.zip
https://server/mydownloaddirectory/somelargefile2.zip
I call that file getlist.
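If there are many files, the getlist file can be generated instead of typed by hand - a sketch, reusing the placeholder URL and file names from the example above:

```shell
#!/bin/sh
# Build getlist from a base URL plus a list of file names.
# Base URL and names are the hypothetical ones from the example.
base=https://server/mydownloaddirectory

: > getlist   # start with an empty file
for f in somelargefile1.zip somelargefile2.zip; do
  printf '%s/%s\n' "$base" "$f" >> getlist
done
```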
Then, on the Mac or other server where I want the files, cd into the target directory and run:
wget -i getlist
wget reads the getlist file one URL at a time and acquires the files. Adding -c (--continue) makes wget resume any partially downloaded files if the transfer is interrupted.
'SoS', Ken