It took me a while to understand backups. Backups to the default location can be managed only through the Moodle GUI and are suited to Moodle users. Backing up to a specified folder is for a sysadmin, as those backups don't appear in the GUI: a trade-off between technical and non-technical users. With the designated folder, it is assumed you know what you are doing.
wget is very useful, but leaving backup files where it can reach them is a bad idea. Given only the Moodle site URL, the .mbz extension and wget's recursive option, all the backup data can be scraped from the site, extracted and read by anyone.
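To make the risk concrete, this is roughly the sort of command an outsider could run; the hostname is a placeholder, not a real site, and it assumes the backups are served from under the web root:

```shell
# Recursive crawl, no-parent, accepting only Moodle backup files.
# moodle.example.com stands in for any site that exposes its backups.
wget -r -np -A '*.mbz' https://moodle.example.com/
```

Nothing here requires credentials, which is the whole problem: if the web server can see the .mbz files, so can wget.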
Using FTP is better because the backups can be stored outside the /var/www/html tree, perhaps in a /home/user folder, where the web server cannot serve them. FTP clients authenticate with passwords, so it is a much more secure arrangement.
But an FTP download through FileZilla is a manual operation.
Every Moodle production site needs daily, automated, off-server backups. Keeping Moodle and all its backups on one server is too risky, and so is relying on people to remember to copy files.
I like the Dropbox solution because Dropbox is never turned off and never goes to sleep, and the pipes between my VPS datacentre and Dropbox are much faster and more reliable than any home connection. And it is free. I still have some configuring to do: although I automatically delete backups older than a week on my server with
find /home/user/backup/ -type f -name '*.mbz' -mtime +7 -exec rm {} \;
I still have to delete the old files on Dropbox manually.
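For anyone unsure what that `find` invocation does, here is a self-contained demonstration of the retention rule against a scratch directory (`backup_demo` is a throwaway path, not my real /home/user/backup/; `touch -d` is GNU touch, as found on Linux):

```shell
# Simulate one fresh backup and one stale backup.
mkdir -p backup_demo
touch backup_demo/recent.mbz
touch -d '10 days ago' backup_demo/old.mbz   # backdate past the 7-day window

# Same rule as on the server: delete .mbz files older than 7 days.
find backup_demo/ -type f -name '*.mbz' -mtime +7 -exec rm {} \;

ls backup_demo/   # only recent.mbz should remain
```

Dropping that line into the root crontab (e.g. `30 3 * * *` for 03:30 daily) is what keeps the server-side copies pruned without any human involved.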
A home computer could be used for offsite backup, as I did last year using rsync. The computer wouldn't even have to run all day, as rsync can be scheduled multiple times so a missed sync wouldn't matter much: each run just picks up whatever has changed. To set that up, SSH public/private key authentication is required so rsync can connect to the remote server without a password. On Linux it is easy, but I have never tried it on other operating systems.
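The setup above can be sketched in a few commands. This is only an outline: `user`, `vps.example.com` and the paths are placeholders for your own details, and the key setup is a one-time step run from the home machine:

```shell
# One-time: generate a key pair and install the public half on the VPS.
ssh-keygen -t ed25519 -f ~/.ssh/id_backup -N ''
ssh-copy-id -i ~/.ssh/id_backup.pub user@vps.example.com

# Repeatable sync: pull only .mbz files, skipping anything already copied.
rsync -avz -e 'ssh -i ~/.ssh/id_backup' \
    --include='*.mbz' --exclude='*' \
    user@vps.example.com:/home/user/backup/ ~/moodle-backups/
```

A crontab entry such as `0 */6 * * *` in front of that rsync line runs it four times a day, which gives the home machine plenty of chances to catch up after being switched off.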