An alternative is wget2AUR, a multi-threaded download application that can significantly improve download speed.
Configuration is performed in
/etc/wgetrc. The default configuration file is well documented, and altering it is seldom necessary. See the man page for more intricate options.
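For example, a few commonly adjusted options could look like this (the values here are purely illustrative):

```
# Illustrative /etc/wgetrc overrides; the shipped defaults are usually fine.
# Retry failed downloads up to three times.
tries = 3
# Give up on a stalled connection after 15 seconds.
timeout = 15
# Resume partially downloaded files by default.
continue = on
```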
Normally, SSH is used to securely transfer files across a network. However, FTP is lighter on resources compared to scp and rsyncing over SSH. FTP is not secure, but when transferring large amounts of data inside a firewall-protected environment on CPU-bound systems, using FTP can prove beneficial.
$ wget ftp://root:email@example.com.X.Y//ifs/home/test/big/"*.tar"
...
3,562,035,200  74.4M/s   in 47s
In this case, Wget transferred a 3.3 GiB file at a rate of 74.4 MB/s.
In short, this procedure is:
- faster than SSH
- easily used by languages that can substitute string variables
- globbing capable
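To illustrate the second point, here is a minimal shell sketch that builds the FTP URL from variables before handing it to Wget. The host, credentials and path are placeholders, not real values:

```shell
#!/bin/sh
# Placeholder values; substitute your own host, credentials and path.
user="testuser"
pass="secret"
host="ftp.example.com"
dir="/ifs/home/test/big"

# Assemble the URL; in a real run, pass it straight to wget.
url="ftp://${user}:${pass}@${host}${dir}/*.tar"
printf '%s\n' "$url"
```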
Wget uses the standard proxy environment variables. See Proxy settings.
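For example, the variables can be exported in the shell before invoking Wget (the proxy address below is a placeholder):

```shell
# Placeholder proxy address; adjust host and port for your network.
export http_proxy="http://proxy.example.com:8080"
export https_proxy="$http_proxy"
# Hosts that should bypass the proxy.
export no_proxy="localhost,127.0.0.1"
```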
To use the proxy authentication feature:
$ wget --proxy-user "DOMAIN\USER" --proxy-password "PASSWORD" URL
Proxies that use HTML authentication forms are not covered.
To have pacman automatically use Wget and a proxy with authentication, place the Wget command in the [options] section of
/etc/pacman.conf:
XferCommand = /usr/bin/wget --proxy-user "domain\user" --proxy-password="password" --passive-ftp -q --show-progress -c -O %o %u
As the password is stored in plain text, restrict read access to root:

# chmod 600 /etc/pacman.conf
This section explains some of the use case scenarios for Wget.
One of the most basic and common use cases for Wget is to download a file from the internet.
$ wget <url>
When you already know the URL of a file, this can be much faster than the usual routine of downloading it in a browser and moving it to the correct directory manually. Even in its simplest form, this usage lends itself to automated downloading in scripts.
Archive a complete website
Wget can archive a complete website whilst preserving the correct link destinations by changing absolute links to relative links.
$ wget -r -np -k 'target-url-here'
In case of a dynamic website, some additional options for conversion into static HTML are available.
$ wget -r -np -p -E -k -K 'target-url-here'
Wget also provides options for bypassing download-prevention mechanisms.
$ wget -r -np -k --random-wait -e robots=off --user-agent "Mozilla/5.0" 'target-url-here'
If third-party content is to be included in the download, the
-H switch can be used alongside
-r to recurse into linked hosts.