Mirror an FTP website with wget

What do you do when you want to copy a website but don’t have SSH access and can’t use rsync? If you have an FTP login, you can use wget to copy the files from one server to the other.

The command

wget -r -l 20 ftp://username:password@www.example.org/htdocs/*

will copy all files from the htdocs folder on www.example.org into your current directory. By default wget only descends 5 subfolders deep into the site, so it would not copy a directory like /htdocs/sub1/sub2/sub3/sub4/sub5/. The -l 20 parameter tells wget to look 20 subfolders deep, which should be sufficient for most sites.
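If you would rather not guess a depth at all, wget also accepts -l inf (or -l 0) for unlimited recursion. A variant of the command above, assuming the same hypothetical host and credentials, would be

wget -r -l inf ftp://username:password@www.example.org/htdocs/*

This is closer to a true mirror, but it can take considerably longer on large sites.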

In the example above, wget will automatically create the subfolder www.example.org/htdocs on your local host. If you want to stop wget from creating these directories, you can trim the path with the -nH and --cut-dirs parameters.
-nH tells wget not to create a directory named after the hostname, here www.example.org.
--cut-dirs=x tells wget to skip the first x directory components of the remote path.
If you just want to copy all files in the htdocs folder into the current directory without creating any additional subfolders, the command above becomes

wget -r -l 20 -nH --cut-dirs=1 ftp://username:password@www.example.org/htdocs/*
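Note that --cut-dirs counts remote path components, so a deeper starting folder needs a higher value. For example, to pull everything from a hypothetical /htdocs/sub1/ folder straight into the current directory, you would strip two components:

wget -r -l 20 -nH --cut-dirs=2 ftp://username:password@www.example.org/htdocs/sub1/*

If you prefer to keep the credentials out of the URL, wget also accepts them via its --ftp-user and --ftp-password options.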
