Using wget to download or back up an entire website

This is simple to do from the Linux command line. wget will create a folder named after the site and download all the content it can reach into that folder.

wget -c -r -k -U Mozilla www.thesite.com

-c continues partial downloads and skips files that are already complete (handy if you're resuming after a run that stopped partway through)

-r downloads the site recursively, following links to fetch the whole site

-k converts the links in the downloaded pages to local links, so they still work when you browse the copy from the folder on your computer

-U Mozilla sends "Mozilla" as the User-Agent string, so servers that block wget's default User-Agent will still serve the pages
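
For a more complete offline copy, a few more options combine well with the command above. A hedged sketch, not part of the original command: the URL is a placeholder, and -p, -np and --wait are additions you may or may not want.

```shell
# -p      also fetches the images, CSS and JS each page needs to render
# -np     never ascends to the parent directory (stays within the start path)
# --wait  pauses one second between requests, which is kinder to the server
wget -c -r -k -p -np --wait=1 -U Mozilla https://www.example.com/
```

Without -p, pages that pull stylesheets or images from outside the crawled pages can look broken offline even though the HTML downloaded fine.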