From the department of things-that-I-know-are-possible-but-can-never-remember-how-to-do-so-hey-I-read-the-manpage-and-now-I'm-blogging-it, I bring you "downloading a directory of photos from a website":

    wget --recursive http://example.com/photos/some-event/ --no-directories --directory-prefix <local-folder-name> --accept JPG,RW2

I always remember wget --recursive (or wget -r for short), but on its own that produces an annoying tree of directories starting with the website's domain and working its way down to the directory you actually want. In the command above, --no-directories flattens that tree, and --directory-prefix tells wget to put the downloaded files somewhere other than the current working directory. The --accept option tells wget to keep only files with the listed extensions (it still fetches pages like index.html to find links, but deletes them afterwards), so your downloaded directory isn't cluttered with webserver-generated files you don't want.
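
If you do this often enough, the long version can live in a small shell function so you never have to remember the flags. Here's a rough sketch; the fetch_photos name, the argument order, and the hard-coded JPG,RW2 list are just placeholders to adjust to taste:

    # Rough sketch of a wrapper for the command above. fetch_photos and the
    # hard-coded JPG,RW2 accept list are placeholders, not anything wget provides.
    fetch_photos() {
        local url="$1"    # directory URL, e.g. http://example.com/photos/some-event/
        local dest="$2"   # local folder to download into
        wget --recursive --no-directories \
             --directory-prefix "$dest" \
             --accept JPG,RW2 \
             "$url"
    }

    # Example use:
    fetch_photos http://example.com/photos/some-event/ some-event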

Here's the short version, since the long version is nice to remember but not so nice to type:

    wget -r http://example.com/photos/some-event/ -nd -P <local-folder-name> -A JPG,RW2
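
One more flag that's handy with -r: --no-parent (-np), which keeps wget from following links back up out of the directory you asked for, since auto-generated index pages usually link to the parent directory. Something like:

    wget -r -np -nd -P <local-folder-name> -A JPG,RW2 http://example.com/photos/some-event/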

Try lftp on http:// URLs, and then you can use mget or even mirror. I find that even more intuitive than plain wget.
Comment by siretart [launchpad.net] Wed 14 Jul 2010 12:21:28 AM UTC
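
For the curious, an lftp version might look roughly like the sketch below; it's untested, and the some-event target folder is just an example:

    # Untested sketch of the lftp approach suggested above. It assumes the
    # server serves a directory index that lftp can parse for the globs to
    # match; mirror works similarly if you want whole subdirectories.
    lftp -e 'mget -O some-event *.JPG *.RW2; quit' http://example.com/photos/some-event/
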
I'll have to check out lftp, thanks! One advantage wget does have is that it's available on basically any *NIX box, though.
Comment by christine-spang [myopenid.com] Sun 25 Jul 2010 04:47:39 PM UTC