TL;DR: skip to the summary for the bash one-liner if so inclined :-)

## Why & wherefore

I have an old tumblr that I want to back up. I don't use it regularly, but I want a backup of the whole site, design as well as content, just in case; it's not just the pictures I love, but the whole thing. I'm not particularly worried about downloading the streaming audio--can't help you there, gentle reader.

This is the story of a man, a man page, and a page. And lots of other pages.

## Downloading just one page (and all the images, fonts, JS, CSS)

Before trying to download the whole site, I started with just one page. I want all the assets, not just the images or the underlying HTML, and I want the links converted so I can browse the files offline. I found this swell incantation in the wget man page:

```
wget -E -H -k -K -p http:///
```

It worked, but not perfectly:

* Unfortunately, the -E (--adjust-extension) option renamed my webfonts from Blah.eot to Blah.eot.html. Not cool, bro. The image names came out goofy as well, so it didn't really help at all.
* This didn't do any rate-limiting. AWS hosts all of tumblr's images, so I'm sure they'd block an IP that goes bananas with downloading. We can tell wget to pause between downloads using --wait=
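For the curious, here's a sketch of that incantation with each short flag annotated and the pause added, built as a bash array so it's easy to tweak. The URL and the 2-second wait are placeholder values, not anything blessed by the man page; substitute your own:

```shell
URL="http://example.tumblr.com/"  # hypothetical address -- use your own tumblr here

# The man-page incantation, flag by flag, plus polite rate limiting:
cmd=(wget
  -E             # --adjust-extension: append .html to saved HTML files
  -H             # --span-hosts: follow assets onto other hosts (e.g. AWS images)
  -k             # --convert-links: rewrite links so pages work offline
  -K             # --backup-converted: keep an .orig copy of each rewritten file
  -p             # --page-requisites: grab images, CSS, JS, fonts for each page
  --wait=2       # pause 2 seconds between requests (value is a guess; tune to taste)
  --random-wait  # vary the pause so the request pattern looks less robotic
  "$URL")

# Run it with:
# "${cmd[@]}"
```

Building the command as an array keeps the comments next to the flags they explain, which beats squinting at `-E -H -k -K -p` six months from now.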