Simon Griffee

Easy Way to Back Up a Website

22 April 2014 · 1 minute read

_This is an edited version of my comment posted at TOP._

I’ve recently had some trouble with an old web host and have been thinking about backups.

An easy way to back up an entire site as it is, with comments and styles intact, is to fire up a Unix utility called wget and have it crawl your site, saving a static HTML version of it in a folder on your computer. You can then easily move this folder to your own server or another web hosting company.

The command may take a while to run depending on the size of your site, but it is generally fast, and it gives you an HTML archive that can be served by any web server and browsed with any web browser, regardless of future service or software changes.

First you will need to install wget if you don’t have it already.
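
How you do this depends on your system. On a Mac with Homebrew installed, for example, something like this should work:

brew install wget

On Debian or Ubuntu, the equivalent would be:

sudo apt-get install wget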

Then you can run the following command in your terminal (on Mac OS X, the Terminal application lives in /Applications/Utilities). Replace ‘domain.dev’, which appears twice in the command, with your own site’s domain:

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=unix --domains domain.dev http://domain.dev/
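
If you are curious, here is roughly what each option does:

--recursive follows links to download the whole site.
--no-clobber skips files that were already downloaded, so an interrupted run can pick up where it left off.
--page-requisites downloads the images, stylesheets and other files needed to display each page.
--html-extension saves pages with an .html extension (newer versions of wget call this option --adjust-extension).
--convert-links rewrites links in the saved pages so they work locally.
--restrict-file-names=unix keeps the saved file names safe for Unix filesystems.
--domains domain.dev stops wget from following links to other sites.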

Once the command finishes you will have a folder on your computer containing the site’s contents.
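
A quick way to check the archive, assuming you have Python 3 installed, is to serve the folder locally and browse it:

cd domain.dev
python3 -m http.server 8000

Then open http://localhost:8000 in your web browser.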
