No, no, no, please **don't** use the inbuilt export features! Both tie up an instiki process and use a lot of resources while the export is in progress. The HTML export is the worst, as instiki has to do the conversion to HTML as well. On top of that, these exports only give you a snapshot and don't include the revision history.

Actually, at Azimuth's current size this probably isn't a huge resource hog, but it doesn't scale well and we've disabled it on the nLab.

There are two ways to keep regular backups. The first is to take copies of the database. This is certainly the easiest to recover from, but as it contains _everything_, it can't be distributed. "Everything" includes the main password, and if we ever have personal webs then all of those would be included too, even the private ones.
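To make this concrete, a database backup is just a nightly dump. The sketch below is a hypothetical example, not the actual Azimuth setup: the database name, backup directory, and the choice of `mysqldump` (instiki can also run on SQLite, in which case you'd copy the database file instead) are all assumptions.

```shell
#!/bin/sh
# Hypothetical nightly dump of an instiki database.
# DB_NAME and BACKUP_DIR are made up for illustration.
DB_NAME="instiki_production"
BACKUP_DIR="/tmp/instiki-backups"
STAMP=$(date +%Y-%m-%d)
OUTFILE="$BACKUP_DIR/$DB_NAME-$STAMP.sql.gz"

mkdir -p "$BACKUP_DIR"

# The dump contains everything: pages, revisions, the main password --
# which is exactly why this kind of backup can't be distributed.
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump --single-transaction "$DB_NAME" | gzip > "$OUTFILE"
fi
```

The `command -v` guard is just so the sketch degrades gracefully where MySQL isn't installed; in a real cron job you'd want the dump to fail loudly instead.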

The other way is via the BZR repository. Each day, the server runs a script that exports the pages to a BZR repository, and each day I pull that from the server to my work machine here in Norway. The export script records each page change as a separate commit, so _everything_ needed to reconstruct the wiki is included, barring a few stylesheet tweaks and the uploaded files (I could easily add those to a backup scheme). This system works on a web-by-web basis and doesn't include sensitive information, so it is completely safe for public distribution.
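The client side of this is a one-off branch followed by daily pulls. Here is a minimal sketch of that mirror step; the paths and the repository URL are made up, and `bzr branch` / `bzr pull` are the standard Bazaar commands for the initial copy and the incremental update respectively.

```shell
#!/bin/sh
# Sketch of the daily mirror step. MIRROR and REMOTE are
# illustrative names, not the real Azimuth locations.
MIRROR="$HOME/backups/azimuth-bzr"
REMOTE="bzr+ssh://backup@example.org/srv/instiki/azimuth.bzr"

if command -v bzr >/dev/null 2>&1; then
    if [ ! -d "$MIRROR" ]; then
        # First run: fetch the whole history. This is the big
        # download -- hence the "seed file" suggestion below.
        bzr branch "$REMOTE" "$MIRROR"
    else
        # Later runs: only the day's new page-change commits come over.
        cd "$MIRROR" && bzr pull "$REMOTE"
    fi
fi
```

Because every page change is its own commit, the pull is cheap: only the commits since yesterday travel over the wire.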

The current size of the repository is 13 MB, which isn't too big (for comparison, the nLab is nearing 200 MB), but it is a bit big to download in one go from the server, so if anyone wants to start doing this, I'd prefer to send them a "seed file" to get them started. After that, doing a daily update doesn't involve much bandwidth, and remembering to do the update is easy with automated scripts.
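"Remembering" really is just a cron entry. The line below is a hypothetical example (time of day and path are arbitrary), assuming the mirror directory already exists from the seed file:

```shell
# Hypothetical crontab entry: pull the day's commits at 04:00 each morning.
0 4 * * *  cd "$HOME/backups/azimuth-bzr" && bzr pull -q
```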

(Incidentally, this scheme is just something I dreamt up when thinking about how to solve this issue on the nLab; if anyone has any suggestions on how to improve it then please do say so.)