Question

Is there an easy way to automatically back up all the useful directories to a .zip file?

Answer

The Attach:backup_pmwiki.txt script (rename it to backup_pmwiki.php) adds a ?action=backup action which can be used to back up all the useful directories (and their subdirectories). You can configure the directory where backups are placed, set the backup format and, if you want, display a link to download the archive once it has been created.

This script is inspired by BackupPages, but saves all the directories mentioned at BackupAndRestore (not only wiki.d).

It also fixes several bugs mentioned at BackupPages.

The backup directory may need to be created by hand and given 777 permissions.
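If you prefer, the same setup can be done with a one-off PHP snippet; this is just a sketch, assuming the directory is named backup/ under the wiki root (adjust it to match your $BackupDir setting):

  <?php
  // One-off sketch (assumption: the backup directory is 'backup/' under
  // the wiki root; adjust to match your $BackupDir setting).
  $dir = 'backup';
  if (!is_dir($dir)) mkdir($dir);  // create the directory if it is missing
  chmod($dir, 0777);               // mkdir's default mode is masked by umask,
                                   // so set 777 permissions explicitly
  ?>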

The variable redefining the backup directory can be set before the script is included, as can the other parameters (see the script).

It is better to enable this script only for a dedicated page, so the include should go in the per-page customization file associated with that page, such as

 /local/Admin.BackupWiki.php

This file may contain something like:

  <?php if (!defined('PmWiki')) exit();
  //$BackupDir = '/mybackupdirectory/';  // defaults to '/backup/'
  // other parameters : see backup_pmwiki.php
  include_once('cookbook/backup_pmwiki.php');
  ?>
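Once that file is in place, requesting the dedicated page with the backup action appended triggers the backup; for the example above the URL would look something like this (hostname and path are placeholders):

  http://www.example.com/pmwiki/pmwiki.php?n=Admin.BackupWiki&action=backup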

Restrictions

The backup page can be read-protected, but ?action=backup still works on a read-protected page. A cryptic name for the page and the backup directory can help, or backup_pmwiki.php can be edited to use an action name different from backup.
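A possible extra safeguard (not part of the recipe, just a sketch assuming the script registers its ?action=backup handler only when it is included): include it from the per-page file only when the visitor passes PmWiki's admin authorization check via CondAuth(), so the action simply does not exist for anyone else.

  <?php if (!defined('PmWiki')) exit();
  // Sketch: load the backup action only for admin-authenticated visitors.
  // Assumes the recipe adds ?action=backup only at include time;
  // CondAuth() is PmWiki's standard authorization test.
  if (CondAuth($pagename, 'admin'))
    include_once('cookbook/backup_pmwiki.php');
  ?>

With that in place, visitors who have not entered the admin password never get the action; authenticated admins are unaffected.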

Anyone can download the backup archive file if they know or can guess its URL.

See also

Discussion

Feel free to edit this page if you have any remarks... (edit password = backup)

Nicolas August 15, 2006

I have had trouble backing up our 'pub' directory, which contains enough images to make the archive bigger than 50 MB. When it gets this big, the script crashes. So I have excluded the pub directory from the backup and it works fine for 20 MB archives. Is this a bug, or something to do with my server's setup?

Francis September 2, 2006, 05:19

I have successfully generated zip files larger than 130 MB, but sometimes, for this kind of big archive, it seems that the HTML output between the HTTP server and the client is too slow and may give the impression that the script has crashed. In fact, the zip is correctly generated (and can be obtained via FTP).

If anyone has an idea to prevent this "HTML output hanging", it would be helpful...

Nicolas September 3, 2006

  • Actually, with my problem this wasn't the case. I downloaded the file by FTP and it was a corrupted archive that couldn't be opened.

Francis September 4, 2006, 10:42

Try the new version (V1.1, updated with a new version of the zip package, fixing some bugs). It may solve your problem...

Nicolas October 17, 2006

Try adding

  set_time_limit($seconds);  # for dev only!

in config.php, where $seconds is however long you need. I set mine very high for dev; PmWiki can crunch for 24 hours if I want. Seth

History

  • V1.1 October 17, 2006: integrated V2.1 of the zip package from Devin Doucette. This may fix some bugs...
  • V1.0 August 15, 2006: first version.

Contributors

SteveAgl - original script author

Nicolas - fixed bugs, and added backup of all useful directories

Category: Administration
