Send UpdraftPlus backups to different S3 folders

Managing WordPress backups securely with Amazon S3 requires a little chicanery. Because the website itself holds the credentials for the remote backup storage, we need to protect that storage from being deleted or overwritten using those credentials, which means the website can’t manage backup retention policies itself.

For some years now, I’ve been storing backups for client websites on Amazon S3. Each website gets its own bucket and a set of credentials for uploading new backups. None of those credentials have permission to delete backups, and versioning is turned on to prevent loss by overwriting.
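
Switching on versioning is a one-off job per bucket. It’s easily done in the S3 console, but for illustration, here’s a minimal sketch with the AWS SDK for PHP; the bucket name and region are placeholders, and the SDK is assumed to be installed via Composer.

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-2', // placeholder region
]);

// Versioning keeps superseded copies of any object that gets overwritten,
// so upload-only credentials can't clobber an existing backup.
$s3->putBucketVersioning([
    'Bucket'                  => 'example-client-backups', // placeholder bucket name
    'VersioningConfiguration' => ['Status' => 'Enabled'],
]);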

This guards against the risk of a hacker gaining access to a website, stealing the S3 credentials, and wiping the backups so that the website can’t easily be restored after an attack. Imagine a ransomware attack hitting a website, demanding a ransom, with every backup deleted 😱 … now imagine the hacker’s face when they can’t delete the backups 🤣

Security professionals have been promoting secure backups for years, and the good folk at UpdraftPlus wrote an excellent guide on how to create a policy for Amazon S3 that I dutifully pinched as the basis of the policies I use for client website backups.
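
For flavour, here’s a rough sketch of that kind of write-only setup using the AWS SDK for PHP. It isn’t the policy from the UpdraftPlus guide, and the user, policy, and bucket names are placeholders; the important bit is that s3:DeleteObject never appears.

require 'vendor/autoload.php';

use Aws\Iam\IamClient;

$iam = new IamClient([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// The backup user can list the bucket and upload or read backup objects...
$policy = [
    'Version'   => '2012-10-17',
    'Statement' => [
        [
            'Effect'   => 'Allow',
            'Action'   => ['s3:ListBucket', 's3:GetBucketLocation'],
            'Resource' => 'arn:aws:s3:::example-client-backups',
        ],
        [
            'Effect'   => 'Allow',
            'Action'   => ['s3:PutObject', 's3:GetObject'],
            'Resource' => 'arn:aws:s3:::example-client-backups/*',
        ],
    ],
    // ...but nothing in the statement list grants s3:DeleteObject.
];

$iam->putUserPolicy([
    'UserName'       => 'example-client-backups', // placeholder IAM user
    'PolicyName'     => 'write-only-backups',
    'PolicyDocument' => json_encode($policy),
]);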

Another thing I like to set up with client backups is a two-tier retention system, where daily backups are kept for 30 days and weekly backups are kept for 365 days. Amazon S3 lifecycle policies make that sort of thing very easy to set up, when the two types of backups have different prefixes (i.e. folder names).

Up until now, I’ve been using BackWPup to run those backups. That plugin makes it very easy to send different backup jobs to different destinations on Amazon S3, which is perfect for managing them with different lifecycle policies. And then I met UpdraftPlus.

UpdraftPlus has one (1) destination available for Amazon S3 backups. But it does have a useful filter hook, which can be used to add folder prefixes to the destination path!

/**
 * Intercept UpdraftPlus backup options to modify the destination path based on the day of the week
 * @param array $options
 * @param UpdraftPlus_BackupModule $module
 * @return array
 */
add_filter('updraftplus_backupmodule_get_options', function($options, $module) {
    if ($module instanceof UpdraftPlus_BackupModule_s3) {
        $folder = 0 === (int) date('w') ? 'weekly' : 'daily';
        $options['path'] = rtrim($options['path'], '/') . "/$folder";
    }
    return $options;
}, 10, 2);

The updraftplus_backupmodule_get_options filter hook lets us modify the destination path of the backup. In the code above, if it’s a Sunday the backup is put into a “weekly” folder; otherwise it goes into a “daily” folder. We can then set up separate lifecycle rules in Amazon S3 to keep Sunday’s backups for a year, but ditch the others after only a month.
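
Those lifecycle rules can be clicked together in the S3 console, but here’s a rough sketch of the same thing with the AWS SDK for PHP. The bucket name, rule IDs, and region are placeholders, and the daily/ and weekly/ prefixes assume the backups land at the top of the bucket; adjust them to match wherever your UpdraftPlus path actually points. Because the buckets are versioned, each rule also cleans up superseded (noncurrent) versions after a week, which you can tune to taste.

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-2', // placeholder region
]);

$s3->putBucketLifecycleConfiguration([
    'Bucket'                 => 'example-client-backups', // placeholder bucket name
    'LifecycleConfiguration' => [
        'Rules' => [
            [
                // Daily backups are ditched after a month.
                'ID'         => 'expire-daily-backups',
                'Filter'     => ['Prefix' => 'daily/'],
                'Status'     => 'Enabled',
                'Expiration' => ['Days' => 30],
                // On a versioned bucket, expiration only adds a delete marker,
                // so clean up the superseded versions too.
                'NoncurrentVersionExpiration' => ['NoncurrentDays' => 7],
            ],
            [
                // Sunday's weekly backups hang around for a year.
                'ID'         => 'expire-weekly-backups',
                'Filter'     => ['Prefix' => 'weekly/'],
                'Status'     => 'Enabled',
                'Expiration' => ['Days' => 365],
                'NoncurrentVersionExpiration' => ['NoncurrentDays' => 7],
            ],
        ],
    ],
]);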

It took a little more chicanery than usual, but the job is still done.