[mythtv-users] Using the mythconverg_backup.pl script (slightly OT)

Nick Rout nick.rout at gmail.com
Thu Dec 3 03:43:02 UTC 2009


On Thu, Dec 3, 2009 at 4:41 PM, Nick Rout <nick.rout at gmail.com> wrote:
> On Thu, Dec 3, 2009 at 4:35 PM, Harry Devine <lifter89 at comcast.net> wrote:
>> I've been using the mythconverg_backup.pl script in a cron job for a few
>> weeks now.  I have it on a 7 day rotation.  I have another PC in the house
>> that is running plain-jane Ubuntu 9.10 and I want to archive my database
>> backups to that PC.  I have the authorized_keys mechanism working properly
>> between the 2 PCs, but I'm a little unsure of how to get only the latest
>> file over there.
>>
>> What I'd really like to have is the same 7 files on both machines.  But how
>> do I do this?  Do I issue some sort of remote shell command to delete the
>> files on the backup PC then copy over the Myth backup files?  Perhaps create
>> a cron job on the backup PC to cleanup the directory at a time BEFORE the
>> new backups come over?
>>
>> I'm sure I'm over thinking this a little, but I'm open to suggestions.
>
> rsync --delete -az /home/of/backups/ othermachine:/place/for/them/there/
>
> --delete will delete from the destination directory anything that is
> not in the source directory, so you don't get a big build-up in there.
> (Use the trailing slash on the source rather than a wildcard; with a
> wildcard, rsync never looks at the directory itself and --delete won't
> remove stale files on the other end.)
>
> My suggestion assumes that there are no extraneous files in /home/of/backups/
>

You could probably include the rsync at the end of the same script that
cron is already running to create the backups in the first place.

