rsync and rsyncman job management
This module schedules rsync and rsyncman jobs to synchronize data
- Manages rsync package
- Installs rsyncman via rsync::manager
- Creates cron jobs for rsync and rsyncman
Configure an rsync job every day at 0:00 to copy data from /origin to /destination on the same server
rsync::scheduledrsync { 'demo':
origin => '/origin',
destination => '/destination',
hour => '0',
minute => '0',
}
Configure an rsync job every day at 0:00 to copy data from /demo to [email protected]:/demo2, sending a report to [email protected]
rsync::manager::schedule { 'demo':
mail_to => '[email protected]',
host_id => 'demopuppet',
hour => '0',
minute => '0',
}
rsync::manager::job { 'demo':
path => '/demo',
remote => '[email protected]',
remote_path => '/demo2',
}
Usage: rsyncman.py [-c <config file>] [-b]
-h,--help: print this message
-c,--config: config file
-b,--syncback: sync from destination to origin
-d,--dryrun: dry run - just simulate execution
-S,--canarystring: canary string
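The usage text above can be read as a small option interface. The following sketch shows how those options could be parsed with Python's standard getopt module; the option names and the ./rsyncman.config default come from this README, but the parsing code itself is an assumption, not rsyncman's actual implementation.

```python
# Illustrative sketch of rsyncman's CLI option parsing (assumed, not
# the real implementation); option names are taken from the usage text.
import getopt

def parse_args(argv):
    """Parse rsyncman-style options into a settings dict."""
    opts, _ = getopt.getopt(
        argv, "hc:bdS:",
        ["help", "config=", "syncback", "dryrun", "canarystring="])
    settings = {"config": "./rsyncman.config",  # documented default
                "help": False, "syncback": False,
                "dryrun": False, "canary": None}
    for opt, arg in opts:
        if opt in ("-h", "--help"):
            settings["help"] = True
        elif opt in ("-c", "--config"):
            settings["config"] = arg
        elif opt in ("-b", "--syncback"):
            settings["syncback"] = True
        elif opt in ("-d", "--dryrun"):
            settings["dryrun"] = True
        elif opt in ("-S", "--canarystring"):
            settings["canary"] = arg
    return settings
```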
- manage_package: (default: true)
- package_ensure: (default: installed)
Plain rsync configuration
- origin: (required)
- destination: (required)
- ensure: (default: 'present')
- cronjobname: (default: undef)
- user: (default: 'root')
- ionice: (default: true)
- ionice_class: (default: '2')
- ionice_level: (default: '2')
- delete: (default: true)
- hour: (default: '*')
- minute: (default: '*')
- month: (default: '*')
- monthday: (default: '*')
- weekday: (default: '*')
- archive: (default: true)
- hardlinks: (default: true)
- one_file_system: (default: true)
- chmod: (default: undef)
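With the defaults above, the scheduled cron entry boils down to an ionice-wrapped rsync invocation. The sketch below maps the parameters onto standard rsync/ionice flags; the flag mapping is an assumption based on common rsync usage, not taken from the module's template.

```python
# Sketch: map the module's parameters onto an rsync command line.
# The flag mapping (-a, -H, -x, --delete, ionice prefix) is assumed
# from standard rsync/ionice options, not from the module source.
def build_command(origin, destination, archive=True, hardlinks=True,
                  one_file_system=True, delete=True,
                  ionice=True, ionice_class="2", ionice_level="2"):
    cmd = []
    if ionice:
        # run rsync under the given I/O scheduling class and priority
        cmd += ["ionice", "-c", ionice_class, "-n", ionice_level]
    cmd.append("rsync")
    if archive:
        cmd.append("-a")   # --archive: recursive, preserve metadata
    if hardlinks:
        cmd.append("-H")   # --hard-links
    if one_file_system:
        cmd.append("-x")   # --one-file-system
    if delete:
        cmd.append("--delete")  # remove files absent from origin
    cmd += [origin, destination]
    return cmd
```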
rsyncman is a Python script intended to simplify failover and failback operations.
The default configuration file for rsyncman is ./rsyncman.config, but a different file can be specified with the -c option. Puppet already manages this file, so there is no need to worry about it unless rsyncman is run manually.
The configuration file must contain a global config section ([rsyncman]) plus as many path sections as needed
Global config section
- logdir
- to
- host-id
- pre-script
- post-script
Job-specific options (more than one job can be configured). The section name is the local path
- ionice
- rsync-path
- exclude
- delete
- remote
- remote-path
- check-file
- canary-file
- expected-fs
- expected-remote-fs
- default-reverse
- compress
[rsyncman]
[email protected]
host-id=DEMOHOST1234
logdir=/var/log/rsyncman.log
[/test_rsync]
ionice="-c2 -n2"
rsync-path="sudo rsync"
exclude = [ "exclude1","exclude2" ]
delete = false
remote="[email protected]"
remote-path="/test_rsync"
check-file=is.mounted
expected-fs=nfs
expected-remote-fs=nfs
- ensure: (default: 'present')
- schedule_name: (default: $name)
- user: (default: 'root')
- hour: (default: '*')
- minute: (default: '*')
- month: (default: '*')
- monthday: (default: '*')
- weekday: (default: '*')
- mail_to: (default: undef)
- host_id: (default: undef)
- logdir: (default: '/var/log/rsyncman')
Each job MUST belong to an rsync::manager::schedule, specified using the schedule_name option
- path: (required)
- remote: (required)
- remote_path: (default: undef)
- schedule_name: (default: $name)
- ionice_args: (default: undef)
- rsync_path: (default: undef)
- exclude: (default: [])
- delete: (default: false)
- check_file: (default: undef)
- expected_fs: (default: undef)
- expected_remote_fs: (default: undef)
- order: (default: '42')
- default_reverse: (default: false)
- compress: (default: false)
Example:
class { 'rsync::manager':
}
rsync::manager::schedule { 'demo':
mail_to => '[email protected]',
host_id => 'demopuppet',
}
rsync::manager::job { 'demo':
path => '/demo',
remote => '[email protected]',
exclude => [ 'a', 'b', 'c' ],
remote_path => '/demo2',
}
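Assuming the Puppet parameters map one-to-one onto the config keys listed earlier (which is an inference from this README, not verified against the module's templates), the resources above would render a rsyncman.config roughly like:

```
[rsyncman]
[email protected]
host-id=demopuppet
logdir=/var/log/rsyncman

[/demo]
remote="[email protected]"
remote-path="/demo2"
exclude = [ "a","b","c" ]
```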
Tested on CentOS 6/7 and Ubuntu 14.04, but it should work anywhere. rsyncman is a Python script, so Python must be installed on the system.
We are working to put acceptance testing in place, so any new feature should include tests that check both the presence and the absence of that feature.
- Fork it
- Create your feature branch (git checkout -b my-new-feature)
- Commit your changes (git commit -am 'Added some feature')
- Push to the branch (git push origin my-new-feature)
- Create a new Pull Request