Here's a concrete example combining ssh keys with forced commands and
Rsnapshot's preexec directive to dump my Subversion repository and
PostgreSQL database before backup.
## from rsnapshot config file for serverX. Note that the first
## whitespace below is a tab, the rest are spaces.
cmd_preexec	/usr/bin/ssh -i /root/.ssh/serverX-backup_preexec-key ***@serverX /root/rsyncdir/backup_preexec
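For context, here is a sketch of how that directive might sit in the rest of
rsnapshot.conf. The snapshot root, retain intervals, and backup source below
are hypothetical placeholders, not my actual config; fields are tab-separated.

```
# /etc/rsnapshot.conf (sketch; values are hypothetical)
snapshot_root	/backup/snapshots/
retain	daily	7
retain	weekly	4
cmd_preexec	/usr/bin/ssh -i /root/.ssh/serverX-backup_preexec-key ***@serverX /root/rsyncdir/backup_preexec
backup	***@serverX:/home/	serverX/
```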
## from serverX:/root/.ssh/authorized_keys
command="/usr/local/bin/backup_preexec" ssh-rsa [redacted public key]
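The forced command can be tightened further with OpenSSH's standard
authorized_keys restriction options, e.g. (the key itself stays redacted;
this entry is all one line in the real file):

```
command="/usr/local/bin/backup_preexec",no-pty,no-port-forwarding,no-agent-forwarding,no-X11-forwarding ssh-rsa [redacted public key]
```

With these options the key can do nothing but run the wrapper: no shell,
no tunnels, no agent or X11 forwarding.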
## serverX:/usr/local/bin/backup_preexec
## This is designed so that the output merges nicely into Rsnapshot's log,
## and failures are kept to a minimum so that backups proceed if feasible.
#!/bin/sh
RVAL=0

echo -n " serverX svn_backup "
# Run svnadmin directly -- "exec" here would replace the shell and the
# rest of this script would never run.
/usr/bin/svnadmin hotcopy --incremental /redacted/path /redacted/path.hotcopy
EVAL=$?
if [ $EVAL -ne 0 ]; then
    echo "failed with $EVAL"
    RVAL=1
else
    echo "completed successfully"
fi

echo -n " serverX pg_backup "
## The following is an overly elaborate bash script that manages dump
## rotation; the core command is
##   su - postgres -c "/usr/bin/pg_dumpall >$BDIR/$BASE.0"
/usr/local/bin/pg_backup
EVAL=$?
#echo " rval = $EVAL"
if [ $EVAL -ne 0 ]; then
    echo "failed with $EVAL"
    RVAL=$((RVAL+2))
else
    echo "completed successfully"
fi

# "return" is only valid inside a function; exit reports the combined
# status back through the ssh forced command to rsnapshot.
exit $RVAL
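The dump rotation mentioned in the comments could be sketched roughly like
this. This is a hypothetical reconstruction: `rotate_dumps`, its arguments,
and the retention count are my assumptions, not the actual pg_backup script.

```shell
#!/bin/sh
# Hypothetical sketch of the dump rotation described above.
# rotate_dumps DIR BASE KEEP shifts BASE.0 -> BASE.1, BASE.1 -> BASE.2,
# and so on, up to KEEP slots, discarding nothing until slots run out.
rotate_dumps() {
    dir=$1; base=$2; keep=$3
    i=$keep
    while [ "$i" -gt 0 ]; do
        prev=$((i - 1))
        if [ -f "$dir/$base.$prev" ]; then
            mv "$dir/$base.$prev" "$dir/$base.$i"
        fi
        i=$prev
    done
}

# In the real script a fresh dump would then land in slot 0, e.g.:
#   rotate_dumps "$BDIR" "$BASE" 7
#   su - postgres -c "/usr/bin/pg_dumpall >$BDIR/$BASE.0"
```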
On Sat, May 16, 2015 at 11:32 AM, Tapani Tarvainen wrote:
> John Lewis wrote:
>> I considered using a rsync daemon, but I won't be able to dump
>> databases remotely while doing a backup.
> Not directly, true. You could dump the database to a file and
> transfer that, though (using cron or rsyncd's pre-xfer exec),
> but you'd need extra disk space on the client for the dump.
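That rsyncd route could look something like this (a sketch; the module
name, path, and dump script are hypothetical placeholders):

```
# /etc/rsyncd.conf (sketch)
[pgbackup]
    path = /var/backups/pgsql
    read only = yes
    # run the dump just before the transfer starts
    pre-xfer exec = /usr/local/bin/pg_dump_to_file
```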
> John Lewis wrote:
>> Right now I run a dump via a script that dumps a database to standard
>> output, which I execute over a remote ssh session. Right now the easiest
>> way to lock things down a bit more is to use ssh keys with restricted
>> commands, but I would have to write a wrapper script around the database
>> dump commands and install it on the clients, because making bash the
>> only executable command actually presents a lot of options. Going
>> without a wrapper would give me a high likelihood of having to
>> regenerate key pairs every time I need to dump a new kind of database.
> Why? I must be missing something here.
>
> I presume there is a reason why you can't use the simple approach of
> opening the database to the backup machine and using rsnapshot's
> backup_script feature to run mysqldump or pg_dump or whatever
> directly?
> But doing it over ssh isn't much harder.
>
> It would seem to me all you'd need is to add a new dump user for
> each database and allow just the appropriate dump command.
> No need for extra keys that I can see.
>
> Alternatively, a bit more complicated wrapper could handle many
> different kinds of databases with a single user account.
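A single-account wrapper like Tapani describes would typically key off
OpenSSH's SSH_ORIGINAL_COMMAND. A minimal sketch follows; the keyword names
are assumptions, and a real wrapper would `exec` the mapped command rather
than print it.

```shell
#!/bin/sh
# Map a client-requested keyword to a whitelisted dump command; anything
# else is refused. A real wrapper would `exec` the result instead of
# echoing it.
allowed_dump() {
    case "$1" in
        pg_dump)    echo 'su - postgres -c "/usr/bin/pg_dumpall"' ;;
        svn_backup) echo '/usr/bin/svnadmin hotcopy /redacted/path /redacted/path.hotcopy' ;;
        *)          return 1 ;;
    esac
}

# Installed as command="/usr/local/bin/dump_wrapper" in authorized_keys,
# then dispatched with: allowed_dump "$SSH_ORIGINAL_COMMAND"
```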
> John Lewis wrote:
>> If rsnapshot were like ansible and could use sudo, I could store the
>> password in a shell variable, prevent access without an ssh key, and
>> limit the commands executed with sudo without needing to develop,
>> package, and install wrapper scripts.
> You'd still need to manage custom sudo rules. It isn't obvious to me
> why they'd be easier than wrapper scripts. And having passwords in
> shell variables is generally less safe than public key authentication.
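For comparison, those custom sudo rules would look along these lines (a
sketch; the "backup" user name and the exact command paths are hypothetical):

```
# /etc/sudoers.d/backup (sketch)
backup ALL = (postgres) NOPASSWD: /usr/bin/pg_dumpall
backup ALL = (root)     NOPASSWD: /usr/bin/svnadmin hotcopy *
```

Every new database type still means a new rule here, so the maintenance
burden is comparable to maintaining a wrapper script.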
>> After thinking about it more, generating a new key with more commands
> I'm confused. How are commands tied to keys?
> There must be something in your setup I don't understand.
>> would by far be the easiest and most transparent thing to do, but it may
>> make configuring the backup harder because I may not be able to store
>> the backup script in a file different from the rsnapshot job.
> Why not? And anyway, a single rsnapshot job can use multiple
> login accounts and ssh keys, too, if needed.
>
> --
> Tapani Tarvainen
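For reference, the backup_script approach Tapani mentions would look roughly
like this in rsnapshot.conf (a sketch; the dump script name and destination
directories are hypothetical, and fields are tab-separated):

```
# run the remote dump into the snapshot, then rsync the filesystem
backup_script	/usr/local/bin/dump_pg_over_ssh	serverX-pgsql/
backup	***@serverX:/home/	serverX/
```

rsnapshot runs the script in a temporary directory and moves its output
into the named subdirectory of the snapshot, so the dump and the rsync'd
files end up side by side in the same rotation.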
_______________________________________________
rsnapshot-discuss mailing list
https://lists.sourceforge.net/lists/listinfo/rsnapshot-discuss
--
Paul Mackinney
Systems & Quality Manager
O.N. Diagnostics, LLC
2150 Shattuck Ave. Suite 610, Berkeley, CA 94704
510-204-0688 (phone) | 510-356-4349 (fax)