Wim Decorte Posted December 19, 2005 I've updated the Backup white paper. The article describes how you can set up backups with FileMaker Server, using just 3 backup schedules to produce an unlimited number of date/time-stamped zip backup sets and move them across the network. Both Windows and OS X are covered.
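As a rough illustration of the approach (a hedged sketch only, not the white paper's actual scripts -- the paths, file names, and the use of zip here are all hypothetical), the OS-level half of such a setup might stamp and archive one FileMaker Server backup set like this:

#!/bin/sh
# Hypothetical sketch: archive one FileMaker Server backup set with a date/time stamp.
# SRC is where the FMS backup schedule writes its files; DEST is a network share.
SRC='/Library/FileMaker Server/Data/Backups/Daily'
DEST='/Volumes/BackupShare/FMS'
STAMP=`date '+%Y-%m-%d_%H%M'`

mkdir -p "$DEST"
cd "$SRC" || exit 1
zip -r -q "$DEST/Backup_$STAMP.zip" . && echo "Archived backup set as Backup_$STAMP.zip"

The white paper covers the schedules themselves and the Windows equivalent; the point of the sketch is simply that the date/time stamp in the archive name is what turns a fixed number of schedules into an unlimited number of retained backup sets.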
Lee Smith Posted December 19, 2005 Hi Wim, LOL How about a clue where the White Paper is?? Lee
Wim Decorte (Author) Posted December 20, 2005 My website would be a good start, wouldn't it? http://www.connectingdata.com/downloads.htm
xochi Posted December 27, 2005 Thanks for the hints! Your solution is for Windows only. Here is a quick & dirty shell script I wrote for FM Server 7 under Mac OS X. Note that it doesn't have any error checking to speak of, and thus will probably not do the right thing under certain errors (such as disk full). It uses the command-line version of StuffIt, but you could probably adapt it to use other free shell command tools (such as tar or ditto). It also puts the password in the script, which is probably not the best idea. Use it as a starting point to customize for your particular system:

#!/bin/sh
######################################################################
# Shell script called by FileMaker Server 7 to do daily archives of
# backups.
#
# Method:
# After FMS7 makes the 6am backup, this script is called.
# We use the StuffIt command-line tool to compress and encrypt the
# backup files with today's date, and then move the archive to
# a local folder and 2 other hard drives, for safe keeping.
######################################################################

ROOT='/Users/cfa/Data/FM7/Backups/Daily/'
FILES='6am/DATA.fp7 6am/OtherData.fp7'
DATE=`date '+%C%y-%m-%d'`
ARCHIVE='Backup'$DATE'.sitx'
DEST1='/Users/abc/BackupAlpha/Archived/'
DEST2='/Volumes/Beta/BackupBeta/Archived/'
DEST3='/Volumes/Gamma/BackupGamma/Archived/'
PASSWORD=MyPassword34234

echo "######################################################################"
echo "# FMDailyArchive.sh -- Archiving Databases: START"
echo "######################################################################"

# Stuff/encrypt the relevant files
cd "$ROOT"
/usr/local/bin/stuff --quiet $FILES --format=sitx --password=$PASSWORD --name=$ARCHIVE

# make sure destination folders exist, create them if needed
mkdir -p "$DEST1"
mkdir -p "$DEST2"
mkdir -p "$DEST3"

# Copy / move the archive to the destination locations
cp "$ARCHIVE" "$DEST3"
cp "$ARCHIVE" "$DEST2"
mv "$ARCHIVE" "$DEST1"

# log the file locations, names and sizes
ls -lag "$DEST1$ARCHIVE"
ls -lag "$DEST2$ARCHIVE"
ls -lag "$DEST3$ARCHIVE"

echo "######################################################################"
echo "# FMDailyArchive.sh -- Archiving Databases: DONE"
echo "######################################################################"
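If you don't have the StuffIt command-line tool, one possible adaptation (a hedged sketch only -- it drops the encryption and reuses the variables from the script above) is to swap the stuff line for tar, which ships with OS X:

# Alternative to the StuffIt step above, using tar instead (no encryption).
# ARCHIVE would then need a matching name, e.g. 'Backup'$DATE'.tar.gz'
cd "$ROOT"
tar -czf "$ARCHIVE" $FILES

ditto (which can produce a zip archive) would slot in the same way; either choice leaves the rest of the copy/move logic unchanged.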
cbum Posted March 9, 2006 (edited) Are there any solutions that offer record-level backups? When you have large DBs, it becomes ridiculous to back up the whole file because of a few changes, and all OS-level solutions just see that the file has changed...
Wim Decorte (Author) Posted March 10, 2006 Not so ridiculous: a full-file backup makes for quick recovery (no need to merge individual record-level backups), and the way FMS takes backups there is virtually no effect on the connected guests. About "all OS-level solutions": I hope you don't mean that you're using some app to take backups of the hosted files while they are hosted? That's a recipe for disaster.
cbum Posted March 11, 2006 No, I use FM Server for duplicate backups, and only back up from those using Retrospect. But I have >10 GB files, and it's a big hassle for the relatively few changes that occur...
Wim Decorte (Author) Posted March 12, 2006 You could roll your own "record backup" if you want: a closing script that exports the records that were changed in that session (by using the modification date).
xochi Posted April 5, 2006 Check out the Unix tool called "rsync" (it ships with Mac OS X; perhaps a port exists for Windows?). I believe it will properly back up a file that has changed by only sending the changed bytes. However, you'd have to make sure you used this on a backup file, not on the live database file. I've never actually done this, so if it causes your machine to explode, don't blame me.

man rsync

NAME
    rsync - faster, flexible replacement for rcp

SYNOPSIS
    rsync [OPTION]... SRC [SRC]... [USER@]HOST:DEST
    rsync [OPTION]... [USER@]HOST:SRC DEST
    rsync [OPTION]... SRC [SRC]... DEST
    rsync [OPTION]... [USER@]HOST::SRC [DEST]
    rsync [OPTION]... SRC [SRC]... [USER@]HOST::DEST
    rsync [OPTION]... rsync://[USER@]HOST[:PORT]/SRC [DEST]
    rsync [OPTION]... SRC [SRC]... rsync://[USER@]HOST[:PORT]/DEST

DESCRIPTION
    rsync is a program that behaves in much the same way that rcp does, but
    has many more options and uses the rsync remote-update protocol to
    greatly speed up file transfers when the destination file is being
    updated.

    The rsync remote-update protocol allows rsync to transfer just the
    differences between two sets of files across the network connection,
    using an efficient checksum-search algorithm described in the technical
    report that accompanies this package.
    ...
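To make that concrete, a minimal sketch (the paths and host name are hypothetical; point it at a backup folder, never at the hosted files) could look like this:

# Mirror the latest FMS backup folder to another machine, sending only changed data.
# -a preserves permissions and timestamps, -z compresses in transit,
# --delete removes files at the destination that no longer exist in the source.
rsync -az --delete "/Library/FileMaker Server/Data/Backups/Daily/" \
    backupuser@archivehost:/Volumes/Archive/FMS/Daily/

Note that on a large, heavily changed database file the checksum search still has to read the whole file on both ends, so the saving is mostly in network traffic rather than disk I/O.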
xochi Posted May 8, 2006 Quoting my earlier post: "It uses the command-line version of StuffIt, but you could probably adapt this to use other free shell command tools (such as tar or ditto). It also puts the password in the script, which is probably not the best idea." NOTE: I just discovered that the Mac OS X 10.4.6 upgrade breaks the StuffIt command-line tool (version 8). Upgrading to version 10 fixes it. I didn't realize this until I discovered that my daily backups for the past week were missing. Yikes!
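That kind of silent failure is easy to guard against. A hedged addition to the script from earlier in the thread (reusing its $ROOT and $ARCHIVE variables) could abort and log loudly whenever no archive gets created:

# After the stuff command: bail out if the archive is missing or empty,
# so a broken StuffIt install can't silently produce a week of no backups.
if [ ! -s "$ROOT$ARCHIVE" ]; then
    echo "ERROR: $ARCHIVE was not created -- check the StuffIt command-line tool" >&2
    exit 1
fi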
FileMaker Magazine Posted September 19, 2013 I'm just sticking this here in case I forget where I put it on my hard drive. Oh, and because other FileMaker devs using Server for development on OS X may like using it too. If you don't know VIM then the part about using sudo visudo may be frustrating. This is a local network backup script which can be turned into a remote backup script if you know rsync. You can see that I'm using a mounted share on /Volumes/Backups - which in my case just moves the backup dbs from the local RAID 1 dev server drive to another RAID 1 drive (can't be too paranoid, can you?). It simply takes the last listed backup from the folder at /Library/FileMaker Server/Data/Backups/ and copies it to another location using rsync.

#!/bin/bash

# Specify locations
FROM='/Library/FileMaker Server/Data/Backups/'
TO='/Volumes/Backups/'
# Grab the last listed backup folder name (path quoted because it contains a space)
FOLDER=`/bin/ls -lha "$FROM" | tail -n 1 | cut -d ' ' -f 13`

# The following lines are for testing.
# You can test this script with the following command line:
# sudo -u fmserver /path/to/THIS_SCRIPT_NAME
#/bin/echo "Copying... $FROM$FOLDER" "$TO$FOLDER"
#/bin/cp -Rp "$FROM$FOLDER" "$TO$FOLDER"

# Here's the real command which does the work - see note below!!!
/usr/bin/sudo /usr/bin/rsync -rptD "$FROM$FOLDER/" "$TO$FOLDER"

# In order for the fmserver user to be able to use rsync
# you need to allow it to use it as sudo with no password.
# Use this command to edit:
# > sudo visudo
# Enter the following line (without the # character)
# at the end of the file - which is at /etc/sudoers:
# fmserver ALL=NOPASSWD: /usr/bin/rsync

# Also note that in order to copy the files, the target
# TO destination must be world writeable. On a Mac, I
# simply use Guest access to a shared folder with
# permissions set to Everyone with write only (Drop Box).

# http://help.filemaker.com/app/answers/detail/a_id/7275/~/filemaker-server-event-log-messages
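Since the post mentions this "can be turned into a remote backup script if you know rsync," here is one hedged way the working line could be pointed at another machine (the user and host names are hypothetical, and SSH key access for that account is assumed):

# Remote variant of the rsync line above: same flags, but the destination
# is an SSH path on another machine instead of a mounted volume.
/usr/bin/sudo /usr/bin/rsync -rptD -e ssh "$FROM$FOLDER/" "backupuser@archivehost:/Volumes/Backups/$FOLDER"

Everything else in the script (the fmserver sudoers entry, the FOLDER detection) stays the same.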
naio Posted September 29, 2016 On 19/9/2013 at 11:13 PM, FileMaker Magazine said: It simply takes the last listed backup from the folder at /Library/FileMaker Server/Data/Backups/ and copies it to another location using rsync. Thanks for this valuable script. Shouldn't the rsync command include the -H option? That would preserve hard links, which is important if you are backing up a large number of files, for example the files in a managed container field folder.
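For reference, adding it to the line above would look like this (same hedged paths as in the script). One caveat: -H only reproduces hard links between files that are inside the set being transferred, so links pointing at files outside the single backup folder being copied still come across as full copies.

# Same rsync call with -H added to preserve hard links
# between files inside the transferred backup folder.
/usr/bin/sudo /usr/bin/rsync -rptDH "$FROM$FOLDER/" "$TO$FOLDER"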