
This topic is 5259 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted (edited)

Hello,

I am just upgrading our server from version 8 to 11. I did not know until now that each backup now goes into its own folder. I understand the upside: it allows multiple backups and prevents them from being overwritten.

Having a fixed location was a plus in our situation: we need a duplicate of the database at a remote location - the data is replaced every night, using "day-old" data is not an issue, and performance is better than accessing the live files over the network. So the script used by the remote machine was "pulling" a backup of the database from the same location every night. Now, with a new name every night, it no longer works...

We can't do a file copy of the live data, and we can't "save a copy as" while the files are hosted...

My question: is there a simple way to duplicate/back up a database to a specific location? Thanks for your thoughts.

Edited by Guest
Posted

So the script used by the remote machine was "pulling" a backup of the database from the same location every night. Now, with a new name every night, it no longer works...

This is essentially backwards of what should have been done. And it poses risks to the server and the files.

Push the files out to the remote location instead. See the backups information from Wim Decorte's website for several approaches to doing this.

Steven

Posted

Whether push or pull, I was looking for a way to avoid a folder name that changes with every backup... Or a method to obtain a copy/duplicate of the database in a fixed location.

Posted

What you want to do:

Inside the standard folder for FileMaker Server backups, create a folder and name it any way you like - "MyCopyForRemoteSite", say. Then create a schedule in the FMS Admin Console to save your database file(s) into that folder once a day, keeping only one version (meaning tomorrow's run overwrites today's contents).

Now you have a fixed folder for your system script to look in; that script first prepares the data, then pushes (copies) it to the remote site.
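A minimal sketch of such a push script - all paths, file names, and the remote host here are made-up placeholders, not the actual FMS paths, so adapt them to your installation:

```shell
#!/bin/sh
# Sketch of a nightly push script. SRC stands in for the fixed backup
# folder written by the FMS schedule; the real path will differ.
SRC="/tmp/fms_demo/MyCopyForRemoteSite"   # fixed backup folder (placeholder)
WORK="/tmp/fms_demo/work"                 # local staging area (placeholder)

# Simulate the backup folder for this demo only
mkdir -p "$SRC" "$WORK"
touch "$SRC/Contacts.fp7" "$SRC/Invoices.fp7"

# 1. Copy the databases out of the backup folder into the staging area
rm -f "$WORK"/*.fp7
cp "$SRC"/*.fp7 "$WORK"/

# 2. Push the staged copy to the remote site (placeholder host and path)
# scp "$WORK"/*.fp7 backupuser@remote.example.com:/imports/
```

Staging the files in a work directory first means the backup folder itself is never being read at the moment FMS rewrites it.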

In my scripts, I first copy the data to my work directory, then use zip with the options -rj (-r: traverse directories recursively; -j: "junk" the paths, so no directories are included) and *.fp7 as the file mask. Voilà - the resulting archive.zip contains only the files, with no timestamped folders.

Make sure to rename archive.zip, or do whatever you find appropriate, to distinguish today's copy from yesterday's.

Regards

Volker

