
This topic is 3960 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

In the FMS 13 Getting Started Guide it states that a folder on a network drive can be designated as the default backup destination, or as the destination for progressive backups (a different folder), as long as the drive is always mounted. Since my server machine, a Mac mini, never sleeps and the network drive is always mounted, I thought there was no reason not to use the drive. However, whenever I try to designate a folder on this drive as a destination, the Admin Console will not save the configuration. The reason I want to do this is that the server machine has a relatively small drive (500 GB) and the network drive is 4 TB. I don't want to limit my backups to ten or twenty instances.

Any opinions/suggestions?

Posted

Assuming you have the syntax correct, it is probably a permissions issue.  FMS runs as user fmserver from group fmsadmin.  Make sure the destination folder has those as the owner.
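To check or fix that ownership, something like the following can be run in Terminal on the server. This is a sketch only: the mount point and folder name are placeholders, and note that many network filesystems (SMB/AFP) don't honor local ownership the way a local volume does, which can itself be the reason the Admin Console rejects the path.

```shell
# Assumption: the NAS share is mounted at /Volumes/NAS and the target
# backup folder is named FMS-Backups; substitute your own paths.
# Give the folder to the fmserver user and fmsadmin group, which is
# what FileMaker Server runs as.
sudo chown -R fmserver:fmsadmin "/Volumes/NAS/FMS-Backups"
sudo chmod -R 770 "/Volumes/NAS/FMS-Backups"
```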


As a best practice though it's not always a good idea to back up directly to a NAS:

 

- ethernet is much slower than internal i/o; so backups will take longer than necessary

- since it is an external device the number of "moving parts" increases and with it the risk of something going wrong

 

It may be a better idea to back up locally first and then use an OS script to copy the backup to the NAS.
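That local-first approach can be sketched as a small shell function, scheduled (via cron or launchd) to run well after the FMS backup schedule has finished so nothing touches the backup folder mid-backup. The paths and folder naming are assumptions; adjust to your own server.

```shell
#!/bin/sh
# Sketch: mirror the local FMS backup folder to the NAS after the
# scheduled backup has completed. Paths are placeholders.
mirror_backups() {
  src="$1"    # e.g. /Library/FileMaker Server/Data/Backups
  dest="$2"   # e.g. /Volumes/NAS/FMS-Backups
  [ -d "$src" ] || return 1   # nothing to do if the source is missing
  mkdir -p "$dest"
  cp -Rp "$src/." "$dest/"    # copy all backup sets, preserving attributes
}
```

Run it from a scheduled job timed safely after the FMS backup schedule, e.g. `mirror_backups "/Library/FileMaker Server/Data/Backups" "/Volumes/NAS/FMS-Backups"`.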

Posted

A very simple folder action script created from Automator, attached to the target backup folder, will do the job of copying its contents to another location. Much simpler than creating a shell script (for me), and the copy doesn't appear to start until the backup is complete.

Posted

A very simple folder action script created from Automator

 

 

Maybe not. Using a shell script means you have full control over every aspect.  A folder action may fire at the first sign that a file has been added to the folder, so it may actually try to grab files while FMS is still backing them up.  FMI have very clearly stated that NOTHING should touch the backup folders until a backup is finished.  There is a very specific step at the tail end of the backup where FMS will pause the hosted files and synchronize everything that happened in the live files since the start of the backup process back into the backed-up files.

 

You DO NOT want to have some OS-driven feature interpreting what needs to be done and when.

 

You may be lucky that all your backups are fine, but that is not a given.

For the same reason you do not want to backup directly to a Dropbox folder or anything else that will try to do its own thing outside of your control.

 

If you do insist on using that kind of external functionality then do check your backups very very often.

Posted

Wim,

Can you have Time Machine back up the local backup folder? It seems that would be dangerous as well.

Barbara

Posted

So, what is the best practice to backup the backups?

 

Use the FMS CLI to complete a backup and then push it out to a location where it's safe for the external backup software to back up from, so that it never has to touch the live or backup folders.

Posted

Assuming you have the syntax correct, it is probably a permissions issue.  FMS runs as user fmserver from group fmsadmin.  Make sure the destination folder has those as the owner.

As a best practice though it's not always a good idea to back up directly to a NAS:

 

- ethernet is much slower than internal i/o; so backups will take longer than necessary

- since it is an external device the number of "moving parts" increases and with it the risk of something going wrong

 

It may be a better idea to back up locally first and then use an OS script to copy the backup to the NAS.

That's likely it. However, the user and group are considered invalid. I'll try again from the host machine.

 

Thanks.

Posted

So: pause, back up, resume, and then use a shell script to push (copy) it out to a location where it's safe for the external backup software to back up from? (All listed clearly in FMS12_help.pdf)

 

I don't typically admin the servers on the projects, obviously. How does one usually "push" the backup directory? Isn't the resulting directory named for the date/time it was created? How does one then identify which to push?

Posted

No, you don't have to pause the files.  That's old school, and depending on how big the files are, it may have too big an impact on the users.  The FMS CLI command "backup" uses the true backup mechanism: it copies the live files to the backup folder and then, very briefly at the very end, pauses the live files to sync any changes made since it copied the live files at the start of the process.

The pause it uses is much shorter than if you manually pause the files then copy the files over.
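Invoking that mechanism from a script looks roughly like the line below. The credentials and destination path are placeholders, and the exact destination syntax should be checked against the fmsadmin documentation for your FMS version.

```shell
# Sketch (credentials and paths are placeholders): back up all hosted
# files via the FMS CLI, which uses the live-backup mechanism described
# above, then copy the result out with a separate script step.
fmsadmin backup -u admin -p mypassword -d "filemac:/Macintosh HD/FMS-Staging/"
```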

 

As to identifying the correct backup set: 

 

http://fmforums.com/forum/topic/39181-backups-more-backups/

 

http://www.filemaker-solutions.be/IMG/pdf/MoreBackups_v2.pdf
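The gist of identifying the right set can be sketched in shell: FMS stamps each backup set's folder name with a date/time, so the newest set sorts last lexically and can be picked by name. The naming pattern here is an assumption; verify it against your own Backups folder.

```shell
#!/bin/sh
# Return the name of the most recent backup set under $1. FMS appends a
# date/time stamp to each set's folder name (e.g. Daily_2014-05-01_2300),
# so a plain lexical sort puts the newest last. Naming is an assumption.
latest_backup_set() {
  root="$1"
  ls -1 "$root" | sort | tail -n 1
}
```

A copy script can then push only `"$root/$(latest_backup_set "$root")"` to the NAS instead of the whole folder.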

 

That's been around for almost 10 years now :)

 

Not picking on you, Barb, not at all.

But way too many FM developers don't get involved with the deployment and ultimately that is exactly where your solution will be evaluated.  It may be the most efficiently crafted FM solution out there but if it is deployed in a crappy manner and backups fail and the deployment fails, it will all have been for naught.

Posted

Wim says:

 

 

But way too many FM developers don't get involved with the deployment and ultimately that is exactly where your solution will be evaluated.  It may be the most efficiently crafted FM solution out there but if it is deployed in a crappy manner and backups fail and the deployment fails, it will all have been for naught.

 

Everyone please take note of this.  This is a critical point. And in the future it will be even more so.

 

Steven

Posted

Getting back to my original question: I'm going to plug a sizable Thunderbolt or USB drive into the Mac mini directly and have the backups saved there. The backups will be fast and reliable. Of course this means I will always have to be logged in to my system account (I believe). I only have a few hundred gigs left on the Mac mini drive, and with the number of backups I need, that drive space will soon disappear. Any opinions are welcome.

Rick.

Posted

Here's the normal process I go through when discussing this with a client:

(realizing that this is for your dev server, but the thinking process should be same)

 

Risk = Vulnerability x Probability x Impact

 

- first off, acknowledge that there is a greater risk.  The vulnerability is accidental removal of the external drive (and it happens all the time!).  Probability: hard to say; it depends on the quality of the hardware and where it is going to be deployed.  Impact: can be big. If there are no other internal backups, it could mean no backups at all are happening, and only the client can decide what kind of impact that has

 

- so set up at least some local backups to counter that risk

 

- when setting up a backup schema for clients there are two questions:

1) Restore Point: how much data are you willing to lose?  Is it OK if we have hourly backups? Daily backups?

2) Restore Timeframe: how quickly do you need to be back online?  How much downtime can you afford?

The answers to these two questions pretty much drive the cost of the backup setup.

 

- speed: should be OK over Thunderbolt; may not be good enough over USB, depending on the size of the solution.  Slow backups = more risk, and greater impact on the users

 

- definitely do not leave the machine logged in.  That's just a no-no from a security point of view

Posted

Thanks Wim,

My desktop and laptop are the only clients. This is more a matter of sync for me. On my desktop I had a solid backup scenario, but the Save a Copy As command won't run on FMS. Hence the need for a solid backup system.

Rick.

Posted

Good info, thanks. As a subcontractor, I am not usually involved in deployment...

Posted

I ended up getting a 2 TB USB 3 drive and it works a treat. I can now have the full 99 scheduled backups as well as progressive backups without worrying about storage space. The group of files backed up totals less than 50 MB. If they were huge I'd spend the extra dough for Thunderbolt (and probably will eventually), but at this point this works fine.

Wim, the issue with the NAS wasn't permissions, as the new drive has permissions typical of a drive attached to any of my machines, and they include neither fmserver nor fmsadmin. But it works. It took a while to figure out how FMS backs up, but now I understand that, with progressive backups and regular backups enabled, the regular backup folder contains what you'd expect (I have it set to daily) and the progressive backup folder always keeps up-to-date files plus the "changes". I have the progressive backup interval set to 10 minutes.

My thanks to all who have helped me with this issue,

Rick.

Posted

All developers, whether or not they administer the server where the solution is deployed or are responsible for any deployment options at all, need to understand the factors that govern performance and functionality on FileMaker Server.

 

As Wim pointed out, the success of the solution and the perception of that success are governed by what happens when the solution is actually deployed.  Threads such as this one add to the knowledge base for deployment best practices.  There will be more information coming on this topic area.

 

Steven

