
This topic is 5419 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

Apparently, our hourly server backup which pauses the database is causing our PHP web clients to receive a fatal error (as opposed to the FM clients which receive a "coffee cup"). I would imagine this is common and that there's a best practice. We appreciate all comments, Barbara.

Posted

Are you sure that backup every hour is necessary?

I have a similar problem, and the only solution I've found is to back up no more than once per day, and not during rush hours. My server runs on an iMac under Tiger (10.4), which isn't a good idea at all: serious systems should run on server hardware. Otherwise the Mac is so stable that I'm sure I won't lose data.

Anyway, this is a very interesting topic!

Posted

Is the backup process explicitly pausing the databases? Or is it a normal scheduled backup?

Disabling the option to verify the backups might speed things up a bit, but introduces a small risk.

Posted

Thank you for your responses. I hate to double-post, but we also asked on TechNet, and were likewise asked to revisit our need for an hourly backup. The system has just rolled out, so hourly backups seem necessary, but it's a point worth revisiting.

The backup takes 1–2 minutes, and by its nature it explicitly pauses the databases, Vaughan. What setting am I missing? I'm not sure we'd want to disable verification, for obvious reasons. As you'll read on TechNet, looking at the entire server-to-storage-device chain was suggested; perhaps optimizing that pipe will speed up the backups.

However, I'm not alone, huh? A running backup leaves PHP clients unable to connect for the duration.

Posted

FYI, the fatal error arises from a request that goes more than 30 seconds with no response, but you probably knew that. You can adjust that limit in the API if you deem it necessary; alternatively, you could implement caching.

Posted

That is how backups work; I have gotten the same result before.

The only option I can imagine (though I haven't implemented it, since I back up only once per day) is to add another field whose content appears when the backup starts (a bit earlier, in fact) and disappears after the backup finishes.

PHP checks this field, and while a backup is running the browser cannot change content; a "please be patient..." warning is shown instead.

This slows PHP down a bit, but it is better than getting a fatal error.
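The flag-field idea above can be sketched as follows. This is an illustrative Python sketch, not the thread's actual PHP: `fm_query` is a hypothetical placeholder for a real FileMaker request, and the in-memory `BACKUP_FLAG` stands in for the flag field in the database.

```python
import time

BACKUP_FLAG = {"value": False}  # stands in for the flag field in the database


def fm_query(request):
    """Placeholder for a real FileMaker query (hypothetical helper)."""
    return {"request": request, "ok": True}


def guarded_query(request, retries=3, wait_seconds=5):
    """Check the backup flag before querying, so we show a 'be patient'
    notice instead of letting the request time out with a fatal error."""
    for _ in range(retries):
        if not BACKUP_FLAG["value"]:
            return fm_query(request)
        time.sleep(wait_seconds)  # backup in progress: wait, then retry
    return {"ok": False, "message": "Backup in progress -- please be patient..."}
```

The key design point is that the cheap flag check happens before the expensive query, so the web client never issues a request that is doomed to hang for 30 seconds.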

  • 3 weeks later...
Posted

Caching, Genx? Can you tell me more about this?

We've switched to using shell scripts, and the backup now takes 7 seconds. Hmm.
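For anyone curious how a scripted backup might look: a minimal sketch, assuming FileMaker Server's `fmsadmin` CLI is on the PATH. The destination path and credentials below are placeholders, not from the poster's setup.

```python
import subprocess


def build_backup_command(dest, user="admin", password="secret"):
    """Build the fmsadmin invocation (kept separate so it can be inspected)."""
    return ["fmsadmin", "backup", "-d", dest, "-u", user, "-p", password]


def run_backup(dest):
    """Run the backup and report success; a sketch, not production error handling."""
    result = subprocess.run(build_backup_command(dest), capture_output=True)
    return result.returncode == 0
```

Driving `fmsadmin` from cron or launchd instead of the built-in schedules is presumably what makes the pause window so short here, though the thread doesn't say exactly what the scripts do.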

  • 1 month later...
Posted

Caching involves storing the results of FileMaker queries locally on the web server and referring to those, rather than to the FileMaker server, in certain circumstances. Normally it's used to increase the efficiency of the application and avoid taxing the database server unnecessarily. For example, if the cache is older than 30 minutes, the script requests the data from FileMaker again and overwrites the cache; if the cache has been written to within the past 30 minutes, the script simply reads the result from the cache instead of from the FileMaker server.

In your case you might have it write to the cache after every request and fall back to the cache whenever the FileMaker server times out. I don't know if it's worth it for a 7-second outage, but it depends on how critical the application is.
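The cache-with-fallback pattern described above can be sketched like this. Again an illustrative Python sketch (the thread uses the FileMaker PHP API; `fetch_from_server` is a hypothetical stand-in for the real query): results are written to a local cache on every request, and the stale cache is served if the database server doesn't answer.

```python
import time

CACHE = {}            # key -> (timestamp, result)
CACHE_TTL = 30 * 60   # refresh after 30 minutes, as in the example above


def fetch_from_server(key):
    """Placeholder for a real FileMaker query; may raise TimeoutError."""
    return f"fresh result for {key}"


def cached_query(key, now=None):
    now = time.time() if now is None else now
    entry = CACHE.get(key)
    if entry is not None and now - entry[0] < CACHE_TTL:
        return entry[1]                 # cache is fresh enough: skip the server
    try:
        result = fetch_from_server(key)
        CACHE[key] = (now, result)      # overwrite the cache on every request
        return result
    except TimeoutError:
        if entry is not None:
            return entry[1]             # server paused (e.g. backup): serve stale
        raise
```

The fallback branch is what matters for the backup problem: during the pause the web client serves the last known result instead of surfacing a fatal error.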

