
FileMaker Pro 9 - database recover fails silently



Recommended Posts

  • Newbies

We are using the Recover feature in FM Pro 9 to periodically remove uneditable, blank records from a large database. We were surprised to find that the recovery process did not recover the database consistently. We ran several tests of the recover process using the same corrupt file on different computers, and observed that Recover removed the blank records on some computers but not others. We then discovered that Recover was running out of disk space on the computers where it didn't remove the records. When Recover ran out of disk space, it cleaned up its temporary files and displayed the "recovery complete" dialog - no sign of trouble. We also found that Recover needs roughly 4x the size of the file being recovered in temporary disk space (our file is 13GB, so over 50GB of temporary space). We observed the same behavior in versions 7, 8, and 9.
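Since Recover cleans up its temp files and reports success even after running out of space, a pre-flight free-space check before launching it can catch this failure mode. Below is a minimal sketch in Python, assuming the roughly 4x temporary-space figure observed above; the file name and temp directory are placeholders, and the multiplier is this thread's rule of thumb, not documented FileMaker behavior:

```python
import os
import shutil

# Rule of thumb from this thread: Recover appears to need roughly 4x the
# file's size in temporary disk space. The multiplier and paths below are
# assumptions for illustration, not documented FileMaker behavior.
TEMP_SPACE_MULTIPLIER = 4

def enough_space_to_recover(db_path, temp_dir):
    """Return True if temp_dir has at least 4x the database's size free."""
    needed = os.path.getsize(db_path) * TEMP_SPACE_MULTIPLIER
    free = shutil.disk_usage(temp_dir).free
    print(f"need ~{needed / 2**30:.1f} GB, have {free / 2**30:.1f} GB free")
    return free >= needed

if __name__ == "__main__":
    # Placeholder file name and temp location.
    if not enough_space_to_recover("Billing.fp7", "C:\\Temp"):
        raise SystemExit("Not enough temporary disk space - aborting Recover.")
```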


The Recover function should not be used for this purpose. It is meant to recover files that have failed (i.e., become corrupted and will not open).

Lee

p.s.

Have you tried the File Maintenance Tool?

Main Menu >> Tools >> File Maintenance.

Do a find for the Tool name and see if there has been any discussion about it. I kind of remember a caution about using it.


I'd suggest skimming through the help files and the knowledge base on the FileMaker site for a better understanding of what Recover does and is meant for. I'd also look for possible causes of the corruption, like antivirus software scanning open database files, or scanning backup files while a backup is running, etc.

Are you trying to Save A Copy as Compacted first when these issues crop up? It sounds like your blank records could be a result of corrupt indexes, which a compacted copy should help eliminate.

Lee, yes, the issue with File Maintenance is that it modifies the open file, so if it's run on a truly munged up file, it can make it even worse. Save A Copy as Compacted does the same File Maintenance, but on a separate copy of the file, leaving the original untouched.

As to the actual recover inconsistencies noted above, I can't speak to that. It's interesting, though.


  • Newbies

Thanks, guys. Save a Copy as Compacted does not correct the problem we are seeing. The problem is very rare - 5 occurrences in 9 months, out of roughly 2,500 inserts into the database. We have staff in 15 offices across the state accessing the database. The problem seems to be network related (connection glitches?), but it is difficult to troubleshoot because it is so rare. Our strategy is to figure out the most reliable way to fix the problem, then monitor the system to see if a pattern develops over time.

I wanted to report Recover's odd behavior to the forum. I'd expect the tool to check that it has enough disk space before starting, or, at the very least, to report that it encountered problems and ended abnormally.
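For the monitoring side of that strategy, one low-effort option is to log network dropouts between an office machine and the host, so the rare blank-record events can be correlated against connection glitches. A rough Python sketch; the host name is a placeholder:

```python
import platform
import subprocess
import time
from datetime import datetime

HOST = "fm-host.example.com"  # placeholder: the machine hosting the database
COUNT_FLAG = "-n" if platform.system() == "Windows" else "-c"

def log_connection_glitches(interval_sec=30):
    """Ping the database host periodically, appending a timestamped line to
    glitches.log whenever a ping fails, so rare blank-record events can be
    matched against network dropouts."""
    with open("glitches.log", "a") as log:
        while True:
            result = subprocess.run(["ping", COUNT_FLAG, "1", HOST],
                                    capture_output=True)
            if result.returncode != 0:
                log.write(datetime.now().isoformat() + " ping failed\n")
                log.flush()
            time.sleep(interval_sec)

if __name__ == "__main__":
    log_connection_glitches()
```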


It could be network dropouts, though as you say that's hard to diagnose. If users are tunneling through a VPN rather than, say, logging into a terminal server or Citrix, there's an increased chance of dropped or lost packets.

But just to reiterate, since you don't specify your whole maintenance/repair process - you shouldn't be re-hosting recovered files. You should be using Recover to obtain a data set that you can then import into a last known good clone of your file.

You probably already know that, I know - just making sure.
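To keep that workflow disciplined, a small staging script can ensure Recover only ever runs against a copy, and that a known-good clone is on hand before you start. A sketch with hypothetical paths; the Recover run and the import into the clone are still done manually in FileMaker Pro:

```python
import shutil
from pathlib import Path

# All paths are hypothetical placeholders.
CORRUPT = Path(r"C:\FMData\Billing.fp7")           # the damaged file
CLONE   = Path(r"C:\FMBackups\Billing_Clone.fp7")  # last known good clone
WORKDIR = Path(r"C:\FMRecovery")

def stage_recover_workflow():
    """Copy the corrupt file and the known-good clone into a work folder so
    that Recover and the subsequent import never touch the originals."""
    if not CLONE.exists():
        raise SystemExit("No known-good clone found - stop and locate one first.")
    WORKDIR.mkdir(exist_ok=True)
    shutil.copy2(CORRUPT, WORKDIR / CORRUPT.name)
    shutil.copy2(CLONE, WORKDIR / CLONE.name)
    print("Staged. Run Recover on the copy, import its data into the clone,")
    print("then re-host the populated clone - never the recovered file itself.")

if __name__ == "__main__":
    stage_recover_workflow()
```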


  • 4 weeks later...

I've encountered the same issue. It seemed to be coming from users working on the file during a backup. I know this SHOULDN'T be an issue, but it was. 13GB isn't that large considering each database can hold 8TB, but a 13GB file will take a while to back up. First, I would suggest scheduling backups for when no one is expected to be using the file (if at all possible). Next, take the file, omit the phantom records, and then import the remaining found set into a clean, empty file. That should eliminate the records. Just a suggestion - let us know if that works.
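As an alternative to that manual Omit step: if the records really are completely blank, you could export the table, drop the all-blank rows, and import the cleaned set into the empty clone. A rough sketch, assuming a CSV export and hypothetical file names (this is a swapped-in scripted approach, not a FileMaker feature):

```python
import csv

SOURCE  = "export.csv"        # hypothetical: table exported from FileMaker
CLEANED = "export_clean.csv"  # re-import this into the empty clone

def drop_blank_rows(src, dst):
    """Copy src to dst, omitting rows whose fields are all empty.
    Returns the number of rows dropped."""
    dropped = 0
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        writer = csv.writer(fout)
        for row in csv.reader(fin):
            if any(field.strip() for field in row):
                writer.writerow(row)
            else:
                dropped += 1
    return dropped

if __name__ == "__main__":
    print(f"dropped {drop_blank_rows(SOURCE, CLEANED)} blank rows")
```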

