fchieli2 Posted January 26, 2008 Hi, I've been pulling my hair out with a particular client of mine. I've developed a solution for them, simple stuff, 5 tables. It's hosted on their network, but I don't have access to the server. I know the server is version 8 or better. Now here's the problem: from time to time huge blocks of records are disappearing, consistently and in record ID order. So every record from rec ID 0000 to 3000 is now gone. Anything after 3000 is there, etc... Then a couple of months later everything from 3000 to 5000 goes! Now, I can exclude employees deleting records: I've created a tracking system that tells me who deletes what, and I've trained them over and over. I checked every script (there are very few involved) and every relationship (none of them will trigger a related record deletion). I'm starting to think there's something wrong with the server or with the way it's getting backed up. Their IT person uses Retrospect, which I don't think is the way to go. Could the backup software, a crash, or something else be causing the record loss? The weird thing is that all of this is happening only on one table...
Steven H. Blackwell Posted January 27, 2008 Well, you would not want to use Retrospect to back up live, hosted FileMaker Pro files. BTW, there is no Rec ID 0, but I understand what you're saying. Check your scripted imports, if you have any, to be sure they are not overwriting existing records. Steven
LaRetta Posted January 27, 2008 I just want to be sure ... "It's hosted on their network, but I don't have access to the server. I know the server is version 8 or better." You don't have FMS on the network server, do you?
fchieli2 (Author) Posted January 27, 2008 There's nothing being imported. The records are being entered manually by a bunch of employees... Once a record is entered it should just stay there; there are no importing or deletion scripts affecting that table... Once the problem started, I added a calculation field that gets the record ID of records in the affected table, just to see which records were being deleted. With this I found out that when the problem appears, it affects only older records, in blocks... it's almost as if the server couldn't take any more data and had to make room by dropping the old ones. I know two things about the server: it's backed up with Retrospect (their IT person emailed me that), and it's hosting many, many different solutions. Could the server be underpowered and drop data like that? Could my file on the server be corrupted and drop data consistently?
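[Editor's aside: the kind of gap check described above, finding which contiguous blocks of record IDs have vanished, can be sketched in a few lines. This is a hypothetical Python illustration, not a FileMaker calculation; the function name and sample IDs are made up.]

```python
def missing_blocks(surviving_ids):
    """Given the record IDs that still exist, return the (start, end)
    ranges of IDs that are missing between consecutive survivors.
    Assumes IDs originally incremented by 1, as in the .fp7 format."""
    ids = sorted(surviving_ids)
    gaps = []
    for prev, curr in zip(ids, ids[1:]):
        if curr - prev > 1:
            # Everything strictly between prev and curr was deleted
            gaps.append((prev + 1, curr - 1))
    return gaps

# Example: a block from 3000 through 4999 has disappeared
print(missing_blocks([2998, 2999, 5000, 5001, 5002]))
# → [(3000, 4999)]
```

A single large gap like this points to one bulk event (a bad restore, a crash, or backup damage) rather than users deleting records one at a time.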
fchieli2 (Author) Posted January 27, 2008 Not that I know of, I asked the IT person the same question...
LaRetta Posted January 27, 2008 1) You cannot trust RecordID because it will NOT increment by 1. It can jump from 17000 to 36242 in one record. 2) "I know two things about the server: it's backed up with Retrospect (their IT person emailed me that), and it's hosting many, many different solutions." Very, very bad. What 'solutions' are being hosted? If they are FileMaker files, then they ALL have problems if they are backed up with Retrospect. Again ... we need (and YOU need) to know specifically how you are set up. If you cannot tell us this information, then you don't know enough, and it is very dangerous, because obviously the IT person hasn't a clue how FileMaker needs to be handled when networked. You are risking your data by not knowing the answers.
Steven H. Blackwell Posted January 27, 2008 "You cannot trust RecordID because it will NOT increment by 1. It can jump from 17000 to 36242 in one record." This was true in older versions, although the increment "jumps" were well known and specifically defined. In the .fp7 format, however, the Rec IDs do start at 1 and increment by 1. Cloning a file will reset this. If you're seeing consistent breaks in the increments, then there is something else going on in the file. Steven
LaRetta Posted January 27, 2008 I haven't read a thing about this! Thanks for letting me know, Steven!
fchieli2 (Author) Posted January 28, 2008 (edited) They do increment by one. Now all that is left in the system is records with record IDs above 8134. When they started using the solution, the first record ID was around 1000 or so. So when the data loss happens, it always affects older records, in blocks. I can affirm this because I see that the records that survived are perfectly sequential (8134, 8135, etc...).

A little bit about the solution itself: I have an Inventory table, a Loan table, and a Borrowed Items table. Inventory contains clothing items; when they get loaned, the employee creates a loan with basic info etc... and then adds items from inventory to the loan. This actually creates a line item that exists in the Borrowed Items table. The records in Borrowed Items are the ones disappearing. No one can access the Borrowed Items records directly. They're visible only inside a portal on the Loan layout, and they can only be deleted via button/script, one by one (no Delete Found Records available). The script also logs who has deleted what, and there's no record showing anyone deleting more than a few items here and there (if they add items by mistake). No record can be deleted via a relationship.

So I've narrowed this down to: server problems (backup related, a server crash, memory problems, or the server hitting its limit of databases/records being hosted), or their IT guy messing up and trying to restore and reimport data after a crash (though I will never know, because I don't have access to the server...). Thanks for your assistance! Edited January 28, 2008 by Guest
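[Editor's aside: the one-at-a-time delete-with-logging pattern described above can be sketched as follows. This is a hypothetical Python illustration of the design, not the actual FileMaker script; in the real solution this would be a script step writing to a log table, and all names here are made up.]

```python
import datetime

deletion_log = []  # stands in for the audit-log table

def delete_line_item(items, record_id, user):
    """Delete a single borrowed-item record, logging who removed it
    and when, before the record is actually destroyed."""
    if record_id not in items:
        raise KeyError(f"no such record: {record_id}")
    deletion_log.append({
        "record_id": record_id,
        "user": user,
        "when": datetime.datetime.now().isoformat(),
    })
    del items[record_id]

# A couple of line items keyed by record ID
items = {8134: "coat", 8135: "scarf"}
delete_line_item(items, 8135, "jdoe")
print(len(items), deletion_log[0]["user"])  # → 1 jdoe
```

Because every deletion passes through this single choke point, any block of records that vanishes *without* matching log entries must have been removed outside the solution, which is what points the finger at the server or its backup process.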