
Recommended Posts

Posted

Hi everyone,

I'm working with a database solution here at work where we track a lot of client data. We have a request file that gives us a list of clients who have requested our product. This table, with data entered by import, now contains nearly 500,000 records. Then we have another table in the same file that contains records showing how we processed those requests. This table also holds nearly 500,000 records and should, in theory, have the same number of records as the request table. There are also other tables with smaller record counts used to help wrangle the data between these two primary tables. In total, we have somewhere slightly south of 1.5 million records in the database file, across 10 different tables.

My problem is this. The request data gets broken down into files that contain no more than 15,000 records each. We number these files as they are processed, and I can search the database for all records contained in a given numbered file. I have also created an accounting layout that calculates the number of records in each file and displays them all on one report, so I can verify that we have ALL records from each file. There are approximately 55 individual import files, totaling the 500,000 records in the database table.

I am losing records. Looking at the accounting layout, I will find an import file that contains 14,999 records instead of 15,000. So I re-import the data, asking FMP to update existing records and to add any new records found in the import. This brings that file back up to its 15,000 records. I'll move down the list, doing the same process. After a few imports, I'll look back at the list and see that one of my imports now contains fewer records than expected AGAIN. Records have been lost; sometimes as many as 6,000 records are gone. My goal is to account for all records in all 55 files, but I can never get there. By the time I get close, I look back, see that I've lost records again, and have to start over.
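For clarity, here is roughly what the accounting layout is checking, sketched in Python rather than FileMaker (the "run_file" field name is just a stand-in for my actual field; the 15,000-per-file target is as described above, and the last file may legitimately hold fewer):

```python
from collections import Counter

EXPECTED_PER_FILE = 15000  # each numbered run file should hold 15,000 records

def reconcile(records):
    """records: iterable of dicts, each carrying a 'run_file' number (assumed field name).
    Prints a per-file count and flags any file that comes up short."""
    counts = Counter(r["run_file"] for r in records)
    for run_file in sorted(counts):
        n = counts[run_file]
        status = "OK" if n >= EXPECTED_PER_FILE else f"SHORT by {EXPECTED_PER_FILE - n}"
        print(f"File {run_file:>3}: {n:>6} records  {status}")
    return counts

# Toy example with two run files:
reconcile([{"run_file": 1}, {"run_file": 1}, {"run_file": 2}])
```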

Am I maxing FMP out? I thought I'd read that FMP can hold data into the terabytes, and we are nowhere close to that sort of file size. Perhaps I'm wrong? Please help me figure out why I'm losing these records. I really need your help!

Thank you, as always.

Mac Hammer

Posted

I think it's far more likely that you have a bug in your scripts or in the procedure being used. If you've really checked over everything, you could try recovering the file to make sure it's not corruption, but from the sound of it I really doubt it is.

Posted

Thanks Shadow,

I'm going to take it offline right now, do the rebuild, and see what happens. I wish it were possible that this was a scripting error, but in this instance I'm importing all of this data manually, so there is little room for a missed script step. I'm really hoping the rebuild will help.

Do you have any info on max file size? I can't seem to find that listed.

Thank you for the help.

Mac

Posted

After performing the database rebuild, a damaged field definition was fixed. I'm going back to my import efforts to see what happens. In all my time of doing database rebuilds, I have never seen that particular error reported as fixed, so I'm hopeful...

We'll see.

Mac

Posted

More info:

I may have figured out the problem. When I see a discrepancy in the number of records that should be in a given run file, I re-import the file, telling the import to update existing records. However, I had been doing that import with the entire data set as the found set, and it appears that this can mess things up. I have now revised my process: I first find the existing records for the run file in question, and THEN import, updating the existing records. It 'seems' to be working now. I'm getting further into the reconciliation, so the data seems to be holding better.
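In case it helps anyone following along, here is roughly what the revised process amounts to, sketched in Python rather than FileMaker script steps (the "client_id" and "run_file" field names are made up for illustration):

```python
def upsert_into_found_set(table, incoming, run_file):
    """Update matching records only within the found set for one run file;
    append incoming records that have no match there."""
    # "Find" the existing records for this run file first (the found set)
    found = {r["client_id"]: r for r in table if r["run_file"] == run_file}
    for rec in incoming:
        match = found.get(rec["client_id"])
        if match is not None:
            match.update(rec)        # update the existing record in place
        else:
            table.append(dict(rec))  # add it as a new record
    return table
```

The point of the change is simply that the update-matching step never touches records belonging to the other run files.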

Mac

Posted

Import/updates can also fail if any of the records are in use during the import step. The import summary dialog box should point this out when it happens, though.
