
This topic is 6602 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

Hello Everyone,

My current DB is used to regularly import data from other sources. The files we import are tab-delimited text files. Nothing special about them, except that they usually hold about 15,000 lines/records. The problem I'm having is that in a 15,000-line import, I don't always get 15,000 records. My most recent import was smaller (2,319 records), but once imported, I landed only 2,290 records. That is a loss of about 1.3 percent of the records. Usually, I miss 1 or 2 in a 15K import.

If I delete the found set after the import and cast my line for them again, I usually can land all 15K. However, this is quite unacceptable in terms of data loss. Every record that is not imported is a customer that doesn't receive the requested item. 1.3 percent of customers not getting their item is going to translate into a horrible customer service rating.

What can be happening here? All of the records are unique, so there should be no dropped duplicates.

Are there practices I can put in place to ensure that ALL records are grabbed on the first import attempt?
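For what it's worth, one practice I could imagine is a quick sanity check on the text file before FileMaker ever touches it. Here's a rough Python sketch (the file name, and the assumption that the first line sets the field count, are placeholders, not part of my actual setup) that counts lines and flags any with an odd field count or stray control characters, which are the kinds of lines an importer might silently skip:

```python
# Hypothetical pre-import check for a tab-delimited export. Before importing,
# verify the line count and flag lines whose field count differs from the
# first line's, or that contain stray carriage returns or NUL bytes
# (common reasons an importer silently skips rows). The assumption that
# line 1 defines the expected field count is illustrative only.
def scan_tab_file(path, expected_fields=None):
    with open(path, "r", encoding="utf-8", errors="replace", newline="") as f:
        lines = f.read().split("\n")
    if lines and lines[-1] == "":
        lines.pop()  # a trailing newline yields one empty trailing element
    bad_lines = []
    if expected_fields is None and lines:
        expected_fields = lines[0].count("\t") + 1
    for i, line in enumerate(lines, start=1):
        fields = line.count("\t") + 1
        if fields != expected_fields or "\r" in line or "\x00" in line:
            bad_lines.append((i, fields))  # (line number, field count seen)
    return len(lines), bad_lines
```

If the reported line count already disagrees with what FileMaker imports, at least I'd know where to look.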

As always, thank you in advance for your time and assistance.

Mac Hammer

Posted

It might be a problem with your db or with the text file. To distinguish between the two, you can create a fresh, empty FM test db with just the fields needed for the import.

If the problem also occurs when importing into the fresh db, it is likely a question of the text file. Otherwise, your db is the origin.
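Another way to narrow it down: since the records are unique, compare the key column in the text file against the keys that actually landed in the db after the import. A rough Python sketch of the idea (the key column index of 0, and treating every line as data rather than a header, are assumptions for illustration):

```python
# Hypothetical reconciliation sketch: compare the unique keys in the source
# file against the keys that actually made it into the database, to list
# exactly which records were dropped. key_col=0 is an assumption; adjust it
# to whichever tab-delimited column holds your unique identifier.
def dropped_keys(path, imported_keys, key_col=0):
    imported = set(imported_keys)
    with open(path, "r", encoding="utf-8") as f:
        source = [line.rstrip("\r\n").split("\t")[key_col]
                  for line in f if line.strip()]
    return [k for k in source if k not in imported]
```

Looking at the actual dropped records (odd characters, extra tabs, embedded returns?) often points straight at the cause.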

-jens

Posted

Hi MacHammer:

Is there an auto-generated Serial ID field in your table? If so, is it set to generate the ID on creation or on commit?

FWIW, I've seen this problem solved by setting the Serial ID to generate upon creation.

Posted

YES! I do have the records get timestamped, and I create a record number. Is commit better? I need to track this stuff (time/record number) as part of the database management, but if "on commit" would work better, I'm certainly open to that. I'm not familiar with how the two differ, but if they achieve the same thing while allowing a more reliable import, I'm all for it.

Thanks! I look forward to any more light you can shed on this.

Mac

Posted

Hey, Mac!

You could try it both ways, but I was hoping you would say your ID field is currently using "on Commit" ... then hoping that changing it to "on Creation" would solve the problem. (Kinda stringin' together a bunch of hopes, huh? Well, I'm generally a hopeful kinda guy.)

Greatly abridged history ... Before FM7, Serial ID fields were populated at the moment a record was created. Lots of panties were bunched when users would then immediately delete the record, or simply bail out of the creation process, because it left a "gap" in the numeric order of the Serial IDs, and this was deeply uncomfortable for many developers (exactly why is debatable). This led to a host of work-arounds and some functionalities being added by FM.

Nowadays, we can choose whether to generate/populate a Serial ID when (1) the record is merely created or (2) the user creates and commits the record, be it through FM's native commands or through a scripting routine.

That's about all I've got. Hopefully (there it is again), some of our gurus could steer us both toward a resolution of the problem. I would like to know, myself!
