

Recommended Posts

Posted

Hello Everyone,

My current database project is made up of read-only data imported from fixed-field text files. All of the fields are calculations that grab certain characters from the imported line of text using the Middle function, so none of the data is modifiable. Most of these text files contain an even 15,000 lines, or records. One import, however, shows 15,001 records even though its corresponding text file has only 15,000 lines (i.e., a duplicate record snuck in somehow).
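To illustrate the setup (the field names and character positions here are made up, but the pattern is the same), each field is simply a calculation along the lines of:

   CustomerName = Middle ( ImportLine ; 1 ; 30 )
   AccountCode = Middle ( ImportLine ; 31 ; 8 )

where ImportLine is the text field that receives the raw line from the fixed-field file.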

Without visually scanning all of the records, is there an easy way to find the duplicate? I'm currently experimenting with validation on the import field to ensure that each line of text is unique, but I'm afraid this might slow down an import that already takes 10 minutes. It may be a necessary failsafe, but it only helps future imports; I still need a way to identify the duplicate that has already been created. I know I could find the records by their import number (a field created in the table), delete them, and re-import, but that seems a waste if there is a way to capture just the duplicate and delete it.

Any ideas will be appreciated greatly. Thank you in advance for your time and attention.

Mac Hammer

Posted

Go to Find mode, display the status area, and pop up the list of symbols. "!" (find duplicate values) is the one you want.
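For example, assuming the duplicates would show up in the field that holds the imported line of text (called ImportLine below, with a placeholder table name), the scripted equivalent is just:

   Enter Find Mode [ ]
   Set Field [ YourTable::ImportLine ; "!" ]
   Perform Find [ ]

That finds every record whose value in that field occurs more than once, so with a single duplicate you should get a found set of 2.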

Posted

Another thing to check is whether you have a blank record that was accidentally created before or after you did the import. A quick way to check is to unsort the table and then look at the first and last records.
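If it does turn out to be a blank record, you can also find it directly rather than scrolling: in Find mode, type just an = (and nothing else) into the imported-text field and perform the find — "=" on its own matches records where that field is empty. (Use whatever your field is actually called, of course.)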

Posted

Thank you all.

I had already sorted to look for a blank record, but I was not aware of the "!" symbol in a Find request. I've used it, and it found duplicates of another variety not related to my exact records, but I've got the right tool now; I just need to search the right field and I'll be set.

Thanks, amigos!

Mac Hammer

Posted

Hey guys,

I have a similar problem: I want a script that will automatically delete any duplicate records in a specific table. I'm sure it can be done; I just can't figure it out. I've been looking through past posts, and this was the closest one I found related to what I need to accomplish. Thanks ahead of time.

Posted

Although I haven't tried this, it seems easy enough. First, perform a find using the "!" symbol to see a list of your duplicate records. Then build a script that includes a Perform Find step and restore that last find as the criteria. Finally, use the script step that deletes the found set. You might want to build a custom dialog (or four) into it that asks whether you "really, really, really, really" know what you are doing before you allow the records to be deleted. }:|
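Roughly, and purely as an untested sketch (table and field names are placeholders), that would look like:

   Enter Find Mode [ ]
   Set Field [ YourTable::fieldWhatever ; "!" ]
   Perform Find [ ]
   Show Custom Dialog [ "Delete " & Get ( FoundCount ) & " records?" ]
   If [ Get ( LastMessageChoice ) = 1 ]
      Delete All Records [ No dialog ]
   End If

One caution: the "!" find returns every record that has a duplicate, originals included, so deleting the whole found set would remove the originals as well. You would want to omit one record of each matching group first, or use a looping approach like the one below, before deleting.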

Best of luck,

Mac

Posted

Since you guys are on FM 8.5, do a search in the help section for "Finding duplicate values", including the quotes. It will give you some info.

There should also be lots of past posts in these forums about omitting, deleting, and marking duplicate records. Here is one technique:

   Sort Records [ No dialog; fieldWhatever ]
   Go to Record/Request/Page [ First ]
   Set Variable [ $DupCheck; fieldWhatever ]   (set a global field instead if you're using FM 7 or earlier)
   Go to Record/Request/Page [ Next ]
   Loop
      If [ fieldWhatever = $DupCheck ]
         Omit Record
      Else
         Set Variable [ $DupCheck; fieldWhatever ]
         Go to Record/Request/Page [ Next; Exit after last ]
      End If
   End Loop
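When the loop finishes, the found set contains only the unique records and the duplicates are the ones that were omitted. If you started from Show All Records and actually want to delete the duplicates rather than just hide them, a rough (untested) follow-on would be:

   Show Omitted Only
   Delete All Records [ No dialog ]

Show Omitted Only swaps the found set, so you end up looking at only the omitted duplicates before deleting them — worth stepping through on a copy of the file first.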
