
Deleting duplicate records


This topic is 5719 days old. Please don't post here. Open a new topic instead.


I tried the self-join relationship method described in Help, but I get a message in the calculation box saying either that it cannot find the table or that I am not entering the data correctly. I want to delete duplicate records, but not manually, because there are many of them. Please help.


  • 3 weeks later...

We'd need more information to be able to help you.

You seem to be on the right track, but without more detail we can't figure out where the problem is.

I'm trying to put something similar together myself, but am running into a problem with my calculation. I'll figure it out eventually, I'm sure.


Create a calculation field (cDupKey) that concatenates all the fields you use as criteria for what counts as a duplicate. Then use a script similar to this:

Sort Records [ No Dialog; cDupKey ]
Go to Record/Request/Page [ First ]
Set Variable [ $DupCheck; cDupKey ]
Go to Record/Request/Page [ Next; Exit after last ]
Loop
  If [ cDupKey = $DupCheck ]
    # remember whether we are on the last record BEFORE deleting,
    # because Delete Record shifts the current record pointer
    Set Variable [ $onLast; Get ( RecordNumber ) = Get ( FoundCount ) ]
    Delete Record/Request [ No Dialog ]
    Exit Loop If [ $onLast ]
  Else
    Set Variable [ $DupCheck; cDupKey ]
    Go to Record/Request/Page [ Next; Exit after last ]
  End If
End Loop
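For readers outside FileMaker, the same sort-then-scan idea can be sketched in Python. This is only an illustration of the technique, not the original script; the record and field names below are made up:

```python
# Sort records by the concatenated duplicate key, then keep only the
# first record in each run of equal keys -- the same idea as the
# FileMaker script above ($DupCheck becomes prev_key).

def delete_duplicates(records, key_fields):
    """Return records with duplicates removed; the first record in
    each group (by the concatenated key) is the one that survives."""
    def dup_key(rec):
        # plays the role of the cDupKey concatenation field
        return "|".join(str(rec[f]) for f in key_fields)

    kept = []
    prev_key = None
    for rec in sorted(records, key=dup_key):
        k = dup_key(rec)
        if k != prev_key:
            kept.append(rec)      # first record with this key: keep it
            prev_key = k
        # else: duplicate -> drop it (the "Delete Record" branch)
    return kept

people = [
    {"name": "Ann", "email": "ann@example.com"},
    {"name": "Ann", "email": "ann@example.com"},
    {"name": "Bob", "email": "bob@example.com"},
]
print(delete_duplicates(people, ["name", "email"]))
```

Because `sorted` is stable, the "first" record in each group is the one that appeared earliest in the original list, just as the script keeps the first record in the sorted found set.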


  • 4 weeks later...

I tried the self-join technique to identify duplicates too, but even following the instructions very carefully, I only get "duplicate" as the identifier for all records. There seems to be something missing in the explanation.

I'd like to use this as a simple automatic way to identify a duplicate during data entry without having to run a script. Presumably one could display the field value "duplicate" in a bright color and leave the unique value empty.

Has anyone successfully recreated this technique without a script?
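For reference, the script-free flag described in the Help file works because a relationship matched on the key field always points to the first record with that key, so a record is "unique" only when it is its own first match. A minimal Python sketch of that logic, with hypothetical field names:

```python
# A record is "unique" if it is the FIRST record carrying its key
# (i.e. the record the self-join relationship matches), otherwise
# it is flagged "duplicate" -- no script required, just a comparison.

def flag_duplicates(records, key_field):
    first_serial = {}  # key value -> serial of first record with that key
    for rec in records:
        first_serial.setdefault(rec[key_field], rec["serial"])
    return [
        "unique" if first_serial[rec[key_field]] == rec["serial"] else "duplicate"
        for rec in records
    ]

recs = [
    {"serial": 1, "key": "A"},
    {"serial": 2, "key": "B"},
    {"serial": 3, "key": "A"},
]
print(flag_duplicates(recs, "key"))  # ['unique', 'unique', 'duplicate']
```

This mirrors the FileMaker calc discussed later in the thread: compare a serial number on the current record against the serial number seen through the self-join.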


Am I missing something? I don't see a file linked to your response on that page, and you seem to be describing an active search, not a passive relationship. To repeat: I don't want to run a script; I simply want to see a flag when I have a duplicate, and the technique in the Help file suggests this is possible.

In troubleshooting, I displayed both related fields; the latter fails (which results in "duplicate") because it reports . Perhaps because my key field is a concatenation of values from 3 tables?


[I created a dummy file with limited fields, and the technique actually works well]

OK - so a LITTLE more detail: I have a media library database. To keep it simple (= cheap for the client) it has: one main table for most of the data; a media/category table to generate media-specific categories through the relationship; a subjects table of unique subjects; a join table that lets a separate portal display related subjects; and a self-join, as in the FileMaker Help example.

The unique ID field is in the main data table and is a calculation that concatenates an ID for the record, an ID for the category/medium, and an ID for the unique subject. This unique ID field is the key field for the self-join.

The self-join related unique ID displays the error , so the CkDup calculation field always displays "duplicate", because the calc is If ( count = self_join::count ; "unique" ; "duplicate" ), where count is an auto-incremented serial number field.

I'd just like to know what to look for to troubleshoot the error - is it the two additional relationships? Is it because of a dynamic calc?


