
Posted (edited)

Database MAIN.fp7 records (example with 3 records):

                 #1            #2            #3
End Customer:    Agil          Agil          Agil
Sub Customer:    Enterprise    Enterprise    Local
Circuit Number:  01-aszz-222   01-aszz-554   01-as

While in MAIN.fp7, on a layout from the MAIN database, I want to perform a search that returns only unique records based on End Customer and Sub Customer.

Desired results from the search - two records:

Agil, Enterprise

&

Agil, Local

I do not want to see the "other" Agil, Enterprise record.

Can someone help me?

Posted

I've been through this problem on the former forum and got lots of help. The answer: there is no easy way to find "distinct" records. For example, if your records are:

blue

blue

red

blue

red

red

yellow

blue

I assume you want to do a search that returns only the following:

blue

red

yellow

No - I am told FM8 has no way to do that with a quick find. Several people wrote complex scripts for me, but I ended up using a different database for that portion of my needs. It seems so basic: you import huge amounts of data and want to create a table of only the "distinct" values. I use that table as the parent table.

Fred

Posted

Thanks for the reply -

You are correct - I want to see red, blue, and yellow.

Unbelievable that they do not make this easy!

Thanks,

Nat

Posted

You may be able to normalize your tables so that those distinct values live in a separate table. This sounds like what Fred has done to solve his issue.

The other option is to use a columnar report with sub-summary parts. I can't tell from your description how the data is meant to be grouped, but a search of the forums for +sub +summary +parts should yield some examples.

Posted

I could write a script that, after you perform a search, omits any duplicate records. The question is: which duplicate record do you want to keep in the found set? Do you care? Do you want the first one created? The last one created? The following script omits all duplicates except the first created. Let me know if you want something different.

The screen shot shows the script. You sort by the color field to get everything in order and then loop through all the records, omitting duplicates.

[Screenshot: FileMaker_Pro_AdvancedScreenSnapz001.jpg - the sort-and-omit script]
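Roughly, the script does this - a sketch only, assuming a single field named Color holds the value being de-duplicated (the field name and sort setup are illustrative):

Sort Records [ No dialog ]
# sorted by Color so duplicate values sit next to each other
Go to Record/Request/Page [ First ]
Set Variable [ $prev ; Value: Color ]
Loop
    Exit Loop If [ Get ( RecordNumber ) = Get ( FoundCount ) ]
    Go to Record/Request/Page [ Next ]
    If [ Color = $prev ]
        # duplicate of the previous record - drop it from the found set;
        # Omit Record makes the following record current, so step back one
        Omit Record
        Go to Record/Request/Page [ Previous ]
    Else
        Set Variable [ $prev ; Value: Color ]
    End If
End Loop

Whichever record sorts first for each value is the one that survives.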

Posted

Sadly, what Fred (me) has done is run the main table through another database that has a "find distinct" option in its find menu (Lotus Approach - now owned by IBM, but probably an orphan by now).

There are many, MANY instances where a user needs to extract DISTINCT records to create a main table. Mostly this happens when you are starting from imported data containing all records. Most mainframe "pullfiles" will give you multiple instances of one field value (with a new duplicated record each time).

For example, I ask every bank client to download all of their NOTE data in a pullfile, so I have one record for every note. That's great, but most BORROWERS (to be my main table) have multiple NOTES. So, out of 4,000 records based on NOTES, there may be only 3,000 records based on BORROWERS, since one borrower might have 2 or more notes.

So I always have to take the NOTES table, extract all the distinct BORROWER records, and then export the result to another table called BORROWER TABLE. I have linked the BORROWER TABLE to the NOTES TABLE using a one-to-many link, with BORROWER as the linking field. The problem is, when trying to search out and extract only the distinct borrowers for that table, I can't use FileMaker. I have to use the Approach database and export the data to be imported into my FileMaker database. WHAT A PAIN! Especially when all we need is a "find distinct" in FileMaker.

Sheesh, I got carried away!

Posted

Hi there,

One way to find distinct records in FM is to set up a self-join relationship between two Notes TOs matching on Borrower::Borrower, and then create a calc field "cDistinct":

case(Borrower::ID=ID, 1)

A find with 1 entered into cDistinct should pull up only 1 note for each Borrower.

-Raz

Posted (edited)

Hi there,

One way to find distinct records in FM is to set up a self-join relationship between two Notes TOs matching on Borrower::Borrower, and then create a calc field "cDistinct":

case(Borrower::ID=ID, 1)

A find with 1 entered into cDistinct should pull up only 1 note for each Borrower.

-Raz

I don't see how this would work - wouldn't it just return 1 for ALL records?

Say you have a table "Census" and two fields, "Year" and "ID", where each individual has a unique ID but multiple records corresponding to multiple years.

You would be looking up from Census to Census based on ID, so of course ID = Census::ID - the Case will always return 1, and every record will have cDistinct = 1. If you do Year = Census::Year instead, you're buying into a logical fallacy.

There are two easily automatable solutions I've found to this problem, and I don't like either. One is to sort, go through the records, and build an array of the unique values (a sketch follows below). The second is the previously mentioned loop-and-omit. The first option is much faster, but leaves you having to figure out what to do with the data - great if you're adding it to another table; otherwise, not so much.
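Here is the first approach as a rough sketch, again assuming a field named Color (field and variable names are illustrative):

Sort Records [ No dialog ]
# sorted by Color so each new value only needs comparing to the last one kept
Go to Record/Request/Page [ First ]
Set Variable [ $values ; Value: Color ]
Loop
    Go to Record/Request/Page [ Next ; Exit after last ]
    If [ Color ≠ GetValue ( $values ; ValueCount ( $values ) ) ]
        Set Variable [ $values ; Value: $values & ¶ & Color ]
    End If
End Loop
# $values now holds one line per distinct value, ready to be written into
# another table or used as a value list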

There's always ETL through a table where ID is defined to be unique, but that's no fun.

Posted (edited)

It is an old trick (sometimes used with the Min or Max function), but it does come in handy. Try it out. Basically, it compares a record's unique ID to the unique ID of the first record in a relationship (which depends on the relationship's sort order). If the relationship matches Notes to Notes on Borrower::Borrower, then for each unique borrower only a single record will find its own ID as the first record through the relationship.

Works like a charm.
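For anyone following along, here is the whole setup as a sketch. The TO name Notes_self is my own illustrative label for the second table occurrence:

// Table: Notes, with a unique serial field ID and a text field Borrower.
// Self-join: a second table occurrence of Notes, "Notes_self", related by
//     Notes::Borrower = Notes_self::Borrower
// Calculation field cDistinct (number result) defined in Notes:
Case ( Notes_self::ID = ID ; 1 )
// Notes_self::ID resolves to the ID of the first matching record (creation
// order if the relationship is unsorted), so exactly one record per borrower
// gets cDistinct = 1. Enter Find mode, type 1 into cDistinct, and perform
// the find to get one note per borrower.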

Posted

If the records in the original poster's scenario don't have a unique ID, I can define a field for that, but that wasn't given in the OP.

I understand what you're getting at: through the self-join it compares two values (the record you're currently on, and the first record returned from the self-join), and logically this should only be true once in any set returned by a query. If you do have a unique value you can use this comparison; if you don't, you either have to create one or find another solution.

In large tables (and really in general) it's good practice to limit the number of fields, especially when they're not directly needed to manage the data, or part of the data being managed.

All that said, I will probably at some point use the solution (I may have even, in the past).

Posted (edited)

Sorry about that - all of my IDs are unique. It is good to point out that each record needs a unique ID specified for it, although this should probably be standard practice anyway.

I agree with the point about limiting the number of fields, but this still seems much simpler than what poor Fred was going through... and it's still the simplest way I have found to identify unique values. I would love to put it aside in favor of something better, though.

  • 5 years later...
Posted

It's been more than five years since this thread was started. Is there finally a way to do this in FM11 without resorting to adding tables, looping scripts, etc.?

Posted

It's been more than five years since this thread was started. Is there finally a way to do this in FM11 without resorting to adding tables, looping scripts, etc.?

Please do not double-post:

http://fmforums.com/forum/topic/80933-how-can-i-perform-a-find-that-returns-only-unique-records/
