Posted

How does one determine the number of unique values in a field in a found set?

That's the question! I found a recipe in FM Help for finding duplicate values via a self-join, which might be extendible to solve this, but I wondered if there was a simple, direct solution. Odd that I can't find this question in the archives.

Gratefully,

Chap

Posted

Thanks, Queue -

I may be mistaken, but the solutions discussed in that thread all seem to generate the distinct values in the entire database. What I need is the distinct values of a field IN A FOUND SET.

Techniques involving Insert from Index, or (SerialNo = SelfJoinRel::SerialNo), seem to be valid only for the entire database.

If I'm missing a subtlety, please enlighten me! :-)

Thanks,

Chap

Posted

If you want to work with a found set, the quickest way is to export the records to another file, grouping the export by your field. This technique works like DISTINCT in SQL. The SerialNo = SelfJoinRel::SerialNo approach was slow when I tried it.
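
For later readers, the export approach amounts to this (a sketch of the steps; the output file name is a placeholder):

  1. Sort the found set by the field whose distinct values you want to count.
  2. Choose File > Export Records and export to a new file, e.g. "distinct.fp7".
  3. In the export field-order dialog, turn on the "Group by" option for the sorted field.
  4. The new file gets one record per distinct value, so its record count is the answer.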

Posted

No, you are correct. Only summary fields operate on the found set, but a summary field will not give you a count of distinct values. I believe you need to script this.
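
A minimal script sketch along those lines, assuming two hypothetical global fields, gPrev (text) and gUniqueCount (number), and writing TargetField for the field being counted:

  Sort Records [ by TargetField ]
  Go to Record/Request/Page [ First ]
  Set Field [ gPrev; "" ]
  Set Field [ gUniqueCount; 0 ]
  Loop
    If [ TargetField <> gPrev ]
      Set Field [ gUniqueCount; gUniqueCount + 1 ]
      Set Field [ gPrev; TargetField ]
    End If
    Go to Record/Request/Page [ Next; Exit after last ]
  End Loop

When the loop exits, gUniqueCount holds the number of distinct TargetField values in the found set.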

Posted

I do use it: with this technique I get a found set of records with unique values in the product_Id field from the lineitem file. You must choose "Group by" when exporting.

Posted

I hoped to find a general way to implement DISTINCT, but perhaps it's not worth it at this point. Here's the specific problem I'm trying to solve:

It's an Orders/LineItems database. I've done a Find against LineItems to get a found set of all purchases of a certain type of product. The question is: how many Orders does this represent?

I was trying to count the distinct values for the foreign key Order_ID. Perhaps, for this particular problem, there is a different approach I should consider?

Posted

Gee. I might not have thought of that!

Thanks. It's a bit odd, and computing the total of those GetSummary()-based fractions is slow, but it works.

For future reference: the post by Ender points to a technique for reporting a COUNT of DISTINCT values of a field in a FOUND SET.
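
For later readers, the fraction technique described above amounts to this (a sketch; the field names are placeholders):

  sCountByOrder = summary field: Count of Order_ID
  cFraction     = unstored calculation (number result):
                    1 / GetSummary( sCountByOrder ; Order_ID )
  sUniqueCount  = summary field: Total of cFraction

Sort the found set by Order_ID. Each group of n identical Order_IDs then contributes n * (1/n) = 1 to the total, so sUniqueCount reports the number of distinct Order_IDs in the found set.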
