
Posted

Hi

My question is: how can you produce a list of values without any duplicates?

I've tried using the ! (exclamation mark) find symbol with the 'omit' tickbox selected, but that omits every occurrence of a duplicated value.

Say four company names were the same: you would not get any of them.

I want to generate a list of company names from our database but with each one appearing only once.

Is there an easy way to do this?

I've never been able to work out the search syntax for that.

I realise I could set them up as a value list based on the company name field but I'd like to know if it's possible this way.

Also, a value list sorts the company names in alphabetical order, and I want the order based on another field in the database.

One of the reasons is that I want to be able to send this find request to the web companion to return a list of all company names, but without companies appearing more than once.

Any ideas?

Thanks

Ben

Posted

Duh. Sure this helps!

Sorry, I don't have time myself right now, as I'm leaving the office, but others surely will.

Is it for a report, or just a list you want to use elsewhere?

Posted

Ideally, you would have a separate Company file that contains one record for each company. Less ideally, you can create a self-relationship based on the company name and a calculation of Serial = selfrelationship::Serial, then perform a find for 1 in that field. Note that FileMaker indexes only the first 20 characters of each word and matches on at most 60 characters of the field, so long or similar company names may be wrongly treated as duplicates and omitted.
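A minimal sketch of that setup, assuming the file has a unique Serial field and a Company field (the relationship name here is made up):

    Self-relationship "SameCompany", from the file to itself:
        Company = SameCompany::Company

    Calculation field IsFirst (Number result):
        Serial = SameCompany::Serial

Entering Find mode, typing 1 into IsFirst, and performing the find should then return exactly one record per company name.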

2 weeks later...
Posted

Less ideally, you can create a self-relationship based on the company name and a calculation of Serial = selfrelationship::Serial, then perform a find for 1 in that field. Note that FileMaker indexes only the first 20 characters of each word and matches on at most 60 characters of the field, so long or similar company names may be wrongly treated as duplicates and omitted.

Could someone elaborate on this a little bit?

I have a database of client projects, where each project has the complete client name, address, and "attention" field, which creates a lot of duplicate information. (Not the ideal way to set up a DB, but this was a pre-existing setup.) I'm trying to create a layout where I could view each "attention" only once (sorted by client). From this layout, I'd then select certain records using a checkbox, and print mailing labels for those selected.

The above solution seems like what I'm looking for, but I'm not entirely clear on the terminology (i.e., selfrelationship, serial) so if someone could break it down into simpler terms, I'd really appreciate it.

~Courtney

Posted

Serial would be the unique identifier for each record in your file. A self-relationship means a relationship from a file to itself; in this case, from the AttentionName field to itself.

Once the relationship is in place, create a calculation field with a number result, equal to Serial = selfrelationship::Serial. This field will return a 1 only for records where the Serial of the current record equals the first related Serial in the file. Since the self-relationship is based on AttentionName, this means the current record is the first instance of that name in the file; any duplicate records for that particular name will return zero in the calculation.

So you can perform a find for 1 in that field (or script a find for 1) and it will return only the records it deems to be unique, subject to the indexing limitations described above.
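As a sketch, with illustrative names (Serial, AttentionName, and FirstAttn are assumptions about your file, not anything predefined):

    Self-relationship "FirstAttn", from the file to itself:
        AttentionName = FirstAttn::AttentionName

    Calculation field UniqueFlag (Number result):
        Serial = FirstAttn::Serial

    Find request:  UniqueFlag = 1

Each group of records sharing an AttentionName should flag exactly one record: the one whose Serial matches the first related record's Serial.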

I would attach an example, but I no longer have a copy of version 4, sorry. Feel free to ask if you need any more help.

Posted

Well, now I've run into a quandary. Several of the "Attention" fields are "Accounts Payable", so of course those got omitted as duplicate entries.

I tried creating a calculation field equal to Attention & Client Name and changing my self-relationship to check that calculated field for uniqueness instead of just the Attention field alone, but it weeded my list from 350 down to 15, and I couldn't see what the remaining 15 had in common, or how they differed from the rest, that would make it do that.

Any thoughts on something I might be overlooking?

~Courtney

Posted

Try Attention & "_" & Client Name. FileMaker only indexes the first 20 characters of each word, and without a separator the two fields run together into one long "word", so many different records end up with identical index entries. A space or underscore word separator ensures that you aren't cramming too many characters together.

And just to clarify, the Serial field should be the one you are testing in the calculation. The Attention & "_" & Client Name field should be the key field for your self-relationship.
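Putting those two pieces together, a sketch with the same assumed names (note the Text result type on the key field; that detail turns out to matter later in this thread):

    Calculation field UniqueKey (Text result):
        Attention & "_" & Client Name

    Self-relationship "SameKey", from the file to itself:
        UniqueKey = SameKey::UniqueKey

    Calculation field UniqueFlag (Number result):
        Serial = SameKey::Serial

Then find for 1 in UniqueFlag to see one record per Attention/Client combination.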

Posted

I tried it with the underscore and without, and also tried switching the order of the fields. I ended up using the "City State Zip" field, and that seems to have worked for the most part, though it's still excluding a couple of records for no obvious reason.

I will keep fiddling with it, but this gets me through the first of the month mailing crunch.

Thanks for your help!

~Courtney

Posted

If two or more City State Zip values match up to the 60th character, then only one will show as unique and the others will look like duplicates, even if they really aren't. The same goes for the Attention_Client Name calc. You could try shortening the Client Name, so that even when the full calc would exceed 60 characters, you at least have a way of keeping the keys semi-unique. Something like Attention & "_" & Left( Client Name, 10 ) & "_" & Right( Client Name, 10 ) should help, assuming your Attention doesn't exceed 38 characters.
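The arithmetic behind that 38: the two underscores and the two 10-character Client Name fragments add 1 + 10 + 1 + 10 = 22 characters, so an Attention of up to 38 characters keeps the whole key within the 60-character match limit. The key field would become (same assumed names as before):

    Calculation field UniqueKey (Text result):
        Attention & "_" & Left( Client Name, 10 ) & "_" & Right( Client Name, 10 )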

Posted

Finally figured out my problem - the Calculation Result for my key field was set as a number instead of text.
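In other words, the key calculation needs a Text result so it is indexed as text; with a Number result, presumably only the numeric value of the string gets indexed, so the self-relationship never matched the way it should. The fix is just in the field definition (expression as above):

    Calculation field UniqueKey, result type: Text
        (same key expression as before)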

Thanks for all your help.

~Courtney
