
This topic is 4821 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

I am trying to create a single calculation field that combines the values of these 5 text fields with a custom delimiter (comma, bullet, etc.), but removes the keywords that are identical across the different tag categories.

For instance:

Adventure, Adventure, Horror, Horror > Adventure, Horror

I have created a music track tag database that allows me to pull genre and mood descriptions from a related table and apply to my records via check box.

I have the tags (keywords) arranged into different types of track tags: Arrangement (instruments), Industry (industry-specific terms), Genre, Application (Music For..) and Descriptive. Some keywords appear in multiple categories, such as Horror, which is both an Application tag and a Descriptive tag. For my purposes the separation is useful, but when exporting to a CSV file it is unnecessary and confusing.

Thanks!

d

Posted (edited)

Do you have a list (or lists) of keywords anywhere?

---

P.S. Please update your profile to reflect your version and OS.

Edited by comment
Posted

I have a related table with all the keywords. This is what I use (along with a self-join) to populate my checkbox list. I sense a FilterValues coming into play?

p.s. I have updated my profile info.

Posted

I sense a FilterValues coming into play?

Indeed, if you have a list of all possible values, you can de-dupe your list using FilterValues(), e.g.

Let ( [
bigList = List ( AllValues::Value ) ;
smallList = List ( Arrangement ; Industry ; Genre ; Application ; Descriptive )
] ;
Substitute ( FilterValues ( bigList ; smallList ) ; ¶ ; ", " )
)

Another option is to use a custom function or - with only 5 values - just check each field explicitly against a list of previous fields.
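For illustration only (this is Python, not FileMaker calculation syntax), the "check each field against the previous fields" alternative amounts to keeping each field's value only when no earlier field already supplied it:

```python
# Illustrative sketch of the explicit-check approach; the arguments are
# hypothetical stand-ins for the five FileMaker tag fields.
def dedupe_fields(*fields):
    seen = []
    for value in fields:
        # skip empty fields and values an earlier field already supplied
        if value and value not in seen:
            seen.append(value)
    return ", ".join(seen)

print(dedupe_fields("Adventure", "Adventure", "Horror", "Horror"))
# Adventure, Horror
```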

Posted

OMG!

That was incredible! Works perfectly!

So tell me if my assumption is correct: you filter the smallList (containing duplicates) against the bigList (without duplicates)? Kind of ironic, since it just reverses a portion of the work you did in the first place! Very elegant!

Thanks again.

d

Posted

It's actually the other way round: you filter the bigList, leaving only values contained in smallList. Since the bigList contains only unique values, it doesn't matter if smallList has duplicates. Another effect of this is that the resulting values are always sorted in the order they appear in bigList.
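A rough Python model of that behavior (illustrative only, with `\n` playing the role of FileMaker's ¶): the result keeps only the bigList values that occur in smallList, in bigList order, so duplicates in smallList disappear automatically.

```python
# Rough model of FileMaker's FilterValues ( textToFilter ; filterValues ):
# keep the values of big_list that also occur in small_list, preserving
# big_list's order. Since big_list holds each keyword once, duplicates
# in small_list vanish.
def filter_values(big_list, small_list):
    wanted = set(small_list.split("\n"))
    return [v for v in big_list.split("\n") if v in wanted]

big = "Adventure\nDrama\nHorror"        # unique master list
small = "Horror\nAdventure\nAdventure"  # selections, with duplicates
print(", ".join(filter_values(big, small)))
# Adventure, Horror
```

Note the output follows the master list's order ("Adventure" before "Horror"), not the order of selection.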

Posted

Sure. I should have thought about that. Change the last line to :

Substitute ( FilterValues ( bigList ; smallList ) & ¶ ; [ "¶¶" ; "" ] ; [ ¶ ; ", " ] )
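To see why this strips the trailing delimiter, here is a sketch in Python (`\n` standing in for ¶): FilterValues return-terminates every value, so appending one more return creates a double return at the end, which can be deleted before the remaining returns become commas.

```python
# FilterValues returns every value terminated by ¶, e.g. "Adventure¶Horror¶",
# so replacing ¶ with ", " directly would leave a trailing ", ".
# Appending one more ¶ and deleting the resulting "¶¶" strips the terminator.
result = "Adventure\nHorror\n"  # what FilterValues returns
fixed = (result + "\n").replace("\n\n", "").replace("\n", ", ")
print(fixed)
# Adventure, Horror
```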

  • 2 months later...
Posted

Alright, so I have this working really well on my database.

Let ( [
bigList = List ( AllValues::Value ) ;
smallList = List ( Arrangement ; Industry ; Genre ; Application ; Descriptive )
] ;
Substitute ( FilterValues ( bigList ; smallList ) & ¶ ; [ "¶¶" ; "" ] ; [ ¶ ; ", " ] )
)


 

However, I would like the values from Genre to be placed first. I have tried simply prepending Genre:

Genre & Substitute ( FilterValues ( bigList ; smallList ) & ¶ ; [ "¶¶" ; "" ] ; [ ¶ ; ", " ] )

This doesn't work because it doesn't filter out the duplicate values. Any ideas?

Posted

FilterValues ( bigList ; smallList ) returns values in the order of bigList. If you sort the relationship YourTable x AllValues, the result will reflect that order.
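An illustrative Python sketch (list-based, names hypothetical) of why re-sorting the master list is enough to move the Genre tags to the front:

```python
# Because the result follows the master (big) list's order, sorting the
# master list with Genre keywords first reorders the output without
# changing the de-dupe logic at all.
def filter_values(big_list, small_list):
    wanted = set(small_list)
    return [v for v in big_list if v in wanted]

master = ["Rock", "Jazz", "Dark", "Horror"]    # Genre tags sorted first
selected = ["Horror", "Dark", "Jazz", "Jazz"]  # checkbox selections
print(", ".join(filter_values(master, selected)))
# Jazz, Dark, Horror
```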

Posted

Thanks for the tip. I am sure that this is going to come around to bite me, but I went and added a sort field in the bigList table, giving priority to my new Genre tags.

Thank you!

