
This topic is 5397 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

Hi all!

I'm not sure I'm going to be able to clearly explain what I'm looking for here, but I'm going to give it a try...

I'm looking to replicate the power of using Finds to filter a list of records in a portal. I frequently use a global field in the header of a layout that lists records in a List View Body part, letting my users enter some text, including partial words, to filter that list of records. As they type in that global field, I simply run a find using exactly what they have typed. Very simple. Very powerful.

Even with a partial word typed, FileMaker's search engine will find records that have words in the field being searched that start with the partially typed word. For example,

If you type "Log" into the search field in a find request, FileMaker will return records that have the following text in the search field:

"Login please"

"Don't Logout"

"This is a log"

This is great because it finds records containing words that begin with "Log", no matter where those words fall in the text.
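For readers unfamiliar with FileMaker's default find, it matches on word beginnings rather than arbitrary substrings. Here's a rough Python stand-in (the sample records are the ones above) that mimics that behavior:

```python
# Rough simulation of FileMaker's default find: a query term matches
# a record if any word in the searched field BEGINS with the term
# (case-insensitive). It does not match mid-word substrings.
def fm_find(query, records):
    q = query.lower()
    return [r for r in records
            if any(word.lower().startswith(q) for word in r.split())]

records = ["Login please", "Don't Logout", "This is a log"]
print(fm_find("Log", records))  # all three records match
print(fm_find("ogin", records))  # [] -- "ogin" starts no word
```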

I'm looking to replicate that behavior in a portal filter. In other words, rather than using a layout with a List View Body part and the filter field in the header, I want to place a portal on a layout with the filter field above it. I have successfully replicated the behavior, but it requires a key field, in the table whose records are shown in the portal, that stores every permutation of the words in the searched field.

The problem is that the more words in the searched field, the more permutations, the bigger the index, and the longer it takes to create the permutations and index. This all results in a very long time to import records into that table, and a huge file size for not much data.
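To put a number on that growth: if the key field stores every ordering of the words, the count grows factorially with the word count. A quick sketch:

```python
import math

# Number of word-permutation key values for an n-word field is n!.
# Each permutation is roughly as long as the original field, and
# every one of them must be stored and indexed.
for n in range(1, 9):
    print(n, "words ->", math.factorial(n), "key values")
# An 8-word field already needs 40,320 key values.
```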

Am I missing something? Is there a more efficient way to replicate this behavior in a portal filter?

Any help would be appreciated. Let me know if I need to clarify anything.

Thanks!

Posted (edited)

I haven't tried this, but maybe a hybrid technique would be an acceptable compromise solution for your situation.

Use:

- A global field for the filter string (gFilterString)
- A global key field for one side of the relationship match (gMatchKey)
- A portal using a relationship on the match key

The table of records being filtered has:

- A text key field for the other side of the relationship match (MatchKey)

Then:

1. The user types into gFilterString to filter.
2. The user pushes a button, or a script trigger fires.
3. Assign gMatchKey a unique identifier never used before by this script (you could simply increment the previous gMatchKey value).
4. The script performs a find on the target table using gFilterString.
5. In the found set, Replace Field Contents on MatchKey so that it holds the value of gMatchKey.
6. View the results in the portal.

The inefficiency in this technique comes in with the requirement to have a Replace Field Contents on the found records. If the number of records you are displaying in the portal is modest, this may be efficient enough for your purposes.
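I haven't built this in FileMaker, but the flow above can be sketched with an in-memory Python stand-in (the field and table names just mirror the post; the "relationship" is the final comparison):

```python
# Hypothetical stand-in for the hybrid technique: find on
# gFilterString, stamp the found set's MatchKey with a fresh
# gMatchKey (the "Replace Field Contents" step), then let the
# relationship MatchKey = gMatchKey drive the portal.
records = [
    {"id": 1, "text": "Login please", "MatchKey": None},
    {"id": 2, "text": "Don't Logout", "MatchKey": None},
    {"id": 3, "text": "Unrelated note", "MatchKey": None},
]
gMatchKey = 0

def filter_portal(gFilterString):
    global gMatchKey
    gMatchKey += 1                       # unique key per filter run
    q = gFilterString.lower()
    for r in records:                    # "Replace Field Contents"
        if any(w.lower().startswith(q) for w in r["text"].split()):
            r["MatchKey"] = gMatchKey
    # the portal relationship: MatchKey = gMatchKey
    return [r["text"] for r in records if r["MatchKey"] == gMatchKey]

print(filter_portal("Log"))  # ['Login please', "Don't Logout"]
```

The loop over every found record is the Replace Field Contents cost the paragraph above describes; it runs once per filter action.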

Edited by Guest
Posted

The inefficiency in this technique comes in with the requirement to have a Replace Field Contents on the found records

In a hosted system this will not work, as you might have multiple users filtering at the same time: they would overwrite each other's MatchKey values.

Posted

I would continue making searches fired by an event trigger, but then build the portal via the tail-recursion CF used here:

http://www.databasepros.com/FMPro?-DB=resources.fp5&-lay=cgi&-format=list.html&-FIND=+&resource_id=DBPros000663

Take a look at the attachment!

--sd

globalSea.zip

Posted (edited)

Søren has the better solution in most cases.

Gathering up the IDs of the found set and assigning them to a global key is relatively fast and should cause no problems in a multi-user situation.

It is limited by the number of records in the found set, though. If I remember correctly, around 50,000 records will max out the recursion.

On slower machines you may see a delay, with no progress bar, while the recursive calculation runs.

Alternate scripted methods of gathering the IDs without a recursive custom function are available if that is an issue, so you don't have to throw out the baby with the bath water.
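For illustration, here is the idea in Python terms: build a return-delimited multikey from the found-set IDs, either with a tail-style recursive function (mirroring the custom function, and subject to a recursion limit, like FileMaker's) or with a plain loop (the scripted alternative, with no such limit). Names and sample IDs are made up:

```python
# Build a return-delimited multikey from found-set IDs.
# FileMaker multikeys separate values with carriage returns.
def gather_ids_recursive(ids, acc=""):
    # Mirrors a recursive custom function; deep found sets will
    # eventually hit the recursion limit.
    if not ids:
        return acc
    head, rest = ids[0], ids[1:]
    return gather_ids_recursive(rest, acc + ("\r" if acc else "") + str(head))

def gather_ids_loop(ids):
    # The scripted/loop alternative: no recursion limit.
    return "\r".join(str(i) for i in ids)

found_set = [101, 205, 340]
print(repr(gather_ids_loop(found_set)))  # '101\r205\r340'
```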

Edited by Guest

