
Recommended Posts

Posted

For some analyses, I have a script that works fine for determining the number of records meeting various constraints, but the finds seem to slow things down.

For example, out of ~3,500 total records, I can narrow the found set to the ~40 records that correspond to one parameter (determined through some related tables), and then to the ~20 of those that meet a second criterion, e.g., a field called due priority contains "Overdue".

The script uses combinations of Perform Find, Extend Found Set, and Constrain Found Set to count the various subsets. Am I just being lazy?
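Roughly, the per-subset pattern looks like this (a simplified sketch; the table and field names are stand-ins for my real ones, and the first find actually goes through related tables):

# Narrow to the ~40 records matching the first parameter
Enter Find Mode [ Pause: Off ]
Set Field [ Tasks::Parameter ; $parameterValue ]
Perform Find [ ]
Set Variable [ $countParameter ; Value: Get ( FoundCount ) ]

# Constrain to the ~20 of those that are overdue
Enter Find Mode [ Pause: Off ]
Set Field [ Tasks::DuePriority ; "Overdue" ]
Constrain Found Set [ ]
Set Variable [ $countOverdue ; Value: Get ( FoundCount ) ]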

 

• Should I just add a flag calculation to each record and then sum up the new field with an aggregate or summary kind of function to determine the counts for the subsets (roughly as I sketch below this list)? It seems that might be faster, since each count would be restricted to the initial found set of ~40 records.

• Or is there another recommended philosophy for this?
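For the calculation idea, I'm picturing something like this (field names are placeholders for whatever I end up using):

// Calculation field, number result
OverdueFlag = If ( due priority = "Overdue" ; 1 ; 0 )

// Summary field defined as Total of the flag
sOverdueCount = Total of OverdueFlag

With the initial ~40-record found set in place, sOverdueCount would give the ~20 overdue records without another find.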

 

Thanks,

Bruce

Posted

I usually create a flag field calculation that contains the logic I need for the "state" of a record, if possible. I use this flag in counts and finds. If I often need to find on it and the calc is unstored, I create a stored version of the calc and set it in a transactional script.
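For example, a rough sketch with placeholder names:

// Unstored calc (references the current date, so it can't be stored)
DueState = Case ( due date < Get ( CurrentDate ) ; "Overdue" ; "Current" )

Then the script copies that into a plain, indexed field (here DueStateStored) so finds on it stay fast, e.g. with Replace Field Contents over the relevant found set, or a loop of Set Field steps inside the transaction:

Replace Field Contents [ Tasks::DueStateStored ; Replace with calculation: Tasks::DueState ]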

Posted

Right, I'll create the calculation fields and then determine the counts for each state, avoiding the excess finds I was heading towards. Thank you.

