January 8, 2013

For some analyses, I have a script that works fine for determining the number of records meeting various constraints, but the finds seem to slow things down. For example, out of ~3500 total records, I can narrow the found set to the ~40 records that correspond to one parameter (determined through some related tables), and then to the ~20 of those that meet a second criterion, e.g., a field called Due Priority contains "Overdue". The script uses combinations of finds, extend finds, and constrain finds to count the various subsets. Am I just being lazy?
• Should I instead add a calculation field to each record and then sum that field to determine the counts for the subsets with an aggregate or summary function? That seems as if it might be faster, since each count would be restricted to the initial found set of ~40 records.
• Or is there another recommended philosophy for this?
Thanks, Bruce
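For reference, here is a minimal sketch of the find-based counting described above, written as FileMaker script steps; the table name (Records) and field names (Parameter, DuePriority) are placeholders, not the actual schema:

Enter Find Mode [ Pause: Off ]
Set Field [ Records::Parameter ; "target value" ]
Perform Find []
# Found set is now the ~40 records matching the first criterion
Set Variable [ $firstCount ; Value: Get ( FoundCount ) ]
Enter Find Mode [ Pause: Off ]
Set Field [ Records::DuePriority ; "Overdue" ]
Constrain Found Set []
# Found set is now the ~20 records that also meet the second criterion
Set Variable [ $overdueCount ; Value: Get ( FoundCount ) ]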
January 8, 2013
Solution

I usually create a flag calculation field that contains the logic I need for the "state" of a record, if possible. I use this flag in counts and finds. If I often need a find and the calc is unstored, I create a stored version of it and set it in a transactional script.
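A minimal sketch of that approach, assuming a table called Records and a field named DuePriority (both hypothetical names):

// OverdueFlag: calculation field, Number result
// (unstored if DuePriority itself depends on related or unstored data)
If ( DuePriority = "Overdue" ; 1 ; 0 )

A summary field defined as "Total of OverdueFlag" then reports the number of flagged records in whatever found set is currently active, so a single find for the ~40 records yields the subset count without further finds. To create the stored copy for fast finds, a script can push the value into a regular number field across the found set, for example:

Replace Field Contents [ With dialog: Off ; Records::OverdueFlagStored ; Records::OverdueFlag ]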
January 8, 2013
Author

Right, I'll create the calculation fields and then determine the numbers in each state and avoid the excess finds that I was heading towards. Thank you.