
Posted

OK, I have the following calculation in my file; the result type is Text.

Case (
    Item_30_Day_Performance ≥ Settings::Inventory_Hot ; "HOT" ;
    Item_30_Day_Performance ≥ Settings::Inventory_Average and Item_30_Day_Performance ≤ Settings::Inventory_Average_II ; "Average" ;
    Item_30_Day_Performance ≥ Settings::Inventory_SO_SO and Item_30_Day_Performance ≤ Settings::Inventory_SO_SO_II ; "So So" ;
    Item_30_Day_Performance ≥ Settings::Inventory_DC and Item_30_Day_Performance ≤ Settings::Inventory_DC_II ; "DC" ;
    ""
)

The calculation works just fine while browsing Item records, but when I switch to List view it gets rather slow. The calculation is based on another calculation, so I am guessing that is what makes it slow. I hope there is a way to make it faster, or to write a better calculation that produces the same result. I will describe my setup here.

Two tables are involved in the calculation:

Items table - 966 records

Line_Items table - 20,978 records

The Line_Items table has the following fields relevant to my calculation:

1. Date_Constant - global number field which defines the number of days

2. Date - date the item was sold on

3. Current_Date - Get ( CurrentDate )

4. Date_Calc - Current_Date - Date

5. Performance - Case ( Date_Calc ≤ Date_Constant ; Sum ( Item_Quantity ) ; 0 )

In words:

1. Number of days I want to go back from the current date to check how many items sold. If I want to know how many sold in the last 30 days it would contain 30; for the last 15 days, 15.

2. Date on which the item was sold

3. Current Date

4. Calculates how many days ago the item was sold; the result of the calc is set to a number

5. If the sale falls within the defined 30-day range it sums the quantity; otherwise it just puts 0 into the field.*

The Items table has the following field relevant to my calculation:

1. Item_30_Day_Performance - Sum (Order_Line_Items::Performance)

In words:

1. It simply sums Performance from the Line_Items table

My main calc then uses that sum to determine what range the item falls into, and returns the appropriate label (HOT, DC, etc.) based on the quantity of items sold in the defined range.

I am guessing that having three calculations, each depending on another, is why it is getting so slow. Any idea how to make it faster, or how to write a more efficient calc?

---

* I guess I do not need Sum here; I could just do Case ( Date_Calc ≤ Date_Constant ; Item_Quantity ; 0 ) ?

Sorry for the long post, wanted to explain as best as I could, any questions please feel free to ask.

Thanks.

Posted

I'm not sure why there is a "_II" field for most settings; without it the calculation could be simplified significantly. It might also be possible to take advantage of FMP's ability to short-circuit Case functions, so that evaluation stops at the first true condition. In that case I'd work out whether there are more "hot" products than not, and structure the calc accordingly (see the bottom-up sketch after the calc below).

Getting rid of the "_II" option, the calc would be:


Case (
    Item_30_Day_Performance ≥ Settings::Inventory_Hot ; "HOT" ;
    Item_30_Day_Performance ≥ Settings::Inventory_Average ; "Average" ;
    Item_30_Day_Performance ≥ Settings::Inventory_SO_SO ; "So So" ;
    Item_30_Day_Performance ≥ Settings::Inventory_DC ; "DC" ;
    ""
)
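On the short-circuiting point: with ≥ tests the hottest band has to come first, so if most items actually sit in the lowest band, the common case falls through every line before matching. A sketch of the reversed ordering (assuming "DC"/unclassified items are the majority; same fields as above) would test from the bottom up with <:

Case (
    Item_30_Day_Performance < Settings::Inventory_DC ; "" ;
    Item_30_Day_Performance < Settings::Inventory_SO_SO ; "DC" ;
    Item_30_Day_Performance < Settings::Inventory_Average ; "So So" ;
    Item_30_Day_Performance < Settings::Inventory_Hot ; "Average" ;
    "HOT"
)

That way the most frequent outcome exits on the first comparison instead of the last.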




It might be possible to save a few CPU cycles by using a Let statement to minimise field calls, but this needs to be tested. I'd make one change at a time to see what effect each change has:





Let (
    performance = Item_30_Day_Performance ;
    Case (
        performance ≥ Settings::Inventory_Hot ; "HOT" ;
        performance ≥ Settings::Inventory_Average and performance ≤ Settings::Inventory_Average_II ; "Average" ;
        performance ≥ Settings::Inventory_SO_SO and performance ≤ Settings::Inventory_SO_SO_II ; "So So" ;
        performance ≥ Settings::Inventory_DC and performance ≤ Settings::Inventory_DC_II ; "DC" ;
        ""
    )
)

Posted

Thanks.,

The _II is there to define a quantity range for the calculation.

Inventory_Average would be, for example, 20 and Inventory_Average_II would be 50. So if Performance is 47, Item_30_Day_Performance would return Average, thus letting me know that the item is doing average.

"It might also be possible to take advantage of FMP's ability to short-circuit Case functions, so that evaluation stops at the first true condition."

I did not know that. Does the calculation evaluate the statements in the order in which they appear in the calculation?

Case (
    Item_30_Day_Performance ≥ Settings::Inventory_Hot ; "HOT" ;
    Item_30_Day_Performance ≥ Settings::Inventory_Average ; "Average" ;
    Item_30_Day_Performance ≥ Settings::Inventory_SO_SO ; "So So" ;
    Item_30_Day_Performance ≥ Settings::Inventory_DC ; "DC" ;
    ""
)

If the threshold is 20 for Average and 100 for HOT, would the calc return HOT or Average if Item_30_Day_Performance is 150?

My understanding is that it would return HOT, since that is the first true statement in the calc?

I cannot know whether there will be more hot than average items, as the number depends on how many sales were made in 30 days, and that depends on customers, selling price and so forth. That is why I included the _II fields, to make sure the correct value is returned.

I will try the Let approach and see if that helps.

I did, however, find a way to speed it up by adding one more calculation which simply equals Item_30_Day_Performance. I used that field on the list layout and the whole thing got a bit faster, but not by much.

I was also thinking along the lines of moving my date calculations into the Items table rather than having them in Line_Items, but I am not sure if that will help.

I will test it and post the results.

Posted

Usually ranges are made contiguous, so if the bottom limit for HOT is 100, then something with 99 will be whatever the next Case line returns. In this instance, Average is ≥ 20, so anything that is 99 will be Average.

"I did however find a way to speed it up by adding one more calculation which simply equals Item_30_Day_Performance. I used that field on list layout and the whole thing got a bit faster but not much."

I'm not sure why that would have made a difference. Can you post the modified function?

Posted

Sure, here it is. I kept testing after seeing your reply, and it was probably just in my head, but it did appear faster, which fooled me. I do not know how to benchmark in this case, but the more I think about it, the more it appears to be the same.

The top navigation does not work in the file. I deleted all the information I cannot share, but left the real data relevant to this case, so you will be able to use the file and see what I see. I always test my files with the real data the file is made for.

It should open on the Inventory layout; then just use the layout menu to go to the Inventory_List layout and you will see the speed issue I am referring to. Try scrolling...

One more quick question, if you don't mind, for when you look at the file.

On the Customer layout, on the Items tab, there is a portal displaying what items the customer purchased. How would I get a summary in that portal showing the total of one item the customer purchased across all orders?

Thanks

Inventory.zip

Posted

I agree with Vaughan regarding replacing the repeated calls to Item_30_Day_Performance with a variable. Also, the intermediate calculation fields in the Line_Items table are not required - you can calculate Performance directly from Date, Date_Constant, Get ( CurrentDate ) and Item_Quantity. And Sum ( Item_Quantity ) is exactly the same thing as Item_Quantity.
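For illustration, a minimal sketch of that consolidated Performance field (using the field names described earlier) would be:

Performance =
Case ( Get ( CurrentDate ) - Date ≤ Date_Constant ; Item_Quantity ; 0 )

This drops Current_Date and Date_Calc entirely; the day difference is computed inline.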

It's difficult to judge the impact of any of these without testing. Keep in mind that any unstored calculation is going to be slower in List/Table view: FileMaker calculates "as needed", and it has to evaluate the results for all visible records every time the window is refreshed.
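If the unstored evaluation stays too slow for List view, one workaround (a sketch only; Item_Status_Stored is a hypothetical plain text field) is to write the classification into a stored field periodically via a script, e.g. with a step like:

Replace Field Contents [ Items::Item_Status_Stored ;
    Replace with calculated result:
    Case (
        Item_30_Day_Performance ≥ Settings::Inventory_Hot ; "HOT" ;
        Item_30_Day_Performance ≥ Settings::Inventory_Average ; "Average" ;
        Item_30_Day_Performance ≥ Settings::Inventory_SO_SO ; "So So" ;
        "DC"
    ) ]

A stored field displays instantly in List view, at the cost of only being as fresh as the last time the script ran.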

Posted

"I have been testing it further after seeing your reply and it probably was just my head but it did appear faster which got me."

The *only* reliable way to test performance is with a stopwatch. Most mobile phones have a timer function; it doesn't need 1,000th-of-a-second resolution.

Don't try to estimate elapsed time; it's too subjective. Write the timed results down, and repeat the tests a couple of times, because FMP caches data. It can be worthwhile quitting FMP and re-opening the file to clear the caches, just to see what first-time performance is like.
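If you'd rather time it inside FileMaker, a rough sketch of a timer script (assuming a version that supports Get ( CurrentTimeUTCMilliseconds ); $start and $elapsed are hypothetical variable names) might be:

Set Variable [ $start ; Value: Get ( CurrentTimeUTCMilliseconds ) ]
Go to Layout [ "Inventory_List" ]
Refresh Window [ Flush cached join results ]
Set Variable [ $elapsed ; Value: Get ( CurrentTimeUTCMilliseconds ) - $start ]
Show Custom Dialog [ "Timing" ; $elapsed & " ms" ]

The Refresh Window step forces the visible records to re-evaluate, which is the part being measured.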

Posted

Thanks, I tested it out and they appear the same. I took the following

Let (
    performance = Item_30_Day_Performance ;
    Case (
        performance ≥ Settings::Inventory_Hot ; "HOT" ;
        performance ≥ Settings::Inventory_Average and performance ≤ Settings::Inventory_Average_II ; "Average" ;
        performance ≥ Settings::Inventory_SO_SO and performance ≤ Settings::Inventory_SO_SO_II ; "So So" ;
        performance ≥ Settings::Inventory_DC and performance ≤ Settings::Inventory_DC_II ; "DC" ;
        ""
    )
)

and added it as a custom function. I then made a new calculation field and set it equal to the custom function. This sped things up quite a bit; the speed gain is adequate (workable).

I am not sure if that is the fix you suggested (putting the Let into a custom function instead of directly into a calculation), but it worked nicely.

Thanks.

Posted

Thanks, I figured as much for Sum ( Item_Quantity ). Thanks for the tip about the dates. I implemented the changes and all appears to be fine now.

Posted

I believe this should be quite sufficient:


Let (
    performance = Item_30_Day_Performance ;
    Case (
        performance ≥ Settings::Inventory_Hot ; "HOT" ;
        performance ≥ Settings::Inventory_Average ; "Average" ;
        performance ≥ Settings::Inventory_SO_SO ; "So So" ;
        "DC"
    )
)

As Vaughan already explained, it's not necessary to test - or even keep - the upper limits, assuming your ranges are contiguous.

I am not sure using a custom function helps in speeding things up.

Another thing worth testing might be loading the range boundaries into global $$variables, to avoid the repetitive calls to related fields.
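For example, a sketch of that approach (the $$ names and the startup script are hypothetical):

# In a script run at file open:
Set Variable [ $$inventory_hot ; Value: Settings::Inventory_Hot ]
Set Variable [ $$inventory_average ; Value: Settings::Inventory_Average ]
Set Variable [ $$inventory_so_so ; Value: Settings::Inventory_SO_SO ]

and the field calculation becomes:

Case (
    Item_30_Day_Performance ≥ $$inventory_hot ; "HOT" ;
    Item_30_Day_Performance ≥ $$inventory_average ; "Average" ;
    Item_30_Day_Performance ≥ $$inventory_so_so ; "So So" ;
    "DC"
)

One caveat: the calculation will not notice when the $$variables change until the script re-runs and the window is refreshed, so the settings would effectively be cached per session.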
