
Hi Guys, 

So I am CLOSE to my near-final question before it comes to reports... yay!

This is possibly opening a can of worms, but I want to create a Date Calculation. 

The table will look something as follows: 

Type     Buy/Sell   Date
Banana    7         10 April 2020
Apple    10          5 April 2020
Apple    -3          9 April 2020
Banana    4         17 April 2020
Apple     9         14 April 2020
Banana   -4         20 April 2020
Apple    16         30 April 2020

I want to run a calculation to say "How many Apples did I have on Date X?". The date will come from a TimeStamp on another table.

The correct answer is of course 32.   

 The same formula for bananas would be 7. 
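The logic being asked for is a filtered running sum: add up every buy/sell quantity for one type whose date falls on or before the cut-off. A minimal Python sketch of that arithmetic, using hypothetical data copied from the table above:

```python
from datetime import date

# Hypothetical transactions mirroring the table above: (type, quantity, date)
transactions = [
    ("Banana", 7,  date(2020, 4, 10)),
    ("Apple", 10,  date(2020, 4, 5)),
    ("Apple", -3,  date(2020, 4, 9)),
    ("Banana", 4,  date(2020, 4, 17)),
    ("Apple",  9,  date(2020, 4, 14)),
    ("Banana", -4, date(2020, 4, 20)),
    ("Apple", 16,  date(2020, 4, 30)),
]

def holdings(item_type, as_of):
    """Sum all buy/sell quantities for one type, up to and including as_of."""
    return sum(q for t, q, d in transactions if t == item_type and d <= as_of)

print(holdings("Apple", date(2020, 4, 30)))   # 32
print(holdings("Banana", date(2020, 4, 30)))  # 7
```

Whatever mechanism ends up doing the work in FileMaker (relationship, ExecuteSQL, or a script), it is this same "filter by type and date, then sum" operation.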

So really my question is: what would I need to do to make a calculation like this?

Once this is mastered, I think/hope to have no more questions other than on Reporting!







4 minutes ago, Neil Scrivener said:

This is possibly opening a can of worms


It would be easy to define a relationship from a table of Types to an occurrence of the table shown (let's call it Transactions) as:

Types::Type = Transactions 2::Type
Types::gCutOffDate ≥ Transactions 2::Date

and then just sum the related transaction quantities.

It would be even easier to find the transactions that occurred before the cut-off date, sort them by Type, and show them in a summary report.

However, as time goes by, the number of transactions will increase and these methods may become too slow. And that's a can of worms I don't feel like opening right now.



In a script you can do an ExecuteSQL() to find all the records where type = banana, using the date in a WHERE clause and SELECTing only the buy/sell field. That will give you a list of values that you can loop through to get the final number.

If you want that for all products then it's a loop within a loop by first asking for all DISTINCT types in your time frame and then asking for their values in turn.
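The loop-within-a-loop approach can be sketched against SQLite in Python (hypothetical table and field names; FileMaker's ExecuteSQL() takes a similar SELECT statement plus bound parameters):

```python
import sqlite3

# In-memory stand-in for the FileMaker table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE transactions (type TEXT, qty INTEGER, tx_date TEXT)")
con.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("Banana", 7, "2020-04-10"), ("Apple", 10, "2020-04-05"),
     ("Apple", -3, "2020-04-09"), ("Banana", 4, "2020-04-17"),
     ("Apple", 9, "2020-04-14"), ("Banana", -4, "2020-04-20"),
     ("Apple", 16, "2020-04-30")],
)

# Outer query: every DISTINCT type seen in the time frame.
types = [t for (t,) in con.execute(
    "SELECT DISTINCT type FROM transactions WHERE tx_date <= ?",
    ("2020-04-30",))]

# Inner query per type: fetch the raw buy/sell values, then total them
# in the host language (the post-processing step described above).
totals = {}
for t in types:
    values = [q for (q,) in con.execute(
        "SELECT qty FROM transactions WHERE type = ? AND tx_date <= ?",
        (t, "2020-04-30"))]
    totals[t] = sum(values)
```

After the loop, `totals` holds {'Apple': 32, 'Banana': 7} for the sample data.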

You can display the results in a virtual list table or a JavaScript DataTable.

If you want to have this report available quickly I would suggest you run these kinds of aggregations on a schedule and store the results in their own reporting tables.  This is typically not the kind of stuff you want to have collected at the moment you run the report.

1 hour ago, Wim Decorte said:

That will give you a list of values that you can loop through to get the final number.

If you're going to use SQL (which I would be reluctant to do), why wouldn't you use it to also aggregate the values??
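Doing the aggregation inside the query itself collapses the whole loop into one statement. The same SQLite stand-in (again with hypothetical table and field names):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE transactions (type TEXT, qty INTEGER, tx_date TEXT)")
con.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("Banana", 7, "2020-04-10"), ("Apple", 10, "2020-04-05"),
     ("Apple", -3, "2020-04-09"), ("Banana", 4, "2020-04-17"),
     ("Apple", 9, "2020-04-14"), ("Banana", -4, "2020-04-20"),
     ("Apple", 16, "2020-04-30")],
)

# One round trip: SUM + GROUP BY replaces the distinct-types query
# and the per-type loop entirely.
rows = con.execute(
    "SELECT type, SUM(qty) FROM transactions WHERE tx_date <= ? GROUP BY type",
    ("2020-04-30",),
).fetchall()
totals = dict(rows)  # totals == {'Apple': 32, 'Banana': 7}
```

Whether the single aggregated query or the query-then-loop version is faster in FileMaker is, as noted below, something to test against real data.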



It's one of the things to test. eSQL() is blazingly fast when used under the right circumstances. Using complex joins or SQL functions is usually where it starts to slow down, so I tend to prefer to do the basic queries with it and then post-process the data in FM. But not always; sometimes using SQL functions inside the query is just fine. Much depends on the amount of data and the underlying FM schema.



Thanks for the info guys. 

If I was comparing for the month of April, would I not need to have a 'start date' too?


I've put a quick demo file of what I'm looking to achieve. 

So if you look on the Companies layout, I'm looking for the Owned field in the Valuation portal to be a calculation at that snapshot in time.

So where a Valuation is on 19 May 2020, I want it to tell me, at that point, how many items I 'own' based on the records under the 'Previous Data of Transactions' field, if that makes sense.

Demo File+.fmp12


What I'm in essence trying to do is work out a snapshot of how much the items were worth per month, which will tell me if it has lost or made money.

So the eventual calculation will tell me: 

- How many units did I own in that month?

- What was the total cost of those units in that particular month?

- What is the value of those units according to the particular month in the Valuation portal?

- What, for that month, is the outcome, both £ and % wise?

Hopefully that explains better. 



We get the business use case of the questions; they're valid questions that fall in the general realm of "business intelligence".

Take a page out of that BI book and look at how it works with things like data warehousing and the core concepts of ETL: extract, transform, load.

It speaks to the difference between your data as generated and used for the purely transactional business workflows and the data as used for reporting and gathering insights.  Very often that means extracting and transforming your data (usually pre-aggregating, flattening, de-normalizing,...) so that the BI tools can use that data and not the original data for speedy and accurate reports.

My point being: be somewhat wary of writing calculations that have to run on the basic data over and over again. Build in that step that will collect and pre-aggregate the data for your reports.
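A scheduled pre-aggregation step could be sketched like this in SQL terms (plain SQLite via Python; table and field names are hypothetical, and note that FileMaker's ExecuteSQL is read-only, so the "load" step there would be a scripted import or record creation rather than the INSERT...SELECT shown):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE transactions (type TEXT, qty INTEGER, tx_date TEXT);
    CREATE TABLE monthly_report (month TEXT, type TEXT, total INTEGER);
""")
con.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("Banana", 7, "2020-04-10"), ("Apple", 10, "2020-04-05"),
     ("Apple", -3, "2020-04-09"), ("Banana", 4, "2020-04-17"),
     ("Apple", 9, "2020-04-14"), ("Banana", -4, "2020-04-20"),
     ("Apple", 16, "2020-04-30")],
)

# The scheduled "transform/load" step: pre-aggregate once per run, so
# reports read this small table instead of scanning every transaction.
con.execute("""
    INSERT INTO monthly_report
    SELECT substr(tx_date, 1, 7), type, SUM(qty)
    FROM transactions
    GROUP BY substr(tx_date, 1, 7), type
""")
report = con.execute(
    "SELECT month, type, total FROM monthly_report ORDER BY type").fetchall()
```

The report layer then queries `monthly_report` directly, which stays fast no matter how large the transactions table grows.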



I'm sure there are a good number of general courses on BI available on sites like Udemy and so on.  Beyond the core concepts, the learning curve is in getting to know the BI tool of your choosing.  Some of the decision around whether or not to use a dedicated BI tool is 'make or buy' - how much does it cost to create the functionality in FM vs. buying a tool and learning it.


Great, thanks. One final question on this. 

Is there a temporary fix I can use in the way of a calculation to resolve this particular issue? 

Totally understand the arguments on speed and efficiency, but I'm also mindful that I want this database to 'work'. I would rather have a beta with a slower calculation whilst I learn the BI and scripting than delay getting a beta version together whilst I learn something else, if that makes sense?




This calc will get you the list as you can see from the result of the data viewer.

But you'd still have to parse it out and present the user with something that looks better.  Unless you want the calc to also include the type you're after instead of the types found in the chosen time frame.

As to delivering a system 'that works' but is suboptimal: more often than not I see systems where that choice is made but never rectified afterwards. The term for it is 'technical debt'. After you deliver, it will become much harder to convince the client to let you undo what you did to replace it with something they think they already paid for. Better to do it right the first time around and use it to learn the required scripting and architecture to avoid performance issues down the road.



You have a bunch of typos in that calc, mostly missing spaces:

- after the SELECT

- before the FROM, WHERE and GROUP BY

- after the GROUP BY

Also remove the space between SUM and the opening (.


Makes sense, thank you. So with the script working, I guess I need to do a lot of learning on scripting.

So I've taken this script and added it as a Custom Function. Easy.

I've inserted it as a field. Easy. But I'm getting the result below.

Is there a straightforward way I can manipulate this to show me just the number that relates to this record only? For example, in the first record Apple is the 'Type', so I'd like Items in April to show just 16.

Also, if I change the date, I've noticed the quantities don't update. Do I need to add a Refresh command to refresh the calculation when a Date or Quantity is changed?  





[Attachment: Screen Shot 2020-05-19 at 6.56.54 PM.png]


There may be a confusion of terms because there is no script involved at this point... and to re-iterate: you should be scripting this kind of report, not using calculated fields.  Otherwise it *will* come back and bite you.

If you have never seen a penalty in action then download the SQL performance test from my old devcon session: https://www.soliantconsulting.com/blog/executesql-named-buckets/

It shows how using executeSQL with an open record in the target table can lead to serious performance issues.  And with calculated fields that need to be unstored to work: you have no control over when the calculation fires so you may find yourself in exactly this situation.  And this is just one example, just one that is extremely easy to demonstrate.

As to your questions:

- to make it work for just the type of the record you are on, add " AND " & _type & " = ?" to the WHERE clause and add a parameter to the executeSQL() at the end to give it the value of the type field

- to make it refresh automatically, make the calculation unstored --> and this is where the performance problems will begin, you may not feel it now because you don't have many records, but this is how it starts.
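In SQLite terms (a stand-in for FileMaker's ExecuteSQL, with hypothetical field names), the per-record variant with the extra bound parameter looks like this; in FileMaker the final "Apple" argument would instead be a reference to the type field of the current record:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE transactions (type TEXT, qty INTEGER, tx_date TEXT)")
con.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("Banana", 7, "2020-04-10"), ("Apple", 10, "2020-04-05"),
     ("Apple", -3, "2020-04-09"), ("Banana", 4, "2020-04-17"),
     ("Apple", 9, "2020-04-14"), ("Banana", -4, "2020-04-20"),
     ("Apple", 16, "2020-04-30")],
)

# The extra "AND type = ?" clause, with one more bound parameter,
# restricts the sum to a single type instead of grouping all of them.
(total,) = con.execute(
    "SELECT SUM(qty) FROM transactions "
    "WHERE tx_date >= ? AND tx_date <= ? AND type = ?",
    ("2020-04-01", "2020-04-30", "Apple"),
).fetchone()
```

For the sample data, `total` comes back as 32 for Apple in April.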


Thanks, I should make it clear that this is completely new territory for me. I really don't know anything about anything I'm writing here other than what you're telling me, so I am taking everything extremely literally while I learn this syntax.

I simply don't understand the SQL side, I'm happy to learn, but it's a slow process. 

Turning to the calculations, I've made the following changes: 


"WHERE" & _matchfield & " >= ? AND " & _matchfield & " <= ? " AND " & _type & "=?" &



ExecuteSQL( _sql ; "" ; "" ; "4/1/2020" ; "4/30/2020" ; &_type )

I'm getting Table Missing. 

Should I delete the "GROUP BY" command? 


No, don't delete the GROUP BY. The GROUP BY is responsible for summarizing the data so that each type is listed only once; it's how SQL knows how to sum the values.

Your SQL syntax looks correct, but not your ExecuteSQL().

As the last parameter, use a reference to the field itself, because you need the field's value. Using my demo file, that would make it:


ExecuteSQL( _sql ; "" ; "" ; "4/1/2020" ; "4/30/2020" ; demo::type )


Unfortunately no cigar: 




Let ( [
_type = _FIELDREF_getField ( demo::Type ; True ) ;
_qty = _FIELDREF_getField ( demo::Buy_Sell ; True ) ;
_table = _FIELDREF_getTO ( demo::Type ; True ) ;
_matchfield = _FIELDREF_getField ( demo::Date ; True ) ;

_sql =
"SELECT " & _type & ", Sum(" & _qty & ")" &
"FROM" & _table &
"WHERE" & _matchfield & " >= ? AND " & _matchfield & " <= ? " AND " & _type & "=?" &
"GROUP BY" & _type ] ;

ExecuteSQL( _sql ; "" ; "" ; "4/1/2020" ; "4/30/2020" ; demo::type )
)


This seems to be exactly as per your directions, unless I've done a typo. 😭

