
Recommended Posts

Posted

Hi Guys, 

So I am CLOSE to my near-final question, until it comes to reports... yay!

This is possibly opening a can of worms, but I want to create a Date Calculation. 

The table will look something as follows: 

Type      Buy/Sell   Date
Banana       7       10 April 2020
Apple       10        5 April 2020
Apple       -3        9 April 2020
Banana       4       17 April 2020
Apple        9       14 April 2020
Banana      -4       20 April 2020
Apple       16       30 April 2020

I want to run a calculation to say "How many Apples did I have on Date X?". The date will come from a timestamp on another table.

Taking Date X as 30 April, the correct answer is of course 32 (10 - 3 + 9 + 16).

The same formula for bananas would give 7 (7 + 4 - 4).

So really my question is: what would I need to do to build a calculation like this?

Once this is mastered, I think/hope to have no more questions other than on Reporting!

Thanks!

Neil

Posted
4 minutes ago, Neil Scrivener said:

This is possibly opening a can of worms

Possibly.

It would be easy to define a relationship from a table of Types to an occurrence of the table shown (let's call it Transactions) as:

Types::Type = Transactions 2::Type
AND
Types::gCutOffDate ≥ Transactions 2::Date

and then just sum the related transaction quantities.
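With that relationship in place, the quantity on hand would be a simple aggregate over the related set, for example (a sketch only; it assumes the Buy/Sell quantity field is named Transactions 2::Quantity):

// quantity of this Type on hand as of Types::gCutOffDate (field name assumed)
Sum ( Transactions 2::Quantity )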

It would be even easier to find the transactions that occurred before the cut-off date, sort them by Type, and show them in a summary report.

However, as time goes by, the number of transactions will increase and these methods may become too slow. And that's a can of worms I don't feel like opening right now.


 

Posted

In a script you can do an executeSQL() to find all the records where type = banana, using the date in the WHERE clause and SELECTing only the buy/sell field. That will give you a list of values that you can loop through to get the final number.

If you want that for all products then it's a loop within a loop by first asking for all DISTINCT types in your time frame and then asking for their values in turn.

You can display the results in a virtual list table or a JavaScript DataTable.

If you want to have this report available quickly, I would suggest you run these kinds of aggregations on a schedule and store the results in their own reporting tables. This is typically not the kind of thing you want to collect at the moment you run the report.
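A minimal sketch of that first query, assuming a Transactions table with Type, Buy_Sell and Date fields (Date has to be quoted because it is a reserved word in SQL):

// returns one buy/sell quantity per line for bananas up to the cut-off date
ExecuteSQL (
   "SELECT \"Buy_Sell\" FROM \"Transactions\" WHERE \"Type\" = ? AND \"Date\" <= ?" ;
   "" ; "" ;
   "Banana" ; Date ( 4 ; 30 ; 2020 )
)

The script would then loop over the returned list (or over each DISTINCT type in turn) and add the values up.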

Posted
1 hour ago, Wim Decorte said:

That will give you a list of values that you can loop through to get the final number.

If you're going to use SQL (which I would be reluctant to do), why wouldn't you use it to also aggregate the values??

 

Posted

It's one of the things to test. eSQL() is blazingly fast when used under the right circumstances. Using complex joins or SQL functions is usually where it starts to slow down, so I tend to prefer to do the basic queries with it and then post-process the data in FM. But not always; sometimes using SQL functions inside the query is just fine. Much depends on the amount of data and the underlying FM schema.

 

Posted

Thanks for the info guys. 

If I was comparing for the month of April, would I not need to have a 'start' date too?

 

I've put together a quick demo file of what I'm looking to achieve.

So if you look on the Companies Layout, I'm looking for the Owned field in the Valuation Portal to be a calculation at that snapshot in time.

So where a Valuation is on 19 May 2020, I want it to tell me how many items I 'own' at that point, based on the records under the 'Previous Data of Transactions' field, if that makes sense?

Demo File+.fmp12

Posted

What I'm in essence trying to do is work out a snapshot of how much the items were worth per month, which will tell me whether they have lost or made money.

So the eventual calculation will tell me: 

- How many units did I own in that month?

- What was the total cost of those units in that particular month?

- What is the value of those units in that month, according to the corresponding month in the Valuation portal?

- What is the outcome for that month, both in £ and in % terms? (Roughly sketched below.)
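A rough sketch of that monthly maths, purely as a placeholder (UnitsOwned, TotalCost and ValuationPrice are made-up names, not fields from the demo file):

// monthly snapshot sketch - all names are placeholders
Let ( [
   _value = UnitsOwned * ValuationPrice ;   // value of the units at that month's valuation
   _gain  = _value - TotalCost              // outcome in £
] ;
   List ( "Gain £: " & _gain ; "Gain %: " & Round ( _gain / TotalCost * 100 ; 2 ) )
)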

Hopefully that explains better. 

 

Posted

We get the business use case of the questions; they're valid questions that fall in the general realm of "business intelligence".

Take a page out of the BI book and look at how it works with things like data warehousing and the core concepts of ETL: extract, transform, load.

It speaks to the difference between your data as generated and used for the purely transactional business workflows and the data as used for reporting and gathering insights.  Very often that means extracting and transforming your data (usually pre-aggregating, flattening, de-normalizing,...) so that the BI tools can use that data and not the original data for speedy and accurate reports.

My point being: be somewhat wary of writing calculations that have to run over the raw data over and over again. Build in a step that collects and pre-aggregates the data for your reports.
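That pre-aggregation step could be as simple as a script run on a server schedule that writes the totals into a reporting table, along these lines (a sketch only; the MonthlyTotals table, its layout and its fields are placeholders, not part of the demo file):

# nightly pre-aggregation sketch: one row per type, quantity on hand as of today
Set Variable [ $rows ; Value: ExecuteSQL ( "SELECT \"Type\", SUM(\"Buy_Sell\") FROM \"Transactions\" WHERE \"Date\" <= ? GROUP BY \"Type\"" ; "|" ; ¶ ; Get ( CurrentDate ) ) ]
Go to Layout [ "MonthlyTotals" ]
Loop
   Exit Loop If [ ValueCount ( $rows ) = 0 ]
   Set Variable [ $row ; Value: Substitute ( GetValue ( $rows ; 1 ) ; "|" ; ¶ ) ]
   New Record/Request
   Set Field [ MonthlyTotals::Type ; GetValue ( $row ; 1 ) ]
   Set Field [ MonthlyTotals::Quantity ; GetAsNumber ( GetValue ( $row ; 2 ) ) ]
   Set Field [ MonthlyTotals::AsOfDate ; Get ( CurrentDate ) ]
   Set Variable [ $rows ; Value: RightValues ( $rows ; ValueCount ( $rows ) - 1 ) ]
End Loop
Commit Records/Requests [ With dialog: Off ]

The report layout then reads from MonthlyTotals instead of hitting the raw Transactions data every time.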

 

Posted

I'm sure there are a good number of general courses on BI available on sites like Udemy and so on.  Beyond the core concepts, the learning curve is in getting to know the BI tool of your choosing.  Some of the decision around whether or not to use a dedicated BI tool is 'make or buy' - how much does it cost to create the functionality in FM vs. buying a tool and learning it.

Posted

Great, thanks. One final question on this. 

Is there a temporary fix I can use in the way of a calculation to resolve this particular issue? 

Totally understand the arguments about speed and efficiency - but I'm also mindful that I want this database to 'work', and I would rather have a beta with a slower calculation while I learn the BI and scripting than delay getting a beta version together while I learn something else, if that makes sense?

Thanks!

N

Posted

This calc will get you the list, as you can see from the result in the data viewer.

But you'd still have to parse it out and present the user with something that looks better, unless you want the calc to also filter on the type you're after instead of returning all the types found in the chosen time frame.

As to delivering a system 'that works' but is suboptimal: more often than not I see systems where that choice is made but never rectified afterwards. The term for it is 'technical debt'. After you deliver, it becomes much harder to convince the client to let you undo what you did and replace it with something they think they already paid for. Better to do it right the first time around and use it to learn the required scripting and architecture, to avoid performance issues down the road.

2020-05-18_14-14-17.png

Posted

You have a bunch of typos in that calc, mostly missing spaces:

- after the SELECT

- before the FROM, WHERE and GROUP BY

- after the GROUP BY

Also remove the space after SUM and before the (.
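With those spacing fixes applied, the SQL-building part of the calculation would presumably read (a sketch; only the spacing changes):

_sql =
   "SELECT " & _type & ", SUM(" & _qty & ")" &
   " FROM " & _table &
   " WHERE " & _matchfield & " >= ? AND " & _matchfield & " <= ?" &
   " GROUP BY " & _type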

Posted

Makes sense, thank you. So with the script working, I guess I need to do a lot of learning on Scripting.

So I've taken this Script and added it as a Custom Function. Easy.

I've inserted it as a Field. Easy. But I'm getting the result below. 

Is there a straightforward way I can manipulate this to show just the number that relates to that record only? For example, in the first record Apple is the 'Type', so I'd like Items in April to show just 16.

Also, if I change the date, I've noticed the quantities don't update. Do I need to add a Refresh command to refresh the calculation when a Date or Quantity is changed?  


Screen Shot 2020-05-19 at 6.56.54 PM.png

Posted

There may be some confusion of terms here, because there is no script involved at this point... and to re-iterate: you should be scripting this kind of report, not using calculated fields. Otherwise it *will* come back and bite you.

If you have never seen a penalty in action then download the SQL performance test from my old devcon session: https://www.soliantconsulting.com/blog/executesql-named-buckets/

It shows how using executeSQL with an open record in the target table can lead to serious performance issues. And with calculated fields that need to be unstored to work, you have no control over when the calculation fires, so you may find yourself in exactly that situation. And this is just one example, just one that is extremely easy to demonstrate.

As to your questions:

- to make it work for just the type of the record you are on, add " AND " & _type & " = ?" to the WHERE clause, and add a parameter at the end of the executeSQL() to give it the value of the type field (see the sketch below)

- to make it refresh automatically, make the calculation unstored --> and this is where the performance problems will begin; you may not feel it now because you don't have many records, but this is how it starts
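The WHERE line would then presumably read (a sketch; the rest of the calculation stays as it was):

" WHERE " & _matchfield & " >= ? AND " & _matchfield & " <= ? AND " & _type & " = ?" &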

Posted

Thanks. I should make it clear that this is completely new territory for me. I really don't know anything about anything I'm writing here other than what you're telling me, so I am taking everything extremely literally while I learn this syntax.

I simply don't understand the SQL side. I'm happy to learn, but it's a slow process.

Turning to the calculations, I've made the following changes: 

Quote

"WHERE" & _matchfield & " >= ? AND " & _matchfield & " <= ? " AND " & _type & "=?" &

and

Quote

ExecuteSQL( _sql ; "" ; "" ; "4/1/2020" ; "4/30/2020" ; &_type )

I'm getting Table Missing. 

Should I delete the "GROUP BY" command? 

Posted

No, don't delete the GROUP BY. The GROUP BY is responsible for summarizing the data so that each type is listed only once; it's how SQL knows how to sum the values.

Your SQL syntax looks correct, but your executeSQL() does not.

As the last parameter, use a reference to the field itself, because you need the field's value. Using my demo file, that would make it:

 

ExecuteSQL( _sql ; "" ; "" ; "4/1/2020" ; "4/30/2020" ; demo::type )

Posted

Unfortunately no cigar: 

Quote

Let(
[
_type = _FIELDREF_getField ( demo::Type ; True ) ;
_qty = _FIELDREF_getField ( demo::Buy_Sell ; True ) ;
_table = _FIELDREF_getTO ( demo::Type ; True ) ;
_matchfield = _FIELDREF_getField ( demo::Date ; True ) ;

_sql =
"SELECT " & _type & ", Sum(" & _qty & ")" &
"FROM" & _table &
"WHERE" & _matchfield & " >= ? AND " & _matchfield & " <= ? " AND " & _type & "=?" &
"GROUP BY" & _type ];

executeSQL( _sql ; "" ; "" ; "4/1/2020" ; "4/30/2020" ; demo::type )

)

This seems to be exactly as per your directions, unless I've done a typo. 😭
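For what it's worth, the likely culprits are the stray quotation mark in the WHERE line (" <= ? " AND " where it should be " <= ? AND ") and the still-missing spaces around FROM, WHERE and GROUP BY. With those corrected, the whole thing would presumably look like this (a sketch; it assumes the _FIELDREF_ custom functions in the demo file return usable field and table-occurrence names):

// sketch - corrected spacing and type filter, per the earlier posts
Let(
[
_type = _FIELDREF_getField ( demo::Type ; True ) ;
_qty = _FIELDREF_getField ( demo::Buy_Sell ; True ) ;
_table = _FIELDREF_getTO ( demo::Type ; True ) ;
_matchfield = _FIELDREF_getField ( demo::Date ; True ) ;

_sql =
"SELECT " & _type & ", SUM(" & _qty & ")" &
" FROM " & _table &
" WHERE " & _matchfield & " >= ? AND " & _matchfield & " <= ? AND " & _type & " = ?" &
" GROUP BY " & _type
] ;

ExecuteSQL ( _sql ; "" ; "" ; "4/1/2020" ; "4/30/2020" ; demo::Type )
)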
