
Aggregate Confessions




Waiting for layouts to calculate before displaying results, the dread of switching to sub-summary reports, and slow layout switches leave me cold. Here is my sin ...

I sometimes write totals to records during the night so staff can have their instant displays.

You see, our company wants it all ... every summary by Contact by Product by Sales Rep in multi-column portals by day, week, month, year ... on the fly - any time they wish, filtered as they wish. And they want these summaries instantly on form layouts. And I want to give it to them. I don't mind Aggregate functions on smaller datasets but my 400,000 line items are a bugger.

Am I the only one who does this? I dislike the thought of unnecessary data but I very much hate Users having to wait even a nanosecond. I'm confessing this because, well ... I want to do more of it.

And I want to create those records by setting the related fields (with Allow Creation of Related Records on). Currently, I'm exporting report summaries and then importing back into the summary table, because of FM's restriction on importing from one table to another. I want to script a process to write these summaries to multilines (which I can then parse into records) or to set summary fields directly in another table (creating the new records), eliminating the export/import step entirely. I know how to take piped multiline arrays and split them into records/fields but I would prefer to set related fields directly.
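For instance, I picture something along these lines (assuming a relationship from a global gSummaryKey field in the reporting table to a SummaryKey match field in the summary table, with Allow Creation of Related Records enabled on the summary side; all the names here are just placeholders, and Total Sales stands in for a summary field with the found set sorted by SalesRep):

Set Field [ gSummaryKey ; SalesRep & "|" & Get(CurrentDate) ]

Set Field [ Summary by Key::SalesRep ; SalesRep ]

Set Field [ Summary by Key::Total ; GetSummary ( Total Sales ; SalesRep ) ]

Since the first Set Field points gSummaryKey at a value no summary record has yet, the next Set Field through the relationship spawns a fresh summary record with the key auto-filled, and the remaining Set Fields land in that same record.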

So, two real questions: 1) How best can I handle this process, ie, grab summaries and create related records? 2) Should I do it? Or am I nuts for even considering it?

Ideas would be most welcome. Oh, I feel so much better coming clean. wink.gif

LaRetta


I put Aggregate fields directly on the regular layouts; no need to switch to a sub-summary report layout. But with 400,000 items ... maybe you just need faster computers ... or maybe you need to archive some of those 400,000 items.


Hi LaRetta,

I think your scheme to store your aggregate results is a good idea, just so long as your users know what time the data was compiled.

I have a couple of thoughts on how to do this. First, I like the idea of adding a record in an Aggregate table to represent the aggregate value of a set of data, grouped in a particular way. This Aggregate table could be used to hold a variety of Totals, Averages, and Counts, grouped in a variety of ways. I think this could be done by keeping track of each of those options in fields in this Aggregate table. You could then interface with this table through filtered portals to show results for a chosen Table, Summary, and Group option.
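For instance, the Aggregate table might hold fields along these lines (the names are just placeholders, not anything final):

AggregateID (number, auto-enter serial)

SourceTable (text) - e.g. "LineItems"

SummaryType (text) - Total, Average, or Count

GroupField (text) - e.g. "SalesRep"

GroupValue (text) - e.g. a particular rep's name

PeriodStart (date) and PeriodEnd (date)

Result (number)

A filtered portal would then match on SummaryType, GroupField, and the period via globals to show just the slice the user asked for.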

One trick that may be useful for you here, now that you're moving away from the summary export, is quickly looping through the found set of data to get the counts and groups. I use this method for gathering data for my charting module. First, let's define a couple of fields:

gCurrentRec (number, global)

Count of Records (summary) = Count of RecordID

SalesRep (text)

Count by SalesRep (calculation, number result) = GetSummary(Count of Records; SalesRep)

<plus whatever other summary fields you need as aggregate functions>

Now a script to skip through the found set:


Sort [ by SalesRep ]

Set Field [ gCurrentRec ; 1 ]

Go to Record/Request/Page [ First ]

Loop

   <Copy aggregate value and SalesRep to another table here>

   Set Field [ gCurrentRec ; gCurrentRec + Count by SalesRep ]

   Exit Loop If [ gCurrentRec > Get(FoundCount) ]

   Go to Record/Request/Page [ By Number ; gCurrentRec ]

End Loop

This just looks at one record of each group to get a SalesRep to go along with the aggregate.
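For the copy step itself, one rough option is to pile the results into a global text field and parse it afterward (gResults here is a placeholder global text field, cleared before the loop):

Set Field [ gResults ; gResults & SalesRep & "|" & Count by SalesRep & "¶" ]

Or, if you go the allow-creation route LaRetta described, that same spot could set the related summary fields directly instead.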

Anyway, that's how I'd go about this.


I do not believe that re-calculating aggregate totals over large volumes of static data on a daily basis is a good use of resources, transpower - regardless of computer speed. Placing an aggregate field on a form layout that is constantly accessed isn't very pleasant for staff. As I read recently here on the forums and liked very much (from Jerry, I believe) ... just because FileMaker CAN do something, that doesn't mean it SHOULD do it. Constantly recalculating the same summaries over and over is (in my mind) one of those examples.

And any pause, even if only for 4 seconds, during which an employee cannot work while waiting for redraws and recalculations will (over time) add up to substantial dollars lost for our company. I will work to shave off every second every chance I get. Archiving data from only a few years ago is not acceptable; we use this data to model and project customer sales and product buying patterns.

Thank you so much for responding, Mike.

"just so long as your users know what time the data was compiled. "

What I usually do is display the Aggregate Summary record (which also displays the update day/time above it). And, next to that, that day's running total displays as unstored. In this way, they have up-to-date correct totals but processor stress is kept to a minimum.
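For instance, the displayed figure might be a calculation along these lines (assuming a stored Total on the Aggregate record and a TodayLineItems relationship filtered to the current date; the names are just placeholders):

Display Total (calculation, unstored, number result) = Aggregate::Total + Sum ( TodayLineItems::Amount )

The Sum( ) is unstored, but it only ever spans a single day's line items, which is why it still comes back instantly.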

"This Aggregate table could be used to hold a variety of Totals, Averages, and Counts, grouped in a variety of ways"

Whew! I felt nuts for considering this but you are picking up my thinking EXACTLY, Mike!! By using fields in the Aggregate table, I can specify all the sorting/filtering criteria necessary to produce on-the-fly portal displays and totals, based upon various options. With proper consideration, this would allow multiple filters and would produce instant displays - no hour-glass pauses while the system calculates; no portal re-draw slowdowns. A loop ... beautiful! I'll let you know if I get stuck but I think I'm ready to rock 'n' roll on it!

LaRetta


Another thought on this same train (maybe you were heading this direction)? If you calculate Aggregates by month or quarter, then just keep the past totals in the Aggregates table and re-summarize only the current ones, this would also reduce the amount of recalculating. You could build in a mechanism to flag a past Aggregate set if it should be recalculated in the next run (or allow a set to be recalculated on demand). The filtered portal could show the records for each month or year, and total the results in the parent table (or interface table). This type of calculation shouldn't be too taxing (though Average could be tricky - you'd want to store the underlying Sum and Count so the combined Average can be re-derived, rather than averaging the averages).
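The on-demand pass might look something like this (assuming a RecalcFlag number field in the Aggregates table; names are placeholders):

Enter Find Mode [ ]

Set Field [ Aggregates::RecalcFlag ; 1 ]

Perform Find [ ]

Loop

   <re-run the aggregation for this record's group and period>

   Set Field [ Aggregates::RecalcFlag ; 0 ]

   Go to Record/Request/Page [ Next ; Exit after last ]

End Loop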


LaRetta, the layouts that the data entry staff see don't have to have the summary fields, just the layouts for the executives. Today "dashboard programming" is quite common--put real-time accounting data right there for the CEO to see at any instant.


"Today "dashboard programming" is quite common--put real-time accounting data right there for the CEO to see at any instant. "

I understand that, transpower, but our SALES STAFF needs this data at their fingertips 100 times throughout the day. They do not data-enter ... they sell. This is not an option for us.


Hmm...Reminds me of something Ugo described a while back.

Instead of calculating the aggregate values with Sum and Count at the time they're needed, use normal number fields for Sum and Count in an intermediate table, and update the numbers every time a transaction is entered or deleted (via scripts). This puts a little more work on the process scripts, but the summaries are then available with little calculation work needed.
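For example, the entry script might finish with a couple of steps like these (assuming a Stats record related to the transaction's group; the names are placeholders), with the delete script subtracting instead:

Set Field [ Stats::Sum ; Stats::Sum + Transactions::Amount ]

Set Field [ Stats::Count ; Stats::Count + 1 ]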


Hi Mike, Hi LaRetta,

That's actually how all my stats are stored, in repeating fields, and with v6 they were for sure quicker than any relationship-based calcs.

What I mainly liked about these repeating fields (still with v6) is that you could split them into separate records at will, into other files, on the fly.

Also, these stats remain accessible whatever changes are made to the database structure, and this is most appreciated now with this v7 update I'm doing, where I can just start the new database from point zero with all my previous stats stored.

What you're describing, Mike, is the starting process. It's a "Transactional" route where you substitute a plain indexed number for an unstored calc. Then what you need is a way to move this number "up" into the repeating field blocks, say DailyStatRep, WeeklyStatRep, MonthlyStatRep, QuarterStatRep and YearlyStatRep repeats.

Hmm... now I'm not sure whether I'd use a separate Stats table or my good old repeating fields. I haven't gotten that far in the migration yet, but my guess is that I'd keep repeating fields for a while, just because FM7 is so new that there's a chance my newly created application structure will change.

...And...

with FM7's new repeating features, you could just use your label tip, LaRetta, to get the correct rep in place through a repeating calculation.

HTH


Hey Ugo! I admit I'm torn on whether to use repeating fields here or not. Just because this is summary data, it is data nonetheless; and my mind flashes on a report sub-summary of the summary data, and we know how repeating fields are when reporting. But my apprehension may mostly stem from my inexperience with repeating fields, unlike your background with them.

I have so many filtered relationships and globals in place now that using records (instead of reps) just seems more consistent (using the same globals attached to the same value lists), without needing to script splitting the reps into records when needed, etc. I will consider this carefully, however. Thank you for responding.

The ability to point and drill down to the data will also be important, and GTRRs based upon the same consistent logic (globals, value lists and date ranges) just seem more cohesive. I envision pointing to a summary portal-row columnar record (in which the criteria are also stored in a field), and those criteria will establish the exact filtered relationship necessary to display the drill-down data in the detail portal.

"... use normal number fields for Sum and Count in an intermediate table, and update the numbers every time a transaction is entered or deleted (via scripts.)"

Yes. I have SecureFM and I'm seriously considering this for when our invoicing piece is attached (I'm still importing from accounting). And you know how imported data doesn't respond well to event triggers. I will rely heavily upon event triggers to update inventory counts instantly. I dragged my feet on invoicing until SecureFM for 7 was released; and now our sales-analysis needs are pushing this piece further back still. And I believe that using records instead of repeating fields will blend and transition well when I implement the event triggers, which are based upon auto-enter calcs on fields. Again, I admit my bias against repeating fields for data.

Mike, combining the past aggregate records with the unstored daily current entries is the way I'm testing it now, and even though it's unstored, limiting the relationship to that day plus the aggregate field still responds instantly in my test tables. I've felt like a traitor to relational theory, but constantly counting and summing posted line-item data over and over just seemed wrong to me. I really appreciate your suggestions; I feel I'm ready to move forward with these decisions; and I feel much better about my Aggregate sins.

LaRetta


Hi LaRetta,

Getting back to the first lines of your post, as I was somewhat [OT] with my first attempt and answer to Mike.

You see, our company wants it all ... every summary by Contact by Product by Sales Rep in multi-column portals by day, week, month, year ... on the fly - any time they wish, filtered as they wish. And they want these summaries instantly on form layouts. And I want to give it to them. I don't mind Aggregate functions on smaller datasets but my 400,000 line items are a bugger.

IMO, whatever the found-set size viewed from these portals, whether they are actually filtered or not, the solution will remain solid and speedy unless:

- You're using unstored Aggregate calculations on these portal records.

- The fields displayed in the portals are "aliases" using GetField( ).

- The sorting key(s) are unstored.

You may need at least one of the last two "tools" to cover your needs for such user-friendly choices. But definitely scratch any unstored Aggregate calculation from the map.

Note that your filter keys may be kept in either repeating fields or separate records; this wouldn't make a big difference in terms of speed. I'd go for separate records too, mainly because of the customization needs.

This is in fact very close to a technique designed for Finds that I used with v5, where each criterion was stored as a unique record in a FindCriterias.fp5 file, related through a self-join to a RequestID, and to a FindID in Finds.fp5.

Each Find was named and accessible from anywhere in the solution through a portal, each line triggering a quick execution of the find.

While building any custom find, users could decide:

- whether the result should be stored or updated daily

- whether the find should be saved as one of their private Top 10 finds

- how the result would be displayed and sorted

In your case, using relationships rather than Find scripts, you could store the "SetUp" with these basic fields:

- SetUpID

- SetUpName

- Filtered Key (txt)

- ResultType (txt) : Sum, Count,...

- ResultNum (num)

ResultNum would be dragged onto your other tables' form layouts as a related field, and updated by script whenever its value doesn't equal the current relational result (test it through a sub-script, using a relationship from the SetUp table to the LineItems).
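That sub-script test could be as simple as this (assuming a SetUpLineItems relationship built on the Filtered Key, and an Amount field in LineItems; the names are placeholders):

If [ SetUps::ResultNum ≠ Sum ( SetUpLineItems::Amount ) ]

   Set Field [ SetUps::ResultNum ; Sum ( SetUpLineItems::Amount ) ]

End If

The Sum( ) only evaluates during the scripted check, so the layout itself is just displaying a plain number field.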

Therefore ResultNum, though refreshed on the fly, doesn't slow things down, as it isn't an unstored calculation anymore. An even better approach, which I'm fond of, is the new Lookup( ) function, if you need these temporary values to be somewhat more record-static.

You could classify the available SetUps, or restrict their access to certain users, etc., and of course users could build a new SetUp or pick one from the available list through a portal in any new window, which would in fact trigger the filtering process.

_______

As for the repeating-field logic I described earlier, it was better adapted to "static" data flows, compared with the "instant custom" filters you're looking at.

Though, to come back to the topic, and after a day of reflection, I will definitely keep the repeating-field logic even with v7.

HTH

