stimpy13 Posted August 12, 2005

I have a fairly simple (famous last words) concept that seems to produce very slow results from FileMaker. The database consists of one main table (People) and six related tables (Transactions). Five of the related Transactions tables have "conditional" joins; for example, the join condition might be PersonID = TPersonID AND gTvalue > transaction_record_value. This lets me have a calculation that SUMs the related records meeting a range of criteria (for example, the sum of records greater than 60 or 120 days old).

This all works fine, except that searches on the sum calculation from the People table take forever, because unstored calculations can't be indexed. So I set up fields to act as storage fields and copied the data from each calculation field into a standard number field (which can be indexed). Now searches are fast.

The problem arises when I need to populate the storage fields from the calculation fields. I have a simple script that loops through each record in the People table and copies the values from the five SUM calculations into the respective storage fields. The script is just "Set Field A to the contents of Field B" five times, looping through all records. It works fine, except that it takes FOREVER (about 8 seconds per record) to set five fields in a single record. Eventually I will check status and only update the records that have changed, but I still need a version that can do the initial population.

I thought I could simply export from the SUM fields and import the results into the storage fields, but the export speed is the same (about 8 seconds per record). Since I have 28,000 People records, this will take about 62 hours, which is way too long. I tried increasing the cache size to 3x the size of the database, hoping these operations would happen in RAM, but this had no effect on speed.

This development is on a 1.6 GHz G4 with 1.5 GB of RAM. Changing the hardware (to a faster CPU/OS) might be one option, but I was hoping someone here could point out a quicker or better way; I am dubious that I could buy hardware that provides an 8x increase in speed. This solution will eventually deploy on FileMaker Server with 5-10 networked users, and I expect a performance hit there, so IMHO the speed should be very fast on my dev machine to begin with.

thanks
stimpy
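P.S. Here is roughly what the loop looks like as script steps (field names simplified; cSum60, Sum60_Stored, etc. are stand-ins for my actual fields):

    Go to Layout [ "People" ]
    Show All Records
    Go to Record/Request/Page [ First ]
    Loop
        # Each Set Field forces the unstored SUM calc (and its
        # ranged relationship) to evaluate; that seems to be
        # where the 8 seconds per record go.
        Set Field [ People::Sum60_Stored ; People::cSum60 ]
        Set Field [ People::Sum120_Stored ; People::cSum120 ]
        # ...three more Set Field steps for the other buckets...
        Go to Record/Request/Page [ Next ; Exit after last ]
    End Loop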
SlimJim Posted August 12, 2005

It would probably be faster to do a Replace Field Contents by calculation than looping through all the records. The only problem is that you don't really want anybody accessing the file while you are doing it. But then, I guess that is also true of the looping method.
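Something like this, one step per storage field (using the placeholder names from the loop sketch above):

    Show All Records
    Replace Field Contents [ No dialog ; People::Sum60_Stored ; People::cSum60 ]
    Replace Field Contents [ No dialog ; People::Sum120_Stored ; People::cSum120 ]
    # ...and the same for the other three storage fields.

It skips the per-record navigation of the loop, though each record still has to evaluate the unstored calculation once, and any record locked by another user will be silently skipped, which is another reason to keep everyone out while it runs.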
stimpy13 (Author) Posted August 12, 2005

The current method uses Set Field with a calculation. I will try the Replace method to see if that saves time. As for the users, I don't mind locking everybody out and "locking" all calculations if it cuts the time down. In practice this script might run once a week, overnight on a Saturday, to catch any data issues; the initial population script perhaps once every six months, if that.

thanks
stimpy
Reed Posted August 12, 2005

I still think this is going to be *really* slow with large record sets. If you have ~28,000 people (and presumably multiple transactions per person), your ranged relationships have to scan many records on the fly to calculate each aggregate. Even moving from one record to the next with one of these calculated aggregate fields on the layout really bogs down when there are a lot of related records.

You might be better off running a plain old subsummary report from the Transactions table, where you can take advantage of stored calculations that assign each record to its aging bucket (>60 days, >120 days, etc.). If you want to push those summaries into stored fields in the People table, you could use a GetSummary calculation to populate the appropriate fields.
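For example, something along these lines (a sketch; Amount and DaysOverdue are assumed field names, not taken from your file):

    # Stored, indexable calculation in Transactions; it references
    # only local fields, so FileMaker can store and index it:
    AgeBucket (text) = Case ( DaysOverdue > 120 ; "120+" ;
                              DaysOverdue > 60 ; "60+" ;
                              "current" )

    # Summary field in Transactions:
    sAmountTotal = Total of Amount

    # With the found set sorted by TPersonID, this returns the
    # subtotal for the current person's group, ready to push
    # into a stored People field:
    GetSummary ( sAmountTotal ; TPersonID )

Since AgeBucket is stored, finds and sorts on it are fast, and the expensive ranged relationships never have to evaluate at all.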
Reed Posted August 12, 2005

Here's a file I threw together for testing purposes (for what it's worth). There is a script which will create a whole bunch of records, each with between 1 and 10 related records having random numerical values from 1 to 100 or so. The subsummary reports take a minute or so with 120,000 records, but that's much faster than accessing the values through the ranged relationships. large_test.fp7.zip
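In outline, a generator script for that kind of test data might look like this (a sketch; the field names are illustrative and may not match the attached file; gPersonID and gCount are assumed global fields used as scratch variables):

    Go to Layout [ "People" ]
    Loop
        New Record/Request
        Set Field [ People::gPersonID ; People::PersonID ]
        # Random returns 0-1, so this yields 1-10 child records:
        Set Field [ People::gCount ; Int ( Random * 10 ) + 1 ]
        Go to Layout [ "Transactions" ]
        Loop
            Exit Loop If [ People::gCount = 0 ]
            New Record/Request
            Set Field [ Transactions::TPersonID ; People::gPersonID ]
            # Random value from 1 to 100:
            Set Field [ Transactions::Value ; Int ( Random * 100 ) + 1 ]
            Set Field [ People::gCount ; People::gCount - 1 ]
        End Loop
        Go to Layout [ "People" ]
        Exit Loop If [ Get ( FoundCount ) >= 20000 ]
    End Loop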
stimpy13 (Author) Posted August 13, 2005

Thanks for your response... I will take a look at the file. I think the problem with the solution you suggest may be that I need fields which update dynamically to show the totals, reflecting recent changes. I suppose I could have a "post time" script which updates all these records. The general problem FileMaker seems to have is that there is no way to tell it to "stop calculating and store the results", even just temporarily.

stimpy