Child Records created per day - How many is too many?


James Gill

This topic is 4363 days old. Please don't post here. Open a new topic instead.

Recommended Posts

I'm working on a solution right now that draws some pretty complex relationships between TOs. It's a little hard to explain in a short post, but put simply, a parent record may have as many as 10-20 child records that creates a validation. What I've discovered is that in the course of a work week there may be as many as 100-150 parent records created. Of course, this means that every week there will be 1,000-3,000 of these validations created as part of the parent records. My question is, how much is too much? After one year you're looking at between 52,000-156,000 records. Plus, that's only if volume remains static (which it's not projected to).
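The yearly projection above follows directly from the weekly figures. A quick sketch of the arithmetic (the per-parent and per-week ranges are the ones stated above; everything else is just multiplication):

```python
# Projection of yearly child-record volume from the figures in the post.
CHILDREN_PER_PARENT = (10, 20)   # child "validation" records per parent
PARENTS_PER_WEEK = (100, 150)    # new parent records per work week
WEEKS_PER_YEAR = 52

low = CHILDREN_PER_PARENT[0] * PARENTS_PER_WEEK[0] * WEEKS_PER_YEAR
high = CHILDREN_PER_PARENT[1] * PARENTS_PER_WEEK[1] * WEEKS_PER_YEAR
print(f"Yearly child records: {low:,} to {high:,}")  # 52,000 to 156,000
```

Note this assumes static volume; any growth rate compounds on top of these numbers.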

On the surface 156k records a year doesn't seem too bad, but what can FileMaker handle on a larger scale? I know I've seen some articles out there on how FMP handles 1,000,000+ records, but does anyone have real-world experience with huge datasets?


I have one solution where attendance records are used to create billing amounts in another table. The billing amount entered is the sum of the child's attendance records over a one-month period. The attendance records are fairly complex. We are generating approximately 12,000 attendance records per month (one record per attendance event per child). There are approximately 160,000 attendance records.

1. The program is accessed over a WAN

2. The program works fine day to day.

3. There is an automated billing routine that gathers these records and does the billing. This is bogging down with the number of records we have in attendance, where the billing is now taking 4+ hours to run. The increase in attendance records is slowing the generation of summed events.

I have had to put an attendance purge routine into the program (attendance records are not required after the billing is done) to bring this processing time back to normal.
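To illustrate why the monthly sum gets expensive as the attendance table grows, here is a hypothetical stand-in (plain Python, not FileMaker script syntax; the field names are invented): the billing pass is essentially one grouped sum per child per month, and the cost scales with the total number of attendance records it must touch.

```python
from collections import defaultdict

# Hypothetical attendance records (one per attendance event per child).
attendance = [
    {"child_id": "C1", "month": "2013-01", "amount": 25.0},
    {"child_id": "C1", "month": "2013-01", "amount": 25.0},
    {"child_id": "C2", "month": "2013-01", "amount": 30.0},
]

# One pass over all records, summing per (child, month) for billing.
totals = defaultdict(float)
for rec in attendance:
    totals[(rec["child_id"], rec["month"])] += rec["amount"]

print(totals[("C1", "2013-01")])  # 50.0
```

Purging billed records, as described above, shrinks the set this pass has to scan, which is why it brings the processing time back down.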

So in answer to your question:

1. Yes, FileMaker can handle a lot of data. Nothing is breaking.

2. There are weak spots.

3. Keep the structure simple, in that sums, calculated fields, etc. all put a load on the system. Our weakness only showed up during this particular report generation. It is one that was anticipated, but we had hoped not to have to deal with it.
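One general way to take load off on-demand sums (a common pattern, not something described in this thread) is to maintain a stored running total at write time, so reports read one field instead of re-summing thousands of related records. A minimal sketch, with hypothetical names:

```python
# Hypothetical write-time aggregation: each new child record updates a
# stored total on its parent, so reporting never rescans the children.
parent_totals = {}

def add_child_record(parent_id, amount, totals):
    """Record a child amount and keep the parent's stored total current."""
    totals[parent_id] = totals.get(parent_id, 0.0) + amount

add_child_record("P1", 25.0, parent_totals)
add_child_record("P1", 30.0, parent_totals)
print(parent_totals["P1"])  # 55.0
```

The trade-off is denormalization: the stored total must be kept in sync with every create, edit, and delete of a child record.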

HTH

Dave


"10-20 child records that creates a validation."

I'm not sure what you mean by this phrase. As well, it seems to me, at first glance, that you're creating more records than is necessary . . . perhaps a design problem. Not that FMP can't handle millions of records . . . it can.

Link to comment
Share on other sites

