aholtzapfel Posted August 23, 2007 I have a table with 1,230,000 records in it (averaging 8-10 thousand new records a day). It's a fairly simple table, but it tracks every phone call from our marketing department (date/time of call, result, brief note, ...). It works fine, but as files get this large it becomes difficult to do anything with them except run them on the server. (I've also got a National Do Not Call table with 3.5 million records.) Does anyone have experience dealing with tables this large? Are there any "good" ways of handling them? Should I be thinking about getting a SQL server to handle these tables? Can I just "juice up" a client with a lot of RAM for when I need to work on them? My other option is to archive old records, but I would prefer not to, so I can still report call volumes year to year. (That, and the company is growing: in the first year of this system they made about 88,000 calls (1 record each), and this year they have already made 720,000.)
Vaughan Posted August 24, 2007 "... it begins to get difficult to do anything with them except run them on the server." How is the file being shared with the users: on a file server or hosted with FM Server?
aholtzapfel (Author) Posted August 24, 2007 I'm hosting the files on FileMaker Server 8. It runs on its own server (Windows 2003) with 1 GB of RAM. Everything works fine on the server, and if I take the files off the server and copy them to a client (not hosted), they open and run. The problem I have is doing anything else. (I've gotten a version 9 client and have had to save a couple of my files as compressed copies in order to open copies of them; thankfully the call log file I mentioned did not seem to have any corruption.) I did try the "Save a compressed copy as..." command on my call log file, but it locked up my system (of course, this was NOT the only copy of the file). That raised the question in my head: if anything were to happen to these files, what options do I have to stabilize them and get them back up? I always have good backups, but with as many as 12,000 records being created a day, I'd hate to lose even an hour. Approximately 30 users NEED the info in this table for almost everything they do, and I try to always have a plan in place so that even in a worst-case scenario I can have them back up and running within 30 minutes. However, as it sits now, if anything were to happen to this file, the only good option I have is to restore a backup. I have serious doubts (due to the size of this table) that I could even recover the data from it if it ever became damaged. Any suggestions?
Fenton Posted August 24, 2007 You can still count things after archiving. Create a Qty field that auto-enters 1. Then, if you archive 1,000,000 records, create 1 record with the date(s) and set its Qty to 1,000,000. Total the Qty field instead of counting records. Seems kind of silly, but the speed gain would be worth it.
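To make the mechanics of that concrete, here is a minimal sketch of the idea in plain Python rather than FileMaker. The field names (date, qty) and the cutoff logic are illustrative assumptions, not the poster's actual schema: archived records are replaced by one summary record carrying their combined Qty, and reports total Qty instead of counting records.

```python
# Sketch only: each live call record auto-enters Qty = 1. Archiving a batch
# replaces it with a single summary record whose Qty holds the archived count,
# so totals computed by summing Qty are unchanged.

calls = [
    {"date": "2007-08-20", "qty": 1},
    {"date": "2007-08-21", "qty": 1},
    {"date": "2007-08-22", "qty": 1},
]

def archive(records, cutoff):
    """Move records dated before `cutoff` to an archive list and leave
    one summary record holding their combined quantity."""
    old = [r for r in records if r["date"] < cutoff]
    keep = [r for r in records if r["date"] >= cutoff]
    if old:
        summary = {"date": "through " + max(r["date"] for r in old),
                   "qty": sum(r["qty"] for r in old)}
        keep.insert(0, summary)
    return keep, old

calls, archived = archive(calls, "2007-08-22")

# Reports sum Qty rather than counting records, so the total is still 3.
print(sum(r["qty"] for r in calls))
```

The same principle applies whatever the storage: the live table stays small, and the summary record preserves the counts the reports need.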
aholtzapfel (Author) Posted August 24, 2007 Hey, that's a good idea. I never even considered archiving while leaving a "record" to hold the values of many; I suppose accountants do this every year with their bookwork. I'm not sure it would do everything I want, but it's certainly worth looking at. This is not something I would have thought of on my own (hence the post), thank you. I'd still love to hear if anyone else has other suggestions. (With the company I work for growing at this rate, I think this is only the beginning of these types of problems for me.)
Fenton Posted August 24, 2007 Yes, I was forced to do this on a large table. It is even possible to summarize data before archiving, if necessary. Say you have different categories of calls, and you still want to report on the categories clear back through time (though in this case that doesn't seem likely). You can create one record for each category, with a summarized quantity. The archived records are all still there, just in another table. So, if necessary, they can be made available in all their multitudes. It just needs a little interface work to facilitate that. Nothing really is lost.
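A rough sketch of the per-category variant, again in Python for illustration (the `category` field is an assumption, not part of the thread's schema): the archive step leaves one summary record per category, so category reports still add up correctly across the full history.

```python
from collections import Counter

# Records about to be archived; each carries Qty = 1 as above.
old_records = [
    {"category": "cold call", "qty": 1},
    {"category": "cold call", "qty": 1},
    {"category": "follow-up", "qty": 1},
]

def summarize_by_category(records):
    """Return one summary record per category with the combined Qty."""
    totals = Counter()
    for r in records:
        totals[r["category"]] += r["qty"]
    return [{"category": cat, "qty": qty} for cat, qty in totals.items()]

print(summarize_by_category(old_records))
# [{'category': 'cold call', 'qty': 2}, {'category': 'follow-up', 'qty': 1}]
```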
Vaughan Posted August 24, 2007 Thanks Fenton, that's a technique worth remembering.
Steven H. Blackwell Posted August 25, 2007 "I'm hosting the files on FileMaker Server 8. It runs on its own server (Windows 2003) with 1 GB of RAM." Way below spec. Try it with 4 GB and kick up the cache. Steven
aholtzapfel (Author) Posted August 28, 2007 OK, I've taken your advice and added 2 more GB of RAM to my server. (My server will only recognize up to 3.5 GB.) However, after rereading best practices for the server, I think my cache size may be too large. My cache hit percentage is consistently 100%. My cache flush interval is set to 1 minute (I have tried longer, but my system has so much activity that anything longer causes problems). Should I really "kick up the cache size," or should I reduce it (so average hits land around 95-98%)?