Cassetti Posted December 11, 2006 (edited) OK, this is a bit strange. We have a FileMaker Pro server with several databases, and recently several of them have been crashing. I found out that one of the databases is 500 MB in size! This is a database we rarely use; we use it only for tracking our quotes. We can't figure out why it is so huge. Is there any way to check what is taking up the space, or to look into everything? Our average size is around 30 to 100 MB. There is only 1 container field. There are 900 records, and half of them have a PDF in the container field; the other records have nothing in the container. The largest of these PDFs is 18 KB, so there is no reason why it should be so huge. Thanks! Edited December 11, 2006 by Guest
Fitch Posted December 11, 2006 If you use Insert... rather than Paste into your container fields, the file won't get so huge.
Cassetti Posted December 11, 2006 Author The odd part is that we are using Insert. I have a field called "external filename": you type in the filename of the quote (that's what this database is for) and click Attach, and it goes to that folder and inserts that file into the container.
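The attach script is roughly the following (a sketch; the field names and share path here are illustrative, not our real ones):

```
# Sketch of the attach script as FileMaker script steps (FMP 8-era syntax).
# "Quotes::External Filename", "Quotes::Attachment", and the share path
# are hypothetical names, not the actual ones in our file.
Set Variable [ $path ; Value: "filewin:/fileserver/quotes/" & Quotes::External Filename ]
Go to Field [ Quotes::Attachment ]
Insert File [ "$path" ]
# Note: if "Store only a reference to the file" is left unchecked,
# Insert File embeds the whole PDF in the container instead of
# storing a path, which makes the database grow with every attachment.
```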
xoomaster Posted December 12, 2006 Try to optimize the file using the File Maintenance option. Also check the integrity of the DB and run a recovery to see if it detects any errors. Rule out any potential relationship errors causing data to be added inappropriately! Xoomaster
Genx Posted December 12, 2006 ... That's why I never store anything in my container fields; I just use calcs that reference stored file paths on a shared drive somewhere. Makes life a lot easier, especially with backups.
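For example, an unstored calculation field with a Container result type can point at the file on the share (the path and field names here are hypothetical):

```
// Hypothetical calculated container field (calculation result: Container).
// It builds a Windows-style file reference from a stored text field, so
// nothing is embedded in the database and the file stays small.
"filewin:/fileserver/quotes/" & Quotes::External Filename
```

The trade-off is that the database only holds a pointer, so the share has to stay mounted at the same path for every client.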
Cassetti Posted December 12, 2006 Author Well, first, backups are no problem: our databases are stored in a folder on our NetApp data server. It kicks off a snapshot of that folder every hour, so in the event any problem is detected we can jump back x number of hours in time! The thing here is that this has been going on for months and I never found out about it. Also, we have another database with our service contracts. Each record has a PDF of the contract attached to it. That database has 4 or 5 tables, many more relationships, and PDF attachments that are 2 or 3 times the size of the average attachment in the database in question. And yet the service-contracts database is only 99 MB in size. Xoomaster, where is this optimize option at?
Ender Posted December 12, 2006 "Well, first, backups are no problem: our databases are stored in a folder on our NetApp data server. It kicks off a snapshot of that folder every hour, so in the event any problem is detected we can jump back x number of hours in time!" Copies of open files made in this manner are not safe to work with. Use Server's built-in backup scheduling to make a local backup, then take snapshots of those with your backup software.
Cassetti Posted December 12, 2006 Author That's probably what he's doing. I'm the developer; I'm the one who makes the scripts and layouts. Our network engineer takes care of everything technical.
Cassetti Posted December 12, 2006 Author Yup, I don't administer it; I just let him know when I need the password for big changes. It's unfortunate, but for security reasons he won't give me full rights to it.
xoomaster Posted December 16, 2006 The optimize option is under the Tools / File Maintenance sub-menu (if using FMP 8/8.5). Also try Recover once; sometimes there is a bad field definition or corruption.