
File continuously growing





I have a .fp7 file hosted on FMS9. Records are added via the FM8.5 client as well as through an incoming ODBC connection from a Tomcat web server. After the records are processed in one department, they get moved into our accounting database (a separate .fp7 file on the same server) for billing purposes, and the original records are removed from the first database. After accounting is done with the records, they get moved into an archive file.

Everything works fine in the file, but it starts at 40 MB and grows by 100-400 MB every day. It's grown to about 30 GB (yes, GB) after about a month.

If I save an empty clone and import the records into it, it drops back to about 40 MB with no data loss (except for indexes, I suppose). I've tried saving a compacted copy, running a recovery, and all that fun stuff, but the file stays huge until I import into a clean clone.
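For reference, that clone-and-import workaround can be scripted. A rough sketch as FileMaker script steps follows; the file and table names here are hypothetical, and the exact Import Records options will depend on the solution:

    # Sketch only: file and table names are hypothetical
    # 1. On a local copy (Save a Copy as is not available for hosted files),
    #    save a clone, which keeps the schema but no records:
    Save a Copy as [ "Main_Clone.fp7" ; clone (no records) ]
    # 2. Open the clone and pull the data across from the old file:
    Import Records [ Source: "Main.fp7" ; Target: "MainTable" ; matching field names ]
    # 3. Verify record counts, then swap the clone in for the old file on the server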

I even tried taking the file and deleting everything in it. I had a file with no records, no tables, no scripts, no value lists, and no external references, but it was still over 10 GB. Even running a recovery on this empty file gave me a huge file.

The main file generally has around 14,000 records in it at any given time. The accounting file is similar but has no problems with ballooning file size; it's currently around 150 MB with nearly 100k records. The archive file that all records eventually end up in is around 500 MB with around 300k records.

The only big difference I can think of between the accounting file and the main file is the ODBC connection.

Does anyone have any ideas why this file keeps growing like this? I do the clone-and-import process every couple of weeks to keep the file size down, and everything works okay, but I'd much rather not have to worry about it.

Thanks,

-Craig Coleman



I've gone through the indexes with a fine-tooth comb, enabling only exactly what I need. As for PDFs, there are about 100 PDFs in there, but they don't change often, and the accounting file holds the same ones. Even deleting those PDFs doesn't help.


14,000 records really isn't a significant number of records, and certainly not enough to warrant that kind of bloating (30 GB). Hmmmm...

How many fields do you have indexed? Are there lots of table occurrences (TOs) in your solution?


  • 7 years later...

Hi ZPData and welcome to the FMForums,

Please finish your profile by filling in which version of FileMaker, operating system, and platform you are using.

 


On 4/15/2016 at 2:53 PM, Lee Smith said:

Hi ZPData and welcome to the FMForums,

Please finish your profile by filling in which version of FileMaker, operating system, and platform you are using.

 

Done!


I would suspect there is some kind of BLOB (container) data inadvertently being imported. Imported PDFs or high-res images are the typical culprits, but without seeing the file I can't pinpoint it exactly.

Although you don't suspect container data, can you check the records that DO have container data in them and compare the file sizes using the GetContainerAttribute ( field ; "fileSize" ) function? You could even sort the records by that number and take a look at the larger ones.
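For example, a minimal sketch of that check as a calculation field; the field names here are hypothetical:

    // Unstored calculation field "cContainerSize", number result (names hypothetical)
    GetContainerAttribute ( PDFContainer ; "fileSize" )
    // Sort the found set descending on this field to surface the biggest files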

Have you compacted or done any maintenance on this file? 


I am 99% sure it has nothing to do with containers. As far as I'm aware, nothing is being stored in containers at all...

I have compacted, which cuts the file down by about 1.5 GB. I have also recovered. There is a long-standing issue that Recover reports (which we are ignoring; we continue to use the unrecovered file). The recovered file is the same size, and compresses to the same size, as the unrecovered file, so Recover is not fixing this issue.

My next move is to delete tables one at a time in a copy of the database and see whether a particular table has a big effect.


  • 2 weeks later...

I solved the problem. It wasn't corruption. I had a server-side script running nightly to work around the old "Today" function problem from FileMaker 6; that script updates 300k records every night. I had also introduced an audit log into the databases, and the logs were growing by 1 GB per month. See https://www.excelisys.com/fm-tips/filemaker-pro-13-tip-n-trick-easy-bake-filemaker-pro-13-audit-trail/

Amazing how tiny log additions (maybe 100 bytes each) added up to 1 GB per month: 300,000 records × ~100 bytes per night × 30 nights comes to roughly 900 MB.
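For anyone curious about the pattern, here is a minimal sketch of that kind of audit-trail field as an auto-enter calculation; the field name and log format are assumptions, not necessarily what the linked tip uses:

    // Auto-enter calculation on a text field (e.g. "AuditLog", hypothetical name),
    // with "Do not replace existing value" unchecked so it fires on every edit.
    // Each edit prepends roughly 100 bytes, which is where the growth came from.
    Let ( [
      ts  = Get ( CurrentTimeStamp ) ;
      who = Get ( AccountName )
    ] ;
      ts & " | " & who & " | record modified¶" & Self
    )

With a nightly server-side script touching every record, each of those prepended lines lands on all 300k records, which is how a ~100-byte calculation turns into gigabytes.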

