
This topic is 7126 days old. Please don't post here. Open a new topic instead.

Recommended Posts

  • Newbies
Posted

btw, I'm on Tiger, not Panther...

Does anyone have experience with gigantamongous databases?

I have a fairly beefy machine (PowerMac G5, dual 2 GHz, 4 GB RAM, SATA drives) and have had significant problems importing and working with ultra-large files. (My import is more than 10 GB, with more than 80 million records.) The files sometimes won't import at all, and even when they do, indexing a field crashes the file and makes it unrecoverable.

I've called FM, but apparently I'm in uncharted territory here, and they are not able to help despite their best efforts....

Posted

I have not done this sort of thing in FMP 7, but I just finished a similar project in FMP 6. In that case it was 1,000,000 records.

I believe the problem is a combination of machine limitations and the program itself. An FMP 7 file can hold 8 terabytes, but when you go to do things, you cause the program to create internal indices, etc., which boost the file size and can cause a freeze or meltdown if it hits machine limits or FMP's overall file limits. In this case, I don't think the 8-terabyte limit is the factor; more probably it's hard disk space or internal memory being eaten up when you try to do something. Note that this is a guess, not a given.

For FMP 6 I found that the program could comfortably handle about 300,000 records, so I worked around this limitation. Here, you may have to do something similar.

What I did to get data in and benchmark this may help you segregate data and find the limits.

My data came as a CSV file. Instead of importing it directly into the database, I dropped it on the FMP icon and let it make its own "no frills" database. I could search this file without self-destruction. I then found sets of records and gradually added records to the "real" database until it lacked stability. That gave me a benchmark, so to speak, and from there I designed around it.
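The incremental-load approach above can be sketched in code. FileMaker itself isn't scriptable this way, so this hypothetical example uses SQLite as a stand-in to illustrate the idea: add records in fixed-size batches and time each batch, so you can spot the record count where performance starts to degrade.

```python
# Sketch only: SQLite stands in for the target database, and the two-column
# schema is invented for illustration.
import csv
import io
import sqlite3
import time

def benchmark_incremental_load(csv_text, batch_size=1000):
    """Insert rows in batches, returning (total_rows_loaded, seconds) per batch."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE records (a TEXT, b TEXT)")
    reader = csv.reader(io.StringIO(csv_text))
    timings, batch, total = [], [], 0

    def flush(batch, total):
        t0 = time.perf_counter()
        conn.executemany("INSERT INTO records VALUES (?, ?)", batch)
        conn.commit()
        total += len(batch)
        timings.append((total, time.perf_counter() - t0))
        return total

    for row in reader:
        batch.append(row)
        if len(batch) >= batch_size:
            total = flush(batch, total)
            batch = []
    if batch:
        total = flush(batch, total)
    return timings

# Example: 2,500 two-column rows loaded in batches of 1,000.
sample = "\n".join(f"id{i},val{i}" for i in range(2500))
results = benchmark_incremental_load(sample, batch_size=1000)
```

A sharp jump in per-batch time (or a crash) marks the practical ceiling to design around, much as the poster did by hand.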

HTH

Dave McQueen

Posted

FileMaker does have its practical limits aside from the numeric ones. I have worked on solutions that were too top-heavy (lots of clever code, checking routines, portals, plugins, etc.) and started to bog down with just a few hundred records. Before you get too far down the road to turn back, take a look at Servoy (servoy.com). It may be the answer to your problem. Servoy has a developer environment very similar to FMP's (they designed it that way on purpose), but it uses databases that can far more readily handle the immense quantity of records you need.

My experience with FMP is that you can build "almost" anything with it, but sometimes you end up with a beautiful solution that is glacially slow. You'd be amazed how many experienced developers will build a solution without ever loading it (adding a lot of records) to see how it will actually perform over the long haul. Plugins can be a real trap in this regard: they seem great on paper, but some of them can slow a solution to a crawl in short order. I believe in this case you may be in "almost" territory. Give Servoy a look and let me know what you end up doing.
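One way to "load" a solution before trusting it, per the advice above, is to generate a large synthetic data set and import it. This is a hedged sketch; the field names and record count are invented for illustration.

```python
# Write plausible-looking test records to a CSV file for import/load testing.
import csv
import random
import string

def write_synthetic_csv(path, n_records):
    """Write a header plus n_records rows of synthetic data to path."""
    rng = random.Random(42)  # fixed seed so runs are repeatable
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "name", "amount"])
        for i in range(n_records):
            name = "".join(rng.choices(string.ascii_lowercase, k=8))
            writer.writerow([i, name, round(rng.uniform(0, 1000), 2)])

write_synthetic_csv("loadtest.csv", 10_000)
```

Scale the record count toward the volume you expect in production; a solution that feels snappy at a hundred records can behave very differently at a hundred thousand.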

I'd love to know what kind of DB you're working on that contains 80 million records!? I'm impressed!
