Hashir Raja

High volume of records

Recommended Posts

Hi, I would like to know how well FileMaker can handle large amounts of data in a table.

Our current system, which is not in FileMaker, receives records with passenger information via XML. We get records as soon as they are available, sometimes 10 to 20 records an hour; on a weekly basis that comes to about 4,000 records.

How would I store these records so they can be viewed later for reporting purposes? I can imagine how big the database would get within a year or two.

There will also be about 40 concurrent connections through iPad. Employees who connect through iPad will only have access to about 2 or 3 layouts.

It's a dispatch system. Will FileMaker be able to handle all these records? Also, we get records a day or two in advance, so I would only be showing records that match the current date. Will a find request to filter through that many records be rough on performance?



FileMaker can easily handle this volume of transactions - one of my systems imports c. 100K records a week via XML and has up to 30 concurrent FileMaker Pro users.

How are the iPad people connecting? FMGo, WebDirect?


Hey, thanks for the reply.

They will connect through FileMaker Go.

How does the XML import work? Is there a script I have to run constantly in case the vendor sends new data?

Also, what is the best way to archive old data? And when does it normally need to be archived? I would imagine the only reason for that would be that you're running out of storage space.


XML is one of the import formats FileMaker understands natively, although you may need an XSLT stylesheet if the XML doesn't arrive in one of FileMaker's two native XML grammars.
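To make the reshaping concrete, here is a minimal Python sketch that converts a made-up vendor feed into FileMaker's FMPXMLRESULT import grammar (in practice you'd express the same mapping as an XSLT applied during import). The vendor element names (`<booking>`, `<name>`, `<flight>`, `<pickup_date>`) are assumptions, not the real feed.

```python
# Sketch only: reshape a hypothetical vendor feed into the
# FMPXMLRESULT grammar that FileMaker can import directly.
import xml.etree.ElementTree as ET

# Assumed vendor feed structure -- not the actual vendor's schema.
VENDOR_XML = """<bookings>
  <booking>
    <name>J. Doe</name>
    <flight>BA117</flight>
    <pickup_date>2017-03-02</pickup_date>
  </booking>
</bookings>"""

FIELDS = ["name", "flight", "pickup_date"]

def to_fmpxmlresult(vendor_xml: str) -> str:
    """Build an FMPXMLRESULT document from the vendor XML."""
    src = ET.fromstring(vendor_xml)
    bookings = src.findall("booking")

    root = ET.Element("FMPXMLRESULT",
                      xmlns="http://www.filemaker.com/fmpxmlresult")
    ET.SubElement(root, "ERRORCODE").text = "0"
    ET.SubElement(root, "PRODUCT", NAME="feed-converter",
                  VERSION="1", BUILD="1")
    ET.SubElement(root, "DATABASE", NAME="", RECORDS=str(len(bookings)),
                  DATEFORMAT="yyyy-MM-dd", TIMEFORMAT="HH:mm:ss", LAYOUT="")

    # One FIELD entry per destination field, in import order.
    metadata = ET.SubElement(root, "METADATA")
    for f in FIELDS:
        ET.SubElement(metadata, "FIELD", NAME=f, TYPE="TEXT",
                      EMPTYOK="YES", MAXREPEAT="1")

    # One ROW per vendor record, COLs in the same order as METADATA.
    resultset = ET.SubElement(root, "RESULTSET", FOUND=str(len(bookings)))
    for booking in bookings:
        row = ET.SubElement(resultset, "ROW")
        for f in FIELDS:
            col = ET.SubElement(row, "COL")
            ET.SubElement(col, "DATA").text = booking.findtext(f, default="")

    return ET.tostring(root, encoding="unicode")
```

For the "constantly running script" question: a server-side scheduled script that checks for new files and imports them is the usual pattern, so nothing has to stay open on a client.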

There's no standard rule for data archiving; it depends strongly on the specific use. FileMaker can hold and index an enormous amount of data, but in certain circumstances it's useful to offload old records, such as when you frequently need to do finds on unstored calculations, which slow down as the record count grows.
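The archiving decision above usually reduces to a date cutoff: recent records stay live, older ones get moved to an archive table or file. A minimal sketch of that pattern, assuming each record carries a pickup date and using an arbitrary 180-day retention window (both assumptions, not FileMaker defaults):

```python
# Sketch of the date-cutoff archive pattern discussed above.
from datetime import date, timedelta

def partition_records(records, today, keep_days=180):
    """Split records into a live set and an archive set by age.

    `records` is a list of dicts with a `pickup_date` key (a
    datetime.date); the 180-day window is an illustrative choice.
    """
    cutoff = today - timedelta(days=keep_days)
    live = [r for r in records if r["pickup_date"] >= cutoff]
    archive = [r for r in records if r["pickup_date"] < cutoff]
    return live, archive

# Example: one recent booking stays live, one old booking is archived.
records = [
    {"id": 1, "pickup_date": date(2017, 3, 1)},
    {"id": 2, "pickup_date": date(2015, 6, 1)},
]
live, archive = partition_records(records, today=date(2017, 3, 2))
```

In FileMaker terms, the same split would be a scripted find on the date field followed by an export or a move to an archive table.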

