
Which is faster: One file or many files


This topic is 7362 days old. Please don't post here. Open a new topic instead.

Recommended Posts

A friend who develops in other programs tells me that for a large-scale, multiuser project it would be best to use one file, so as to maximize speed of use. Never having used any other program, I assume the same concepts apply in FileMaker.

In FileMaker, I conceive of separate files that each contain records of a different category. I wouldn't necessarily consider putting customers and inventory in the same file. But maybe I could. Would this make the solution faster? I could use self-relationships for all my portals, etc. It would be one huge collection of fields (certainly one advantage of multiple files is that it's easier to keep track of everything and to conceptualize solutions).

Are there any limitations to this approach? What makes FileMaker run faster or slower in terms of design approach?


Having a huge collection of fields in one file would most definitely make your solution slower rather than speed it up.

Relational databases are the way to go for large scale solutions.

Think about a large-scale solution trying to deal with all its calculations in one file - not good.
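To make the one-file-versus-relational contrast concrete, here is a hypothetical sketch in SQL terms (via Python's sqlite3, not FileMaker syntax; the table and field names are invented for illustration): customers and their invoices live in separate tables related by a key, so a lookup touches only the related child records instead of one huge collection of fields.

```python
import sqlite3

# Two related tables instead of one wide "everything" table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE invoices  (id INTEGER PRIMARY KEY,
                            customer_id INTEGER REFERENCES customers(id),
                            total REAL);
""")
con.execute("INSERT INTO customers VALUES (1, 'Acme')")
con.executemany("INSERT INTO invoices VALUES (?, ?, ?)",
                [(1, 1, 100.0), (2, 1, 50.0)])

# The relationship (join on the key) pulls only the child records
# belonging to one customer - the role a portal relationship plays.
rows = con.execute("""
    SELECT i.total FROM invoices i
    JOIN customers c ON c.id = i.customer_id
    WHERE c.name = 'Acme'
""").fetchall()
print(sum(t for (t,) in rows))  # 150.0
```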

I'm sure some of the bigger gurus in here will give you a more thorough answer than this, but I'm sure they'll agree with the basic idea of my response...

Best regards

Ed


Hi Jason,

This would be a nightmarish structure to develop, for sure, and very limiting on the user side.

Relational design is safer, logical, smooth, and dynamic, so it is particularly suited to situations where your solution needs to follow the company's evolution.

To be sure, speed can be problematic in a purely relational scheme, as it can lead to an accumulation of unstored calculations.

Once the relationship diagram is worked out, you'd necessarily use scripts to transfer and index related data.


Speed often has more to do with how calculations are defined, how layouts are designed, how navigation, Finds and scripts are performed than just file size or how many files there are.

One thing that makes a big difference is handling navigation so that a user can quickly find only the subset of records they need to see. My experience is that a simple relationship is faster than a Find, so subsets they often need to see, such as "child records for this customer," should be visible in a portal, and/or isolated in the child file via a relationship, in list view.

Complex calculations using related fields and summaries should not be used in List views unless absolutely needed, best with only a subset of records. It is better to have a button for "Totals" than to make users wait every time they view a list.

Finds on unstored fields should be avoided whenever possible. It is much faster to build a routine that uses global fields for criteria entry, does the Find in the child file, captures the IDs, then uses a relationship on those IDs (pasted into a global field) to return to the main file. Sometimes the routine requires a Loop in the main file upon return, to filter out some records. It can be difficult with complex Finds, but with simpler Finds it can be many times faster than a Find on unstored related fields.
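The routine above can be sketched in plain Python (a hypothetical illustration, not FileMaker script steps; the data and thresholds are invented): find matching records in the child file first, capture their parent IDs, then use a keyed lookup (the "relationship") to return the parent records directly, instead of scanning every parent with an unstored-field Find.

```python
# Parent "file": customer ID -> customer name.
customers = {1: "Alice", 2: "Bob", 3: "Carol"}

# Child "file": (order_id, customer_id, amount).
orders = [
    (10, 1, 75.0), (11, 2, 20.0), (12, 1, 5.0), (13, 3, 90.0),
]

# Step 1: do the Find in the child file, on stored criteria.
matching = [o for o in orders if o[2] >= 50.0]

# Step 2: capture the IDs (the "paste into a global field" step).
ids = {customer_id for (_, customer_id, _) in matching}

# Step 3: the relationship on those IDs returns the parent records
# directly - no scan of the whole parent file.
found = sorted(customers[i] for i in ids)
print(found)  # ['Alice', 'Carol']
```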

All scripts should be optimized for speed. Loops especially need Freeze Window and View as Form (many times faster than List view).

As Ugo said, redundant data is allowed if it is really needed for speed. But it must not "bend" relational integrity; you have to make sure it's appropriately updated. In many cases, it's never edited, so that's not a problem.
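The redundant-data idea can be sketched as follows (a hypothetical Python illustration; the class and field names are invented): a computed total is stored on the parent record so list views read it instantly, and a single update path keeps the stored copy consistent with the child records whenever one changes.

```python
class Customer:
    def __init__(self, name):
        self.name = name
        self.orders = []
        self.order_total = 0.0  # redundant copy, stored for fast list views

    def add_order(self, amount):
        # The one place the redundant copy is updated, so it can't
        # drift out of step with the child records.
        self.orders.append(amount)
        self.order_total += amount

c = Customer("Acme")
c.add_order(40.0)
c.add_order(60.0)
print(c.order_total)                    # 100.0, read with no recalculation
assert c.order_total == sum(c.orders)   # integrity check
```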


Jason:

Multiple files definitely make for a faster user experience. Some of the solutions I have designed contain upwards of 2 GB of data; however, the largest file in any of those solutions is around 600 MB. The longest the user ever has to wait in this setup is when sorting that entire 600 MB file (and displaying it as a list), which happens infrequently. Because it is only a subset of the entire solution, that large file can for the most part be ignored, and the users find themselves loading much smaller amounts of data over the network.

-Stanley


