
This topic is 5881 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

Hello,

My solution is a 28 MB database that loads painfully slowly (the clone without records is about 5 MB). I'm hosting the database and only have 2 clients. The host machine is a Mac Pro (3 GHz quad core, 16 GB RAM) with FMP 9 sharing via TCP. This solution takes about 10 minutes to initially load, with both the server and client having at least 0.5 Mbps upload/download speeds. When loading locally, the same computers take only about 5 seconds. The client machines are both MacBook Pros (at least 2 GHz with at least 2 GB RAM) with FMP 9.

I understand that the following question has a lot of variables, so I'm not looking for a definitive answer, but even some good resources would be appreciated. How do I make this database load more quickly? I've read other forums, looked at the various reasons a database can be slow, and feel that my structure is optimized. I don't know much about networking. I've tried a trial version of FileMaker Server without any apparent improvement. Switching to a Windows machine is not an option.

Ideas? What other info do you experts need? Thanks!

Posted

The basic principle is: minimise the amount of data transferred. Data as in fields and records, and data as in layout elements.

This means:

removing graphics from layouts

minimising the fields (especially summary fields) on layouts

using form view instead of list and table view

not sorting record sets

... and so on.
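To put the principle in perspective, here is a back-of-envelope calculation (mine, not from the thread): moving anything close to the full 28 MB file over a 0.5 Mbps link takes minutes, which is roughly the load time described above. FileMaker doesn't transfer the whole file on open, but the arithmetic shows the order of magnitude involved.

```python
# Rough bandwidth arithmetic (illustrative only): seconds needed to move
# a given amount of data over a link of a given speed.
def transfer_seconds(megabytes: float, megabits_per_second: float) -> float:
    """Time in seconds to move `megabytes` over a `megabits_per_second` link."""
    megabits = megabytes * 8  # 1 byte = 8 bits
    return megabits / megabits_per_second

# The 28 MB solution over the 0.5 Mbps connection from the first post:
seconds = transfer_seconds(28, 0.5)
print(f"{seconds / 60:.1f} minutes")  # 7.5 minutes
```

Every megabyte trimmed from what the client must pull down shaves tens of seconds at that line speed, which is why removing layout data pays off so directly.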

If the clone alone is 5 MB, there could be ample opportunity to optimise the database.

Start by making copies of the opening layout and removing *everything* except the basic fields.

Check the startup script for any processes that use a lot of data, like sorts etc.

Posted (edited)

Thanks Vaughan. That's helpful, as I do have some of the records being sorted on opening. I already use form view almost everywhere, with the exception of the list of clients that opens at the beginning. I'll try removing that and see what progress I get.

I have my solution trimmed down to 83 layouts, but really cannot do without those. I've also trimmed down the graphics as much as I can.

Regarding summary fields, I only have one and rarely use it. It is not on the layouts that I use most.

Is there another way to host/serve that would be faster than TCP? I've seen mention of server-side processing, but do I need to do something to make that happen? I've also seen mention of things like Citrix; would something like that solve my problems?

Other ideas to increase speed?

Edited by Guest
Posted

"Is there another way to host/serve that would be faster than TCP?"

FM Server *only* uses TCP now. The alternatives in older versions were (IIRC) AppleTalk on Macs and a Novell Netware protocol for PCs.

Citrix is a remote terminal solution where the processing happens on a machine local to the server, with just the keyboard, mouse and video information being transferred to the remote user. It's complex and expensive and has its own issues.

I don't think you'll find any single silver bullet to magically increase the speed. It's going to take a lot of work to optimise, including some detailed investigation and lots of testing to work out where the bottlenecks really are.

Posted

"I've also trimmed down the graphics as much as I can."

There is a scene in Red Dwarf where they have to abandon ship, and the Cat has got 10 racks of clothes to bring, all absolutely essential.

Try taking out *all* the graphics and test to see what happens. Use a duplicate layout to test with. Have no regard for appearance.

Posted

You haven't mentioned the degree of normalization performed. It should be no secret that inappropriate structuring of a solution has a lot to do with how swift or responsive it feels.

Too often the hardware is wrongly blamed for a less-skilled deployer's whims, or for an unlucky choice of tools.

It says something that even some of the faster tools around must stress the importance of thinking this through:

http://dev.mysql.com/tech-resources/articles/intro-to-normalization.html

http://www.bizjournals.com/phoenix/stories/1998/10/19/smallb4.html

...is your solution barking up the wrong tree here?

--sd

Posted

:-) I recall that scene. I'll try it without the graphics.

This solution I'm working off of is actually the first solution I ever built, and it has evolved over the past 16 months. Therefore, I'm sure I have a lot of the structural problems that beginners make. With that being said, I'm not sure where to start in restructuring. Prior to the post above, I had never heard of normalization, which again means there is probably a lot I need to learn there. I intend to read those articles in detail. Thanks for the feedback... I am in this to learn (while salvaging some of the work I've done so far in my solution).

Regarding an earlier comment about avoiding "summary fields"... I assumed that meant avoiding fields of the summary type, but if I'm using a lot of Sum functions in calculations (which I am), could that be causing it to slow down too? Looking back through my solution, I have numerous calculations that use Sum functions.

How about I open my structure up for some scrutiny? I have several child tables with what I consider a lot of fields that maybe I should restructure. In fact, I have one table with over 600 fields. That particular table has 1 field for every question on a particular psychological test (close to 550 questions), with the remaining fields containing calculations to derive different scales from those items. The parent table contains my patients' info and has a one-to-many relationship with the child table containing the patient's answers and scores on those particular psychological tests. Is that poorly structured? I guess I could have one field to hold all of the test answers with each answer in a record, but I'm not sure how I would accomplish all of my calculations if I did it that way.
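One normalised shape for a table like that (a sketch under assumed table and field names, not the poster's actual schema) stores one record per answer and derives each scale with a grouped sum, so no table needs 600 fields:

```python
# Hypothetical normalised schema: one row per answer instead of ~550
# answer fields, with a lookup table saying which questions feed which scale.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE answers (
    patient_id  INTEGER,
    question_no INTEGER,
    answer      INTEGER      -- the raw item score
);
CREATE TABLE scale_items (   -- which questions feed which scale
    scale_name  TEXT,
    question_no INTEGER
);
""")
con.executemany("INSERT INTO answers VALUES (?, ?, ?)",
                [(1, 1, 3), (1, 2, 1), (1, 3, 4)])
con.executemany("INSERT INTO scale_items VALUES (?, ?)",
                [("Anxiety", 1), ("Anxiety", 3), ("Depression", 2)])

# Each scale becomes a grouped sum computed on demand,
# not a dedicated calculation field per scale:
rows = con.execute("""
    SELECT s.scale_name, SUM(a.answer)
    FROM answers a JOIN scale_items s USING (question_no)
    WHERE a.patient_id = 1
    GROUP BY s.scale_name
""").fetchall()
print(dict(rows))
```

Adding a new scale then means inserting rows into `scale_items`, not defining new fields, which is the kind of flexibility the normalization articles above argue for.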

Posted

Does your solution have tons of unstored calcs? Lots of table occurrences? What kind of hard drive and network card do you have?

Posted

I have 39 tables and at least a couple hundred unstored calcs. My hard drive is a 7200 rpm Serial ATA 3 Gb/s drive. I think my network card is 1000BASE-T Ethernet on the server side, and the client computers use wireless 802.11n connections.

I recently went through and changed most of the stored calcs to unstored calcs in hopes of making it quicker (I could be wrong).

Posted

Unstored calcs will slow things down when going to any layout they appear on. Second, have you tested and benchmarked when hardwired?
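The stored/unstored distinction can be illustrated outside FileMaker (a Python analogy of my own, not FileMaker code): a stored calculation pays its cost once when the record changes, while an unstored one pays it again on every display.

```python
# Illustration only: "stored" = computed once at write time;
# "unstored" = recomputed every time the value is displayed.
class Record:
    def __init__(self, items):
        self.items = items
        self.total_stored = sum(items)   # stored calc: cost paid at write time

    def set_items(self, items):
        self.items = items
        self.total_stored = sum(items)   # refresh the stored value on change

    def total_unstored(self):
        return sum(self.items)           # unstored calc: cost paid on every read

r = Record([2, 3, 5])
print(r.total_stored)       # 10 -- reading is a plain field fetch
print(r.total_unstored())   # 10 -- same answer, recomputed each call
```

This is why converting stored calcs to unstored ones tends to make layouts slower, not faster: the work moves from the occasional write to every single read.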

Posted (edited)

Yes, I've tried the laptop hardwired with no discernible difference. My transition from record to record and layout to layout isn't as much of an issue (2-5 seconds). It is the startup time that is my biggest concern (about 10 minutes recently).

I also don't have any issues or delays when on the LAN.

Edited by Guest
Posted

"It is the startup time that is my biggest concern (about 10 minutes recently)."

Check the file references.

Posted

I guess I could have one field to hold all of the test answers with each answer in a record, but I'm not sure how I would accomplish all of my calculations if I did it that way.

Exactly what needs to be done. You need to change a calc-field-intensive approach into a subsummary-reporting one:

...the reason is that beyond the obvious violation of common database principles, it goes against the grain of the tool as well:

http://fmforums.com/forum/showpost.php?post/206543/

--sd
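As a rough analogy (mine, not from the linked post), a subsummary report sorts the found set by a break field and computes each group's total once, instead of keeping a calculation field on every record:

```python
# Subsummary-style reporting: sort by the break field, then emit one
# subtotal per group -- no per-record calculation fields involved.
from itertools import groupby

records = [
    {"patient": "A", "score": 3},
    {"patient": "B", "score": 4},
    {"patient": "A", "score": 5},
]

# groupby requires the data to be sorted by the grouping key first,
# just as a subsummary part requires the found set to be sorted.
records.sort(key=lambda r: r["patient"])
report = {patient: sum(r["score"] for r in group)
          for patient, group in groupby(records, key=lambda r: r["patient"])}
print(report)
```

The totals exist only while the report is produced, so nothing has to be stored or kept up to date on the individual records.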

Posted

Thanks! That looks like a novel approach that I should try someday, but it would require a lot of restructuring and changing a lot of calculations, so it's not something I can try quickly to see if it works.


Posted

Regarding the earlier suggestion to check the file references: I tried reloading the database immediately after removing a reference and noticed no difference. I tried an hour later, and now it is loading in 4 minutes! I'm excited about this 60% reduction in load time. I'm not sure why it behaved like that, but am pleased.

Posted

Instead, blame Ryan Rosenberg for clinging to the spreadsheet metaphor, no matter how misleading it actually is.

The strategy is to make a tool that in fact requires craftsmanship appear to be a no-brainer. You can easily end up teaching yourself bad habits that way.

It should be said that unless you are really skilled, you can't make two of the selling points work in your favour. I mean the tennis-racket or porta-studio claims that the ability to create runtimes and IWP proves serious ROI.

It can never be stressed too much that there are horses for courses, and FileMaker's strength is workgroups on the same subnet of a LAN.

--sd
