
Posted

I created a database to keep track of a couple hundred FTP accounts and it works well enough. I created a script in that database to generate an exported file that, with minimal tweaking, becomes an XML file that FileZilla can import as a bookmark list. (The "minimal tweaking" is basically doing a global search and replace in a text editor to insert line breaks, since FMP won't export a line break.)
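One way to keep that search and replace dead simple is to have the calculation drop an arbitrary token wherever a line break belongs, then replace the token with a real line break in the text editor. A fragment of such a calculation might look like this (the "~BR~" token and the field names are only illustrations, not the actual ones):

// "~BR~" is an arbitrary marker; Server and Username stand in for the real field names
"<Host>" & Server & "</Host>" & "~BR~" &
"<Port>21</Port>" & "~BR~" &
"<User>" & Username & "</User>"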

To accomplish this I created a new table with three fields: XML Header, XML Body, and XML Footer. There is only one record in this table, and it contains the XML header and footer information. The script I created fills in the body field with the information from the FTP Accounts table. My script:


Go to Layout ["Summary View" (FTP Accounts)]

Show All Records

Sort Records [Restore; No dialog]

Go to Record/Request/Page [First]

Loop

  Set Variable [$XMLexport; Value: $XMLexport & FTP Accounts::XML Export & ¶]

  Go to Record/Request/Page [Next, Exit after last]

End Loop

Go to Layout ["XML Export" (XML Export)]

Insert Calculated Result [select; XML Export::XML Body; $XMLexport]



The "FTP Accounts::XML Export" field is a calculation field that aggregates the FTP account information into a chunk of XML code for FileZilla. For example, if the FTP account information was:



Account Name: My FTP Account

Server: ftp.nowhere.com

Username: anonymous

Password: mypassword



The "FTP Accounts::XML Export" field would turn that information into what FileZilla needs:







<Server>
  <Host>ftp.nowhere.com</Host>
  <Port>21</Port>
  <Protocol>0</Protocol>
  <Type>0</Type>
  <User>anonymous</User>
  <Pass>mypassword</Pass>
  <Logontype>1</Logontype>
  <TimezoneOffset>0</TimezoneOffset>
  <PasvMode>MODE_DEFAULT</PasvMode>
  <MaximumMultipleConnections>0</MaximumMultipleConnections>
  <EncodingType>Auto</EncodingType>
  <BypassProxy>0</BypassProxy>
  <Name>My FTP Account</Name>
  <Comments></Comments>
  <LocalDir></LocalDir>
  My FTP Account
</Server>
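The calculation behind "FTP Accounts::XML Export" is essentially one long concatenation of literal tags and field values, something along these lines (heavily abridged, and the field names here are generic stand-ins for the real ones):

// abridged sketch of the XML Export calculation; field names are placeholders
"<Server>" & ¶ &
"  <Host>" & Server & "</Host>" & ¶ &
"  <Port>21</Port>" & ¶ &
"  <User>" & Username & "</User>" & ¶ &
"  <Pass>" & Password & "</Pass>" & ¶ &
"  <Name>" & Account Name & "</Name>" & ¶ &
"</Server>"

The remaining fixed tags (Protocol, Logontype, PasvMode, and so on) are just more literal text in the same chain, and the ¶ separators (or a token like the one mentioned above) are what the post-export search and replace has to turn into real line breaks.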

With that in mind, you can see what the script is doing. It puts the contents of "FTP Accounts::XML Export" into the $XMLexport variable, then appends "FTP Accounts::XML Export" from the next record to the end of it, and proceeds down the line. Eventually the last record has been processed and $XMLexport contains all of the "FTP Accounts::XML Export" data. The script then moves to the XML Export table and inserts the collected data into the "XML Export::XML Body" field. (The next step, which I haven't put into the script yet, would be to export that single record from the XML Export table, which would contain my nearly-complete bookmarks file.)
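That last export step could probably be handled with "Export Field Contents" once the header, body, and footer are combined into a single field; roughly like this (the "XML Full" field and the file name are just placeholders):

Go to Layout ["XML Export" (XML Export)]
Set Field [XML Export::XML Full; XML Export::XML Header & ¶ & XML Export::XML Body & ¶ & XML Export::XML Footer]
Export Field Contents [XML Export::XML Full; "FileZilla Bookmarks.xml"]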

Here's where I run into the problem. The script runs quickly and smoothly while it collects all of the information, even with several hundred records. But when the script gets to the "Insert Calculated Result" step, FM hangs for a VERY long time, as in several hours. Even if the script is processing only a handful of records - say, five or ten - there's a noticeable hang of about five seconds during the "Insert Calculated Result" step.

After FM recovers, the data is in the field as needed, but for as long as you're in the "XML Export" table, FM is extremely laggy. Once you move into the "FTP Accounts" table, the database runs as quickly and as smoothly as ever. Move back into "XML Export" and it's lag galore.

Have I hit upon some limitation of FM? I have a completely unrelated database with text fields holding dozens of pages of text, and there's no lag whatsoever, so I'm not sure where the problem lies. I've also tried using "Set Field" instead of "Insert Calculated Result" and got the same lagging issue. I've even tried skipping the variable entirely, having FM flip between the two tables to move the data one record at a time, but the lag still happens.

Any ideas as to what's happening?

Posted

Well, I found a workaround, but I'd still like to know what's causing the lag.

If I put a "Freeze Window" command at the top of my script, everything works perfectly and without the lag. I also added steps to my script that erase the "XML Body" field once the export is done, which avoids the lag I would otherwise hit by entering that table while the data was still sitting there.
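In rough outline, the adjusted script looks like this (abridged):

Freeze Window
Go to Layout ["Summary View" (FTP Accounts)]
# ... collect $XMLexport in the loop, exactly as before ...
Go to Layout ["XML Export" (XML Export)]
Insert Calculated Result [select; XML Export::XML Body; $XMLexport]
# ... export the record here ...
Set Field [XML Export::XML Body; ""]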

Posted

Maybe I'm missing something, but I think this would be easier handled as a calculated field set up right in the field definition.

After that, all you should need to do is export the appropriate field from whichever set of records you decide on.

I don't see the reason to generate the results for the entire table every time.
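In other words, the whole export could be as simple as something like this (the file name is an example, and the export order would contain only the XML Export field):

Go to Layout ["Summary View" (FTP Accounts)]
Show All Records
# or Perform Find for just the accounts you want
Sort Records [Restore; No dialog]
Export Records [No dialog; "FileZilla Bookmarks.tab"]
# export order: FTP Accounts::XML Export only; the same post-export tweak applies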
