FMForums.com


Featured Replies

I know I have added this to another post in a different area, but I think to get an answer I need to post an actual question here...

My script is bogging down and I'm not sure why. I'm using a loop and have cleaned it up a bit so that it's all in one script. (Previously I had a "Perform Script" step in my loop.)

table::list

table::isbns

table::response_calc

table::response_text

All are global fields... as are the others needed... wondering if that's the problem?


.......first part to get a list....

Set Field [table::list .....]

Freeze Window  (Doesn't seem to work for me...)

Loop

  Set Field [ table::isbns; If ( Length ( table::list ) > 127; Left ( table::list; 127 ); table::list ) ]

  Set Field [ table::list; (remove first set of data) ]

  Set Field_A [ ...calculation... ]

  Set Field_B [ ...calculation... ]

  Set Field_C [ creates a URL ]

  Set Field [ table::response_text; table::response_text & table::items calc ]  (puts a calculation field from the URL section into a text field, adding to it each time)

  Pause/Resume Script [ Duration (seconds): .5 ]

  Exit Loop If [ Length ( table::list ) <= 0 ]

End Loop

.........



It starts off fine but then seems to slow down more and more........

Any ideas as to why??

Not sure what your question is. Pausing a script unfreezes the window. Slowing down could be due to the calculations getting more intensive as the loop progresses.

1. I generally like to put the exit condition at the top of the loop. Why do anything if the exit condition has been met?

2. Freeze window should work. For faster performance enter form view before beginning the loop. Return to list or table view after the loop if that is what you want the user to see when done. Agreed that you should leave out the pause.

Edited by Guest

  • Author

1. I generally like to put the exit condition at the top of the loop. Why do anything if the exit condition has been met?

I need to set the fields and get the data before exiting the loop, and then exit after I get it all... If the exit were at the front of the loop, it wouldn't get the last set of data, would it?

2. Freeze window should work. For faster performance enter form view before beginning the loop.

I did.

.....Agreed that you should leave out the pause.

I had it in there in case the requests were made too quickly.

To Comment :)

I have considered the calcs being the problem. The only one that I think could possibly be getting more complex is the last one that keeps adding the new data to a text field. The others are putting in the same data over and over to create the URL.

I'll add a couple of things:

It may be faster to use variables to store temporary data rather than fields. Set the fields at the end, or as needed.

When working with a list, GetValue is an easy way to get a particular list item. You can increment a counter and get the "nth" value. This also means you don't need a separate step to whittle the list down. So your script could look like:


Set Variable[ $itemList ; List( table::field ) ]

Set Variable[ $itemCount ; ValueCount( $itemList ) ]

Loop

  Set Variable[ $i ; $i + 1 ]

  Exit Loop If( $i > $itemCount )

  Set Field [table::isbns; GetValue( $itemList ; $i ) ]



  Set Field_A [ ... ]

  Set Field_B [ ... ]

  etc.

End Loop
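For comparison, the counter-driven loop above can be sketched in Python; the item values and the per-item field steps are placeholders for the real fields and calcs:

```python
# A rough Python analogue of the FileMaker loop above: walk a value list
# with a counter instead of whittling the list down each pass.
items = "111\n222\n333".split("\n")   # stands in for List( table::field )
item_count = len(items)               # ValueCount( $itemList )
results = []

i = 0
while True:
    i += 1                       # Set Variable[ $i ; $i + 1 ]
    if i > item_count:           # Exit Loop If( $i > $itemCount )
        break
    isbn = items[i - 1]          # GetValue( $itemList ; $i )
    results.append(isbn)         # stands in for the Set Field steps
```

With the exit test at the top, the last list item is still processed: the loop only exits on the pass *after* the counter passes the item count.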

If fields A, B, C are only used to assemble the URL, i.e. they are "scratch" fields, then as I said you might want to use variables there rather than fields.

  • Author

HHhhhhuuuuuuuuuuuummmmmmmmmmmmmmmmmmmm .... time to play and give it a new mod.

PS - Good to see you and Comment again. You both have been GREAT help to me in the past !!

I am afraid I cannot say much more without knowing what the calcs do. I would try to use script variables instead of fields - that should help the speed issue somewhat. I am not sure if form view and/or freezing the window have any impact here, since you are not looping across records.

Off topic:

Putting the exit condition as the first step in a loop *is* good practice. The condition must be formulated in such way that the loop exits after the last iteration has been run.

EDIT:

Written before reading Tom's posts.


BTW, since I have a vague idea what this is about, I'd like to point you to this:

http://fmforums.com/forum/showpost.php?post/362940/

  • Author

I currently import the XML responses, but that is a bit slow... trying to speed things up!

IF I do the link correctly....LOL

http://fmforums.com/forum/showpost.php?post/362825/

And THIS is fast? :)

  • Author

LOL... Yes... I think it will be. I can store the responses together and do a single import at the end.

Guess we'll see !!!

Seriously, I think you should investigate what slows down the XML import. IIUC, the server response is a constant factor, so the question is what happens at the import - validation, calculations, lookups, indexing, etc. Have you tried importing into a temp file?

  • Author

Seriously, I think you should investigate what slows down the XML import. IIUC, the server response is a constant factor, so the question is what happens at the import - validation, calculations, lookups, indexing, etc. Have you tried importing into a temp file?

No, I haven't...

Point taken... I'm looking over it again, since the use of variables is a good idea but still something I'm fairly new at... not one of my stronger suits... LOL

AAaaaaaaand since you have an idea of what I'm doing....

Each import of 10 matches existing records and updates them.

There's some validated fields in there

Doesn't perform auto-enter

Does have a good bit of recalc. to reset everything for the next set of 10

  • Author

Is it possible to set "Pause/Resume" to pause for less than a second? :)

On just some VERY basic record updates I'm pulling down data too fast... well, for what they want. I get about 150 responses per minute instead of the 60 allowed. I've tried going with .05 or the like, but that changes my responses to about one every 2-3 seconds...

I can address this point. Pause works for fractional times; this is not where the slowdown is occurring. In retrospect, my advice to concatenate was not good advice for larger datasets.
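For what it's worth, the rate-limiting goal itself (staying at or under the 60 requests per minute Amazon allows) can be sketched outside FileMaker. This hypothetical `throttled` helper takes injectable clock and sleep functions purely so the behavior is testable; in real use you'd pass nothing and let it use the wall clock:

```python
import time

def throttled(requests, min_interval=1.0, sleep=time.sleep, clock=time.monotonic):
    """Yield items no faster than one per min_interval seconds.

    min_interval=1.0 caps throughput at 60 requests/minute, which is
    what a fractional Pause/Resume step approximates inside the loop.
    """
    last = None
    for r in requests:
        now = clock()
        if last is not None and now - last < min_interval:
            # Sleep only for the remainder of the interval, so time
            # already spent doing work counts toward the spacing.
            sleep(min_interval - (now - last))
        last = clock()
        yield r
```

The difference from a fixed pause is that the sleep shrinks as the per-request work grows, so the effective rate stays near the cap instead of drifting below it.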

The speed slowdown is because Filemaker is poorly written. Let me elaborate.

Let's say I take a single Amazon ECS response in OfferFull mode for a single item. It's 24K long.

If you take this string and concatenate it to itself 500 times, Filemaker requires an unbelievable 14 seconds to perform this operation. When you execute 1000 times, it requires 129 sec (final string size = ~24 MB). I am not making this up!

It would be very interesting to see a benchmark between even Microsloth products such as Access and SQL Server vs Filemaker in key areas such as sorting, reindexing, etc. In this case, VBA is over 300% faster, concatenating 1000 strings in 36 sec and 500 strings in 10 sec, using:

Sub Rectangle1_Click()
    Dim ss As String
    Dim i As Integer
    MsgBox Time
    For i = 1 To 1000
        ss = ss & Space(24000)
    Next i
    MsgBox Time
End Sub

That isn't particularly impressive either, but is an interesting aside.
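The underlying reason both engines slow down is the same and is easy to demonstrate in any language: each `s = s & chunk` copies the whole accumulated string, so n appends cost O(n²) work in total. A Python sketch (the 1000-character chunk size is arbitrary) shows the pattern and the usual fix of accumulating parts and joining once:

```python
chunk = "x" * 1000

def concat_naive(n):
    """Append by rebinding: each pass copies the whole string so far."""
    s = ""
    for _ in range(n):
        s = s + chunk          # O(len(s)) copy every iteration
    return s

def concat_join(n):
    """Accumulate pieces in a list, join once at the end: O(n) total."""
    parts = []
    for _ in range(n):
        parts.append(chunk)    # amortized O(1) per append
    return "".join(parts)      # single pass over all pieces
```

Both return identical strings; only the total work differs, which is why the per-join cost in the measurements above keeps climbing as the accumulated string grows.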

FWIW, the reason I stayed away from the built-in XML import a few years ago for this application was that it *crashed* the entire FM application after a couple of imports. I'm almost certain they've fixed this bug since then (Filemaker 8.x?). In no case should a data import be able to crash an application. That was unacceptable, since in FM an application crash can corrupt the DB itself, and at least back then a crash messed up the licensing so it thought a copy was still running, requiring a computer restart to re-open FM.

If you want to do the import as a single file, I might suggest another means. After each request for 10 items, export the resultant XML to a file with append. Of course, Filemaker's limited scripting language doesn't support append, so use ScriptMaster or another file plugin to do this (see its Append To File example). Be sure to use Write To File before entering the loop to write out an empty string and clear the file between runs (or write the XML headers). Then you can use Filemaker's XML import functionality on the single resultant file.
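The truncate-then-append pattern described above can be sketched in Python (the file name and the chunk contents are invented for illustration; in FileMaker the two file operations would be the plugin's write and append calls):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "responses.xml")

# Before the loop: truncate the file (write an empty string, or the
# XML headers if the final import needs them).
with open(path, "w") as f:
    f.write("")

# Inside the loop: append each batch response to disk instead of
# concatenating it onto an ever-growing string in memory.
for batch in ["<item>1</item>", "<item>2</item>"]:
    with open(path, "a") as f:
        f.write(batch + "\n")

# After the loop: one file on disk, ready for a single XML import.
with open(path) as f:
    combined = f.read()
```

Each append costs only the size of the new batch, so the per-iteration cost stays flat no matter how many batches have accumulated.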

It's also good to include support in code for when Amazon's server is down, I just have it wait a while before issuing further requests.

One other suggestion: try creating a new table in your DB with fields 'timestamp' and 'script step description'. At several places in your script, go to this table's layout, insert a new record, set timestamp = Get ( CurrentTimeStamp ), and set 'script step description' to a description of the script step and the record number being processed. Then have the script go back to the layout with the data. Put all this in an If ... End If with the condition Mod ( Get ( RecordNumber ) ; 500 ) = 0, so that it only logs to the second table once every 500 records. You can then analyze this table to see where it slows down. Also include the length of the concatenated field in the description when logging.
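The every-Nth-record logging idea can be sketched like this in Python; a plain list stands in for the second table, and LOG_EVERY is the 500-record interval suggested above:

```python
import time

log = []          # stands in for the second table of (timestamp, description)
LOG_EVERY = 500   # only log once per 500 records so the log stays cheap

for record_number in range(1, 2001):
    # ... the per-record work (requests, Set Field steps) would go here ...
    if record_number % LOG_EVERY == 0:
        log.append((time.time(), f"processed record {record_number}"))
```

Afterward, the gaps between consecutive timestamps show directly whether the loop is slowing down, and by how much per 500 records.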

Using a variable rather than a global field, I found that by the 2500th join, speed had slowed to 0.59 sec per concatenation (24K concatenation).

You might also want to try appending to the file only after every 100 requests, since the concatenations will not have slowed much until the variable length is quite large. How many BASINs/ASINs must be looked up? I'm assuming the DB contains a few million? Is the matching BASIN field you're XML-importing into set to be indexed?

