
What should I do to make writing data to a table faster?


This topic is 5453 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

I'm trying to make FileMaker write data to a table every morning with a script.

It takes about an hour to finish.

While it's writing the data I can barely use FileMaker; it's too laggy and slow.

Are there any easy techniques I can try to speed it up?

The script repeats this transaction about 10,000 times:

Go to Record/Request/Page [First]
Loop
  Set Field [Table::A; A]
  Set Field [Table::B; B]
  Set Field [Table::C; C]
  Commit Records/Requests [No dialog]
  Go to Record/Request/Page [Next; Exit after last]
End Loop

Posted (edited)

Thank you very much for the reply.

Let me explain the situation.

Every morning I download raw data (a detailed statement) from a website and import it into a temporary table.

As soon as the import finishes, the script starts writing the data into the actual table, which is connected by a relationship.

Obviously, there are no records in the actual table until the script starts writing to it. That's why I cannot use "Replace Field Contents".

The reason I'm doing this is that I want to set a "check flag" to 1 on every record.

The temporary table has a "check flag" field, which is set to 0 when the data is imported.

That way, if the script stops partway while writing to the actual table, I can search for records where "check flag" = 0 and restart the writing from there.

So in fact, the script looks like this:

Go to Record/Request/Page [First]
Loop
  If [Temporary Table::check flag = 0]
    Set Field [Actual Table::A; Temporary Table::A]
    Set Field [Actual Table::B; Temporary Table::B]
    Set Field [Actual Table::C; Temporary Table::C]
    Set Field [Temporary Table::check flag; 1]
    Commit Records/Requests [No dialog]
  End If
  Go to Record/Request/Page [Next; Exit after last]
End Loop

Posted

Why don't you import from the temp table, then? You can tell which records were imported successfully, because those that weren't will have no related counterpart in the target table (if I understand your setup correctly).

Posted

I'm sorry for the lack of explanation.

As you said, I can tell which records are missing after the import.

But I have to perform calculations on some fields while writing the data to the actual table, like the script below. That's why I cannot import the data directly.

Go to Record/Request/Page [First]
Loop
  If [Temporary Table::check flag = 0]
    Set Field [Actual Table::A; Temporary Table::A * 1.05]
    Set Field [Actual Table::B; Int ( Temporary Table::B * 5 / 105 )]
    Set Field [Actual Table::C; Int ( Temporary Table::C * 100 / 105 )]
    Set Field [Temporary Table::check flag; 1]
    Commit Records/Requests [No dialog]
  End If
  Go to Record/Request/Page [Next; Exit after last]
End Loop

Posted

Go to Record/Request/Page [First]
Loop
  Set Field [Table::A; A]
  Set Field [Table::B; B]
  Set Field [Table::C; C]
  Commit Records/Requests [No dialog]
  Go to Record/Request/Page [Next; Exit after last]
End Loop

Setting aside the issue of *why* you want to do this for a moment, there are a couple of things you can do to speed up the process:

1) Make sure the layout the script runs on is in Form view, not List view. List view is slower because FMP has to update all of the visible records each time a record is committed; in Form view there is only one record visible.

2) Start the script with Freeze Window. This stops FMP from updating the display, which speeds things up. The downside is that it will appear as though nothing is happening; this can be fixed by scripting a Refresh Window step every couple of hundred records or so:

Go to Record/Request/Page [First]
Freeze Window
Loop
  Set Field [Table::A; A]
  Set Field [Table::B; B]
  Set Field [Table::C; C]
  Commit Records/Requests [No dialog]
  If [Mod ( Get ( RecordNumber ) ; 500 ) = 0]
    Refresh Window []
    Freeze Window
  End If
  Go to Record/Request/Page [Next; Exit after last]
End Loop

Reduce the number (500) in the If calculation to update the window more often, or increase it to refresh the window less often.

Maximum speed occurs if there is no refresh at all, but with no refresh the script looks as though it has hung.
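The refresh-throttling pattern above is general: do the cheap work on every iteration, and the expensive display update only every N iterations. A Python sketch of the same idea (the function and field names are hypothetical, not a FileMaker API):

```python
def process_records(records, refresh_every=500):
    """Process each record, emitting a progress update every `refresh_every` rows.

    `updates` stands in for the Refresh Window steps: it records at which
    row numbers the display would have been redrawn.
    """
    updates = []
    for i, record in enumerate(records, start=1):
        record["processed"] = True          # stands in for the Set Field steps
        if i % refresh_every == 0:          # Mod ( Get ( RecordNumber ) ; 500 ) = 0
            updates.append(i)               # stands in for Refresh Window
    return updates

# With 1,200 records and the default of 500, the display is redrawn only
# at records 500 and 1000 -- two refreshes instead of 1,200.
```

Raising `refresh_every` trades progress feedback for speed, exactly as described above.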

Posted (edited)

Thank you for the reply, Vaughan!

Actually, I do put Freeze Window at the beginning; I'm sorry I forgot to include it.

I can check how many records have already been written by seeing how far down the scroll bar is: even when the window is frozen, the scroll bar keeps moving down.

Is there any other idea to make writing data to the actual table faster?

I appreciate your help, guys. Thank you so much.

The script goes like this:

Freeze Window
Go to Record/Request/Page [First]
Loop
  If [Temporary Table::check flag = 0]
    Set Field [Actual Table::A; Temporary Table::A * 1.05]
    Set Field [Actual Table::B; Int ( Temporary Table::B * 5 / 105 )]
    Set Field [Actual Table::C; Int ( Temporary Table::C * 100 / 105 )]
    Set Field [Temporary Table::check flag; 1]
    Commit Records/Requests [No dialog]
  End If
  Go to Record/Request/Page [Next; Exit after last]
End Loop

Posted

Even when the window is frozen, you can see scroll bar is moving down.

Then you must be in List view. Make sure your script forces the layout into Form view.

Posted

Being in Form view and freezing the window can speed up the loop - but a loop is still the slowest method.

But I have to perform calculations on some fields while writing data to the actual table, like the script below. That's why I cannot import the data directly.

You can, if you define calculation fields in the temp table, and import those instead of the original data.

Alternatively, you could set the target fields in the actual table to auto-enter calculated values (and enable auto-enters during import).
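As an aside, the per-record calculations from the loop shown earlier (A * 1.05, Int(B * 5/105), Int(C * 100/105)) can equally be expressed as one bulk transform over the imported rows, which is what moving them into calculation fields amounts to. A hypothetical Python sketch (field names taken from the thread, everything else invented for illustration):

```python
def transform_row(row):
    """Mirror the Set Field calculations for a single record."""
    return {
        "A": row["A"] * 1.05,
        # FileMaker's Int() truncates toward zero, as Python's int() does
        "B": int(row["B"] * 5 / 105),
        "C": int(row["C"] * 100 / 105),
    }

def transform_all(rows):
    """One pass over the whole imported set -- the 'calculation field' idea:
    compute everything in the temp table, then move the results in bulk."""
    return [transform_row(r) for r in rows]
```

The point is that the arithmetic itself is cheap; it is the record-by-record Set Field / Commit cycle that is slow, so precomputing the values and importing them in one step avoids that cost.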

The part that is still missing from your description is the relationship between the temp and the actual tables. What are you using as match fields - does the data arrive with a unique identifier? And why do you need a temp table in the first place - why not import directly into the actual table?

Posted (edited)

Thanks for the reply. =)

Yes, the temporary table has a unique identifier field. It is unique and constructed from a combination of some fields in the same record, so it can tell me that no record is duplicated.

I want to prevent duplicate records before the script starts writing data to the actual table; a duplication alert pops up if there is any duplicate. That's why I don't want to import the data directly into the actual table.

So there are two reasons for this procedure:

Reason 1: I don't want duplicated records in the actual table.

Reason 2: Some data needs to be calculated, with various other fields involved.

Posted

1. If the original source data contains a unique record identifier, you could import it directly to your final table. Set the field's validation to Unique, Validate Always - this will prevent importing the same record twice.

2. As mentioned earlier, you can add calculation fields, or perform auto-enter calculations during import.

Posted

Thanks for the comment!

I understand the calculation part. So you mean I should prepare a temporary table that has auto-calculated fields: all I have to do is import the raw data into this temporary table, and it produces the calculated values for every field. Then I import the data from this temporary table into the actual table. Is that what you mean?

About the unique identifier: the raw data doesn't have unique data in each record. That's why I prepared a unique identifier field, which is a combination of a few fields in the same record.

Posted

Well, you could do that - but I don't see why you couldn't import DIRECTLY into the actual table. To prevent importing the same record twice, define a text field (in the actual table) with auto-entered calculation combining those "few fields" and set its validation to Unique, Validate Always.

BTW, I am not sure how you can be confident that the combination of fields will never repeat - but that's another issue.
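The Unique, Validate Always suggestion amounts to building a composite key and rejecting any row whose key has already been seen. A hypothetical Python sketch of that idea (field names invented for illustration):

```python
def dedupe_by_composite_key(rows, key_fields):
    """Keep only the first row for each composite key, mimicking
    Unique / Validate Always on an auto-entered combination field."""
    seen = set()
    kept = []
    for row in rows:
        # Join the key fields with a separator unlikely to appear in the
        # data; without one, ("ab", "c") and ("a", "bc") would collide --
        # the same concern raised above about combinations repeating.
        key = "|".join(str(row[f]) for f in key_fields)
        if key not in seen:
            seen.add(key)
            kept.append(row)
    return kept
```

In FileMaker the validation does this check for you during import; the sketch just makes explicit what "unique combination of a few fields" means.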

Posted

Thanks for the comment.

I was off yesterday, so I'm sorry for replying late.

I understand that it would be much faster to import the data directly into the actual table. But here's another issue: while writing data from the temporary table to the actual table, the script also writes to other related tables at the same time. That's another reason I cannot import directly.

The advantage of importing the data into a temporary table first and then writing it to the actual table is that everything is easy to organize. For instance, if I want to apply calculated data to a field in another table, it's really easy to add a script step. It's also easy to see how many records have already been written when errors occur, so I can restart from where it left off.

Is there any other idea you can think of?

Thank you very much for the great ideas, as always!

Posted

I know only what you have told me, so it's hard for me to say what's best for your situation. A temp table is often convenient, especially if there is a lot of processing to do and you don't want to litter your real tables with calcs and auto-enters.

However, once the data has been imported into the temp table and processed, I believe it should then be IMPORTED from there to the real table/s. I don't know how many records you are dealing with, but I suspect you'll see a significant increase in speed - and there should hardly ever be a need to restart from a point of failure. Even then, it's easy to find out which records were imported and which were not - see my second post in this thread.
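The speed gap between looping and importing is not specific to FileMaker: any database pays a per-transaction cost when it commits one record at a time. A rough illustration in Python using the standard-library SQLite module as a stand-in (this sketch does not use FileMaker at all):

```python
import sqlite3

def row_by_row(rows):
    """One INSERT and one commit per record -- analogous to the looping script."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (a, b, c)")
    for r in rows:
        con.execute("INSERT INTO t VALUES (?, ?, ?)", r)
        con.commit()  # commit after every record, as the loop does
    return con

def bulk_import(rows):
    """One multi-row insert and a single commit -- analogous to Import Records."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (a, b, c)")
    con.executemany("INSERT INTO t VALUES (?, ?, ?)", rows)
    con.commit()
    return con
```

Both functions end with identical tables; the difference is 10,000 commits versus one, and the commit overhead is where the hour goes.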

Posted (edited)

In my case, do you think it's better to import the data into the temp table first and then write the imported data to the real table?

I need to do processing such as:

1. Calculating, and writing the evaluated values to fields in many tables at the same time.

2. Making it visible how many records have already been written, and how many remain unwritten.

If I import the data directly into the real table, the more tables I want to write to, the more imports I have to run. Is that right?

Posted

In my case, do you think it's better to import the data into the temp table first and then write the imported data to the real table?

As I said, I don't know enough about your case to tell. The only thing I am fairly certain of is that you should not move out the records by looping.

2. Making it visible how many records have already been written, and how many remain unwritten.

How many records are you talking about here? And how often do you need to do this?

Perhaps you should make a test run and see how long it takes to import from the temp table into a real table.

If I import the data directly into the real table, the more tables I want to write to, the more imports I have to run. Is that right?

Yes - but hopefully you are not importing ALL records into ALL tables?

Posted (edited)

As I said, I don't know enough about your case to tell. The only thing I am fairly certain of is that you should not move out the records by looping.

So you mean that I should use "Replace Field Contents" instead?

How many records are you talking about here? And how often do you need to do this?

Perhaps you should make a test run and see how long it takes to import from the temp table into a real table.

I write over 10,000 records to some fields in a few tables. I do this all at once with a script every morning.

