
This topic is 6638 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

Hi all,

I have a script that does an Update Matching Records import from another FM file. After the import I need to have a found set that shows records that have been added and records that have had changes made in any of the fields designated in the import.

The only found sets I can easily come up with are:

Newly created records, by basing the found set on a Timestamp, or

Every record that was newly created in addition to every record that was keyed to the imported records.

I hope that I'm overlooking a simple solution.

Thanks in advance for any help offered!

Posted

I have a script that does an Update Matching Records import from another FM file. After the import I need to have a found set that shows records that have been added and records that have had changes made in any of the fields designated in the import.

Before I run a test: doesn't an import always produce a found set? Isn't that how it usually works? Where it gets trickier is sifting out the records just added, leaving out the updated ones!

--sd

Posted

I believe if you have an auto-enter CreationDate field and an auto-enter ModificationDate field, you can check Update existing records and have the modification date and creation date auto-entered when records are changed/created. You could then set your search to find Get(CurrentDate) in each field and find the added/updated records.

Posted

I believe if you have an auto-enter CreationDate field and an auto-enter ModificationDate field, you can check Update existing records and have the modification date and creation date auto-entered when records are changed/created. You could then set your search to find Get(CurrentDate) in each field and find the added/updated records

Unfortunately, that approach only solves the creation of a found set of newly created records. The modification TStamp is updated in EVERY record from the import, because FileMaker considers every record to have been modified (at least as I understand it).

For example, if I have a record with an address field that has a value of "100 maple street", during the import, FileMaker replaces the contents of that field even if the source record has "100 maple street" as the value of the field, therefore FileMaker sets the modification TStamp.

Again, this is how I understand it. If this isn't the case, I'd appreciate some clarification to help me understand this better.

Posted (edited)

Where it gets trickier is to sift out the ones just added, leaving out the updated!

That's actually the easy part. I just set a global to the current TStamp at the beginning of the script and key that to a self-join where the global is less than the creation TStamp. Then I just GTRR in the same layout and I have my found set of new records from the import.
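Outside FileMaker, the logic of that self-join amounts to: save a timestamp when the script starts, then keep only the records created after it. A minimal Python sketch of the idea (field and function names are my own, purely illustrative):

```python
from datetime import datetime

def new_records_since(records, import_started_at):
    """Return the records whose creation timestamp falls after the
    moment the import script started (the saved 'global')."""
    return [r for r in records if r["created"] > import_started_at]

# Illustrative data: one pre-existing record, one created by the import.
t0 = datetime(2007, 6, 1, 12, 0, 0)
records = [
    {"id": 1, "created": datetime(2007, 5, 30, 9, 0, 0)},
    {"id": 2, "created": datetime(2007, 6, 1, 12, 0, 5)},
]
print([r["id"] for r in new_records_since(records, t0)])  # -> [2]
```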

Edited by Guest
Posted

The only thing I can come up with is to create a new set of fields in the destination table, import the data into them, compare those data with existing data, flag records with differing data, update the records and then find the flagged set.

Please, someone tell me there's a better way! :confused:

Posted

This is a very important question you raise here! The solution must be to do it between two tables, so if it's an external tab-delimited file or similar that needs processing, it should be imported into the sender table first.

In my template I have put in comments about what is achieved where and when, and when I refer to a SumWare approach, it is this one:

http://www.sumware.net/robfm/savingfoundsets.php

It is also described in "Special Edition Using FileMaker 7" on page 441 (page 467 in the FM8 edition of the book). If a GTRR, no matter whether it's a (FS) or (SO), doesn't switch focus, it's because nothing is related. Here I use Get(LastError) xor 1, because it's pretty irrelevant that the error code is actually 103; all that matters is whether it's 0!

There are indicators pointing toward recursive scripting, but I didn't bother wrapping it around, or breaking out common lines into a subscript, although it's pretty obvious.

--sd

importtesting.zip

Posted

Hey Søren,

I've been experimenting with your example file on and off all day long and I can't figure out where you're going with this. It seems like a very complicated way to show a found set of new records.

As I mentioned earlier:

That's actually the easy part. I just set a global to the current TStamp at the beginning of the script and key that to a self-join where the global is less than the creation TStamp. Then I just GTRR in the same layout and I have my found set of new records from the import.

The hard part was ending up with a found set consisting of:

1.) Records that were added during the import, and

2.) Records that changed during the import

I solved that by importing into a set of utility fields that are used to compare current data with imported data, pretty much as I described earlier.

So, where were you going with your example? I have a feeling that I'm missing something.

Thanks, I appreciate your help!

Posted

I have something of a grudge against the use of repeating fields, global fields as well as tagging/flagging records.

The hard part was ending up with a found set consisting of:

1.) Records that were added during the import, and

2.) Records that changed during the import

Aren't those two different sets? Otherwise it's the default behaviour!

--sd

Posted

I have something of a grudge against the use of repeating fields, global fields as well as tagging/flagging records.

To each his own... I say that if it works and it's simple, then use it.

Aren't those two different sets? Otherwise it's the default behaviour!

Yes, it is two found sets all grouped into one. That's a requirement from my client.

No, it's not default behavior. Default behavior is returning a found set of every record that was available for import from the source. This is what I was asking about earlier: whether someone could confirm or deny what I believe happens during an import that updates matching records with the option checked for adding new records.

From the tests I've run, it appears that FM updates an auto-enter modification timestamp in existing, matching records even if there were no changes to the data. I would imagine that this is because FileMaker doesn't compare source data to destination data in this type of import; it just imports the data from the source, and that's what triggers the update of the modification timestamp.

What do you think? Even though I've already solved the problem and delivered the solution, I'm still open to hearing a better way to do this.

Posted

From the tests I've run it appears that FM updates an auto-enter modification timestamp in existing, matching records even if there were no changes to the data.

Yes, I can confirm this! But it's easily circumvented. Make your timestamp field an auto-enter calc with the update option chosen:

Evaluate ( Quote(Get ( CurrentTimeStamp )); [field2;field3])

Where field2 and field3 are the fields to monitor for changes. So you just save the current timestamp in a $variable at script start, before the import, and then search for records whose timestamp is greater than or equal to this value!
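The effect of that auto-enter trick, expressed outside FileMaker, is a timestamp that only moves when one of the monitored fields actually changes. A rough Python sketch of the behaviour (field and function names are made up for illustration; this is not FileMaker script syntax):

```python
from datetime import datetime

MONITORED = ("field2", "field3")

def apply_import(record, incoming):
    """Update a record from imported data, touching the modification
    timestamp only when one of the monitored fields really changed."""
    changed = any(record[f] != incoming[f] for f in MONITORED)
    record.update({f: incoming[f] for f in MONITORED})
    if changed:
        record["modified"] = datetime.now()
    return changed

rec = {"field2": "a", "field3": "b", "modified": None}
print(apply_import(rec, {"field2": "a", "field3": "b"}))  # unchanged -> False
print(apply_import(rec, {"field2": "a", "field3": "c"}))  # changed   -> True
```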

--sd

Posted

Evaluate ( Quote(Get ( CurrentTimeStamp )); [field2;field3])

That's what I needed.

Many thanks!

Posted

I don't see how that's going to work. Importing with 'Update Matching Records' seems essentially the same as 'Replace Field Contents'. The field is considered modified, even if the new value is the same as the old one.

Posted

Here is another solution you might want to try. Add a number field to the table you are importing into; call it import_cnt. After the import, use Replace Field Contents and specify a calculation:

import_cnt = import_cnt + 1

All the newly added records will = 1 and all the previously added records will be 2 or greater. You can also use a loop and set field if you like.
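As a sanity check of the counting idea outside FileMaker: after the replace adds 1 to every record, records that were just created hold 1, while pre-existing records hold 2 or more. A small Python sketch (names illustrative; empty number fields treated as 0, as FileMaker does when incrementing):

```python
def bump_import_count(records):
    """Simulate Replace Field Contents with import_cnt = import_cnt + 1."""
    for r in records:
        r["import_cnt"] = r.get("import_cnt", 0) + 1

records = [{"id": 1, "import_cnt": 1},  # existed before; updated by the import
           {"id": 2}]                   # newly added by this import
bump_import_count(records)
new_ids = [r["id"] for r in records if r["import_cnt"] == 1]
print(new_ids)  # -> [2]
```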

Posted

Since it's easy enough to find the newly created records, the only problem is to be able to find which existing matching records are not really changed. I believe the only way to do this is to do a field by field comparison. There are a couple of ways I can think of to do this without resorting to brute force, but it depends on how much data per record you are importing.

If you have less than a couple of hundred characters of data per record to import, then you could perform a dummy import first. Essentially, you create a calculated field in both files that concatenates the match field and all of the import fields. Then, you import matching (without adding new records) using this new field as the match field, and you only import dummy data into a single dummy field (it could be a global, for that matter). The result of this operation is a found set of matching but unchanged records. You can then set a flag field in all the records of this found set to indicate they are unchanged, and then go on to do the actual import based on your original match field.
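The dummy-import idea hinges on the concatenated key: two records match on the combined field only when the match field and every import field agree, i.e. the record is unchanged. A small Python sketch of that key construction (the separator and field names are my own, purely illustrative):

```python
def combined_key(row, match_field, import_fields, sep="\x1f"):
    """Concatenate the match field and all import fields; two rows get
    the same key only when the match value AND every field agree."""
    parts = [str(row[match_field])] + [str(row[f]) for f in import_fields]
    return sep.join(parts)

src      = {"id": "A1", "addr": "100 maple street", "city": "Austin"}
dst_same = {"id": "A1", "addr": "100 maple street", "city": "Austin"}
dst_diff = {"id": "A1", "addr": "101 maple street", "city": "Austin"}
fields = ["addr", "city"]
print(combined_key(src, "id", fields) == combined_key(dst_same, "id", fields))  # True
print(combined_key(src, "id", fields) == combined_key(dst_diff, "id", fields))  # False
```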

The other method is to import into a separate table and then compare the individual fields via a relationship on the match field. Then, use a looping script to update and add the new records.
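The second method, comparing the staging table to the live table field by field, can be sketched like this in Python (illustrative names only; in FileMaker this would be a relationship on the match field plus a looping script):

```python
def diff_against_staging(existing, staged, fields):
    """Compare staged import rows to existing rows on a shared match
    key and report which keys carry changed data. Keys present only
    in the staging table are new records, handled separately."""
    changed = []
    for key, new_row in staged.items():
        old_row = existing.get(key)
        if old_row is not None and any(old_row[f] != new_row[f] for f in fields):
            changed.append(key)
    return changed

existing = {"A1": {"addr": "100 maple street", "city": "Austin"},
            "A2": {"addr": "7 oak lane", "city": "Dallas"}}
staged   = {"A1": {"addr": "100 maple street", "city": "Austin"},  # identical
            "A2": {"addr": "9 oak lane", "city": "Dallas"},        # changed
            "A3": {"addr": "1 elm road", "city": "Waco"}}          # new record
print(diff_against_staging(existing, staged, ["addr", "city"]))  # -> ['A2']
```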

Posted

Comparing fields is how I ended up handling it.

The only thing I can come up with is to create a new set of fields in the destination table, import the data into them, compare those data with existing data, flag records with differing data, update the records and then find the flagged set.

Comparing about 20 fields in about 7000 records (average import size) only takes about 10 seconds on an Intel Core Duo Mac Mini. It may not be elegant, but the client can now do in a couple of minutes what used to take a couple of weeks.

