
Perform find does not perform


DPaquin

This topic is 1652 days old. Please don't post here. Open a new topic instead.

Recommended Posts

For some reason, the Perform Find function always returns the "record existing" result. On the first execution, the only record in the table is different from the one being searched for, so the expected response was "missing record".

I've left the newly created record in the table for the next attempt. The expected response was "existing record". That is the response I got, but I don't trust it, because I get the same answer even if there are no duplicates.

I've tried multiple times, but it always goes to the Else condition after the Perform Find.
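For reference, here is a sketch of the usual existence-check pattern in a FileMaker script (the field and variable names, infile::fileUniqueName and $foundfilemts, match those described later in this thread; the actual script may differ):

Set Error Capture [ On ]
Enter Find Mode [ Pause: Off ]
Set Field [ infile::fileUniqueName ; "==" & $foundfilemts ]
Perform Find [ ]
If [ Get ( FoundCount ) = 0 ]
  # no matching record: the "missing record" case
Else
  # at least one matching record: the "existing record" case
End If

Testing Get ( FoundCount ) directly avoids depending on the last error code alone.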

 

Screen Shot 2019-10-16 at 10.39.14 AM.png

Screen Shot noduplicaterecord-1.png

Screen Shot noduplicaterecord-2.png

Screen Shot noduplicaterecord-3.png

Screen Shot withduplicaterecord-1.png

Screen Shot withduplicaterecord-2.png

Screen Shot withduplicaterecord-3.png

Test.fmp12


Not quite sure what you are trying to accomplish, but your test after Perform Find finds a record (by the way, you should use Set Field to build the find request instead of a restored request), so there is no error, which is why you are getting that result.

There are other issues too, like setting up a loop counter but never looping through the records.

And I don't understand why you are creating a new record.

Also, you are setting a field based on a variable, but you never set the variable ($EventNo).

You would get quicker help if you explained, in plain language, what you are trying to accomplish and what results you expect.

Another tip: use comments in your scripts so others can follow your logic.

Sometimes (especially for beginners, or for very complicated scripts) it's easier to build a script by first using comments to map out the logic, then putting in the script steps.
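For example, the skeleton of a script like this could be laid out with Comment steps first (a generic sketch, not the poster's actual logic):

# 1. Build the search value
# 2. Perform the find
# 3. If no record is found, create one
# 4. Otherwise, handle the duplicate

Each comment is then replaced, one at a time, with the real script steps.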

Edited by Steve Martino

Thanks Steve for your answer.

This is a very short version of a long script. I tried to remove all unnecessary steps, but it seems I kept some, like the loop.

The purpose of this script is to copy files recorded on SD cards that have not previously been loaded.

The names of the files are always 00000.mts, 00001.mts, 00002.mts, 00003.mts.... 

The first step is to make each video file name unique. To do this, I use an AppleScript function that appends the date and time the file was created, so 00000.mts becomes 00000.mts20190728030252, 00001.mts becomes 00001.mtsxxxxxxxxxxxx, and so on.

All of those names are kept in the infile table until the project is completed.

The purpose of the find is to make sure I keep only one instance of each file recorded on the SD cards.

In the present example, only one file has been loaded: 00001.mts.

The first time the script is executed, the file 00000.mts20190728030252 does not exist and is therefore loaded into the table.

The second time the script is executed, the variable $foundfilemts contains 00000.mts20190728030252, and a find is performed with the criterion infile::fileUniqueName equal to $foundfilemts.

If the file is found (it already exists), only one record belonging to that file will be kept; the others will be removed.

I do not always use the same SD cards for recording videos, so when gathering the videos together the same file names will come up. To avoid deleting video files inadvertently, I append the date and time information to each video file name.

The expected behaviour is to continue with the step if the file does not already exist, and to keep only one copy if it does.

Removing duplicate records will be the next function I code.
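One common pattern for that step (a sketch only, assuming the find on infile::fileUniqueName has already returned the duplicates as the found set):

If [ Get ( FoundCount ) > 1 ]
  Go to Record/Request/Page [ First ]
  Omit Record
  Delete All Records [ No dialog ]
End If

Omit Record drops the one record to keep out of the found set, so Delete All Records removes only the remaining duplicates.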

I've looked at using Set Field instead, and I am not sure how I would accomplish what I need with it. THANKS!

Edited by DPaquin

This is a side note, since I can't follow your description. If you want to make sure there will be no duplicate values, validate the field as Unique, Validate always. This will automatically skip duplicates when importing, and generate an error message if you try to create a duplicate by modifying an existing record or creating a new one. Also make sure neither your users nor your script are allowed to override this restriction.

 


Thanks Comment!

I will look at changing options in that table.

I need to ensure records are not written when they are duplicates. Those possible duplicate records need to be removed without any human intervention (i.e., in the background).

 


1 hour ago, DPaquin said:

Those possible duplicate records need to be removed

What I am trying to say is that those records should not be created at all. If you are creating them by script, you can trap for an error when attempting to commit, say:

Set Error Capture [ On ] 
Commit Records/Requests [ No dialog ]

This will generate an error code 504 ("Value in field is not unique, as required in validation entry options") - so you can follow this by:

If [ Get(LastError) = 504 ]
  Revert Record/Request [ No dialog ]
End If

 

Edited by comment

