
Record lock problem in a multi-user scenario


kyle


Recommended Posts

Oh, those record lock problems... I thought you meant when you intentionally lock records (like invoices) so they can't be modified after they are processed.

That article is very helpful though... thanks!

--

Jason Wood

HeyWoody.com


Record locking is a good and necessary thing in multi-user -- only one user should be editing a record at any one time.

What is the problem you are having?


I believe the problem they were referring to is that if you run a script on multiple records, some records may fail to get updated if other users are "using" them.

Usually, if I have a script like this, I set up my records to remember whether they've been processed or not. That way it's easy to find records that haven't been processed so that the script can perform its function again.

e.g. Process Invoice... if someone is modifying one of the products, its inventory won't get updated by my Process Invoice script... but the next morning, when invoices are printed, someone runs a daily script which processes all of the invoices again, just to be sure none were missed. Each line item keeps track of not only the qty sold but also the qty processed (this way you never have to worry about subtracting from inventory more than once).
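Roughly, the logic looks like this (just a Python-style sketch of the idea, not actual FileMaker script steps -- the qty_sold / qty_processed names and the locked_products check are only stand-ins for what I described above):

```python
# Rough sketch: each line item remembers how much of its quantity has already
# been applied to inventory, so a batch re-run is always safe.

class LineItem:
    def __init__(self, product, qty_sold):
        self.product = product
        self.qty_sold = qty_sold
        self.qty_processed = 0   # how much has already been subtracted from inventory

def process_invoice(line_items, inventory, locked_products):
    """Subtract sold quantities from inventory, skipping records locked by other users."""
    for item in line_items:
        remaining = item.qty_sold - item.qty_processed
        if remaining == 0:
            continue                        # already fully processed
        if item.product in locked_products:
            continue                        # record in use; leave it for the next run
        inventory[item.product] -= remaining
        item.qty_processed += remaining     # remember what was applied

# The same function doubles as the daily catch-up run: re-processing an
# already-processed item is a no-op, so inventory is never subtracted twice.
```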

--

Jason Wood

HeyWoody.com


Hi Jason and Vaughan! I know record locking in a multi-user scenario is a good thing. But when I execute a certain script and a user is sitting on a record that one of my script steps needs, that record doesn't get updated, and there's no warning. So let's say I use Status(CurrentError) = 301 to trap that locked record and save it in an error log file, so I can use it later to update the records my script left out. Some people have said I can update whatever records were left out and saved in the error log file when I close the db.

But what happens if a user wants to see the report before I close the db? Then the records that weren't updated will show up on the report with stale data. Or should I check the error log file before I run the report script, and if there are any pending records, update them first and then run the report script?

What I'm getting at is this: if we only update the records that were left out of a script action when I close the solution, what happens between the time that script was executed and the time I close the solution? We will have records with inaccurate data.

How do we handle this problem?

And I guess it's not only the report; even in a regular list view there can be records that aren't up to date while a user is looking. So even when I just want to look at all of a certain file's records, I would need to check the error log file to confirm that everything has been updated. Am I right about this?
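In other words, I think the check before the report would have to look something like this (just rough Python pseudo-logic to show what I mean, not my real scripts -- the log format and the apply_update / is_locked helpers are made up):

```python
# Illustrative only: flush the pending-update log before building a report, so
# the report never shows records that a locked-out script failed to update.

def flush_pending_log(pending_log, apply_update, is_locked):
    """Retry every logged update; keep the ones that are still locked for later."""
    still_pending = []
    for record_id, update in pending_log:
        if is_locked(record_id):
            still_pending.append((record_id, update))   # try again next time
        else:
            apply_update(record_id, update)
    return still_pending

def run_report(pending_log, apply_update, is_locked, build_report):
    # Step 1: catch up on anything a previous script had to skip (error 301).
    pending_log[:] = flush_pending_log(pending_log, apply_update, is_locked)
    if pending_log:
        print(f"Warning: {len(pending_log)} records still locked; report may be stale.")
    # Step 2: only now build the report.
    return build_report()
```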

Also, here is another question, with an example of my script.

The first script step in the first file saves a value into a global field, then it goes to the 2nd file and sets that global value into a certain field, and then it goes to a 3rd file and executes an import script.

So there are 3 scripts in 3 different files, and the scenario is that a user could be sitting in a record in any of those 3 files at any time. I guess I need a Status(CurrentError) = 301 check to trap the error in each file's script. Am I right so far?

Now, how do I handle this? Do I need to attach a Status(CurrentError) = 301 trap to every script in each file?

Or is it better, whenever I encounter a 301 error, to just stop executing the script and run it again later when that record is free?
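To put the two options side by side, something like this is what I mean (Python only to show the logic; RecordLockedError stands in for FileMaker's error 301, and the step names are made up):

```python
# Sketch of the two options: log the lock failure and keep going, or stop the
# chain at the first lock and retry the whole thing later.

class RecordLockedError(Exception):
    """Stands in for FileMaker error 301 (record in use by another user)."""

def run_chain(steps, error_log, stop_on_lock=False):
    """Run each (name, step) in order across the three files."""
    for name, step in steps:
        try:
            step()
        except RecordLockedError as err:
            if stop_on_lock:
                error_log.append((name, "aborted; retry whole chain later"))
                return False                    # option 2: stop, run again when free
            error_log.append((name, str(err)))  # option 1: note it, carry on
    return True

# Either way, every step (every file's script) needs its own trap, because the
# lock can occur in the 1st, 2nd, or 3rd file.
```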

thanks,

P.S. Jason, thanks for the invoice reply.

kyle


Let me share how I deal with the problem. I don't know if it's better or worse than trapping and logging the error.

My "Process Invoice" button goes through every line item on the invoice and subtracts the items from inventory (or adds them if it's a return). In the line item, there is an lineQtyInventoryProcessed field so that the line item always knows if it has been processed or not. There is even a visual flag -- If lineQtyShip = lineQtyInventoryProcessed, "PROCESSED", else "PENDING".

If someone changes their mind at the POS and a lineQtyShip is changed after processing, you just have to process it again. ***The script only acts on the difference between qtyShip and qtyInventoryProcessed (I call this lineQtyError) so you never need to worry about subtracting inventory twice.***

The next day, when someone runs the daily reports, they run another script which processes all of the day's invoices again. This may do nothing (if all went well the first time), or it may catch line items that were not processed or were changed after being processed.

With this scenario, you never have to worry about which records were processed... you just process them all again, because each record knows whether it needs to be processed or not.

Since you were talking about dates, I'm not sure whether this would work for you, since the script might do something different depending on the day you run it. It depends on your application.
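To put numbers on it, here is a little sketch (Python just for illustration -- the real thing is a FileMaker script; only the lineQtyShip / lineQtyInventoryProcessed names come from my description above):

```python
# Worked example of the "only act on the difference" rule, so reprocessing
# never subtracts from inventory twice, and a changed qty is corrected.

def line_qty_error(line):
    # the still-unapplied portion; can be negative if qtyShip was reduced
    return line["lineQtyShip"] - line["lineQtyInventoryProcessed"]

def status_flag(line):
    return "PROCESSED" if line_qty_error(line) == 0 else "PENDING"

def apply_to_inventory(line, inventory):
    delta = line_qty_error(line)
    inventory[line["product"]] -= delta            # subtract (or add back, if delta < 0)
    line["lineQtyInventoryProcessed"] += delta     # now qtyShip == qtyProcessed

# Example: sold 5 (already processed), then the customer drops to 3 at the POS.
line = {"product": "widget", "lineQtyShip": 3, "lineQtyInventoryProcessed": 5}
inventory = {"widget": 95}
print(status_flag(line))        # "PENDING" -- qty changed after processing
apply_to_inventory(line, inventory)
print(inventory["widget"])      # 97: the two units come back into stock
print(status_flag(line))        # "PROCESSED"
```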


Hi Jason! Thanks for the detailed explanation of the invoice issue. My question, though, was about the record locking problem that causes some records to fail to update in a multi-user scenario. Sorry about the confusion I caused.

Any thoughts will be appreciated.

thanks,

kyle


Yes I realize that. My private message to you was about intentional invoice locking, but what I am describing here is a script which fails to process inventory when other users on the network are modifying a product record, which is the topic of this forum.

It might seem like a different problem in your setup because I am using multiple databases (maybe you are using only one). When a user clicks "Process Invoice" in Invoices, the script goes through Line Items and adjusts inventory in Products. It will fail to adjust the inventory of a product if someone elsewhere on the network has clicked into that record, but that's no problem, because my line items remember whether they have been successfully processed, so they can be batch processed again later to catch any that were missed. The difference between this and the suggestion from FileMaker, I believe, is just that I don't require a second log file.

Does that make more sense?


