
This topic is 5068 days old. Please don't post here. Open a new topic instead.

Recommended Posts

  • Newbies
Posted

I need some serious help with this!

I have a rather complex database solution that I want to clone and install on a laptop that will be used remotely, without internet access. The user of the laptop will read, modify, and add new records. The user will not make any changes to the structure of the data.

The user will come back into our office once a week or so and hook into our network at which point we can make the transfer.

How in the world do I get the two databases to sync/merge the changes? The original database it is copied from will also be changing.

Primarily, three files (or rather, three tables) will be modified. There are a significant number of fields in each table, as well as relationships.

ANY help would be appreciated.

Thanks,

JS

Posted

Hello James, your first post raises one of the trickiest problems in database design.

Do not underestimate the complexity of database synchronisation!

Some options are available (such as SyncDek). All begin with defining business rules as to which edit is to be kept. (Don't underestimate the complexity of this either.)

I *hate* data synchronisation. I set up a system in FMP 6 for a client over 10 years ago. I'm still working on it.

The bottom line is: avoid it if possible. That's why remote access to databases was invented.

  • Newbies
Posted

Thank you for the response! I feared that would be the answer. The problem with FileMaker is that its 'web enabled' database is severely limited in terms of scripting and functions. I am thinking the only way to beat this is a VPN of sorts... very high-speed, so it can replicate the data with a host/server combo.

Posted

I do not think that you need to clone your database; rather, use the built-in ability of FileMaker Pro to host a database that can then be shared remotely. This is different from web publishing, where you open your database from a browser. Remote access is the ability to be online in a coffee shop and access your FileMaker database at home by opening FileMaker Pro > Open Remote > typing in your IP address > and whamo! You have your full database at your fingertips. Any changes you make are shared on the common database.

This is essentially what FileMaker Server does, but I am 99% sure FileMaker Pro has this ability as well, although you have to leave FileMaker Pro running on your home computer so it can serve out your database. If you do not have a static IP at your base camp, you can set up a dynamic domain name server at dyndns.org. It is easier than it sounds.

You may already be aware of all this, though, and have some special needs, such as your field person being in the jungle with no internet, compiling data that he/she will bring home and want to plug into your home database. In this case, it will be difficult if you are working on the exact same records at home. But if your field person is bringing in a whole new set of records that won't be overwriting, but rather adding to, an already existing set of records, then it is a simple matter of importing. Just set the Import Action to "Add New Records" and you are set. You can even write a simple script to do this for you and make a button.

Posted

James,

To update from your laptop, you would need a serial # for each record. You could then import records from the laptop using a match field, and add the remaining records as new.

I know this is a simplistic approach, but depending on your situation it might work fine.

Dean
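Dean's match-field approach is what FileMaker's Import Records step does natively with "Update matching records" plus "Add remaining records" enabled. As a rough sketch of the logic (the field names and dicts here are purely illustrative, not FileMaker's actual data model):

```python
# Sketch of a match-field import: update office records that share a serial
# number with a laptop record, and add any laptop records with new serials.
def merge_by_serial(office, laptop):
    index = {rec["serial"]: rec for rec in office}
    for rec in laptop:
        if rec["serial"] in index:
            index[rec["serial"]].update(rec)   # update matching record
        else:
            office.append(rec)                 # add remaining record as new
    return office

office = [{"serial": 1, "name": "Acme", "phone": "555-0100"}]
laptop = [{"serial": 1, "name": "Acme", "phone": "555-0199"},
          {"serial": 2, "name": "Globex", "phone": "555-0200"}]

merged = merge_by_serial(office, laptop)
```

Note the limitation the next reply points out: this blindly overwrites the office copy of any matching record, with no notion of which side's edit should win.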

Posted

It's not the importing of records that's the problem -- importing is a trivial task.

It's the "knowing which records to import and which ones to leave alone, and which fields in the records to update and which to leave alone" that is the hard part.

The Devil is in the detail.

Posted (edited)

This would not be easy but it would be possible to do via scripted processes in Filemaker.

Create an audit trail process that stores all changes at the field level in a separate table.

- Creates would generate a record for each field that contains any data

- Changes would only generate a record for fields that have been modified.

- Deletes would either generate one record for the record to be deleted or it would store a record for each field containing data for the record that was deleted. It is only useful to store a snapshot of deleted data if you want to be able to reverse changes using a scripted process.

I believe there are a few plugin solutions that could help out with the audit trail side if needed.

Audit Table Fields:

Action (create/change/delete)

TableName

LayoutName

RecordID (Use: create/change/delete)

FieldName

FieldChangeFrom (Use: change/delete)

FieldChangeTo (Use: create/change)

Timestamp

User (optional)

IP Address (optional)

Nic/Mac Address (optional)
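The field-level capture described above could be sketched as follows. This is a hypothetical illustration of the idea, not FileMaker script code; in practice you would build it with script triggers, auto-enter calculations, or an audit plugin:

```python
from datetime import datetime, timezone

# Sketch: emit one audit record per modified field (action "change"),
# using the audit-table fields listed above.
def audit_change(table, record_id, old, new, user=None):
    rows = []
    stamp = datetime.now(timezone.utc).isoformat()
    for field in new:
        if old.get(field) != new[field]:        # only fields that changed
            rows.append({
                "Action": "change",
                "TableName": table,
                "RecordID": record_id,
                "FieldName": field,
                "FieldChangeFrom": old.get(field),
                "FieldChangeTo": new[field],
                "Timestamp": stamp,
                "User": user,
            })
    return rows

rows = audit_change("Contacts", 42,
                    {"phone": "555-0100", "city": "Austin"},
                    {"phone": "555-0199", "city": "Austin"})
```

A "create" would emit one such row per populated field, and a "delete" one row per field (or one per record) with only FieldChangeFrom filled in, as described above.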

Let's assume that you now have a fully functional audit trail that tracks all changes for all tables in a single audit trail table.

To synchronize the data you will have a few more steps:

You will need to merge the audit trail from your remote user's copy into your main copy (only import audit trail records that have not yet been processed).

Create/Run a process that:

- sorts all audit trail records by table/timestamp

- steps through each audit record

- if the audit record is a change/delete, verify the affected record exists by using the record id

- if the affected record does not exist - flag audit trail record as an error

- If the audit record is a create, determine if the record has already been created or if this is the first create for a set of fields that will follow for the current record id being processed.

- if record exists make the change to the affected record

- updates the timestamp/user for the affected record

- flag audit trail record as processed

- step to next audit trail record

Regarding Create: the timestamp should keep the records grouped together in the correct order by record ID, since you are only doing this for a single user.

For multiple users, you will need to further sort by TableName/Timestamp/RecordID, and make sure that all creates share the exact same timestamp so that they get processed together; otherwise your process becomes more complex just to determine whether a create has already generated a new record.

Errors should either stop the process or produce an exceptions report. Errors occur if a change is attempted after someone has already deleted the record...
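The replay steps above can be sketched like this. Again, this is an illustrative model (tables as nested dicts keyed by record ID), not an actual FileMaker script:

```python
# Sketch: replay imported audit records against the main copy, sorted by
# table and timestamp, flagging change/delete against missing records as errors.
def apply_audit_trail(tables, trail):
    errors = []
    for rec in sorted(trail, key=lambda r: (r["TableName"], r["Timestamp"])):
        if rec.get("Processed"):
            continue                            # only unprocessed records
        table = tables.setdefault(rec["TableName"], {})
        rid = rec["RecordID"]
        if rec["Action"] == "create":
            # First create for a record ID makes the row; later ones fill fields.
            table.setdefault(rid, {})[rec["FieldName"]] = rec["FieldChangeTo"]
        elif rid not in table:
            errors.append(rec)                  # change/delete on missing record
            continue
        elif rec["Action"] == "change":
            table[rid][rec["FieldName"]] = rec["FieldChangeTo"]
        elif rec["Action"] == "delete":
            del table[rid]
        rec["Processed"] = True
    return errors

tables = {"Contacts": {1: {"phone": "555-0100"}}}
trail = [
    {"TableName": "Contacts", "Timestamp": "t1", "RecordID": 1,
     "Action": "change", "FieldName": "phone", "FieldChangeTo": "555-0199"},
    {"TableName": "Contacts", "Timestamp": "t2", "RecordID": 9,
     "Action": "change", "FieldName": "phone", "FieldChangeTo": "n/a"},
]
errors = apply_audit_trail(tables, trail)
```

The returned error list is the exceptions report: here the second record fails because record 9 does not exist in the main copy.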

After all this is done, you copy the main solution back to your remote user's computer.

Edited by Guest
added info regarding create to sync process
  • 2 years later...
Posted


Thank you for your input. I am just wondering: is it the master copy I should be changing, with the clients getting updated whenever internet is available? Is there anything in this process I need to change for that? Thanks.

K.C.

