January 17, 2006 · Newbies

I am developing a database with about five tables, each of which has a unique serialized primary key that drives the various relationships. Everything works great.

My problem is that a group of users will be entering data into a remote copy of the database. After several months, this new data will need to be imported into the primary database. But since both copies will be generating their own serialized primary keys, the remote copy will contain duplicates that need to be systematically renumbered as part of the import process.

Surely this is a common problem. Can anyone point me in the right direction? Perhaps an example script somewhere? We've yet to deploy, so if there is anything I should add now that would make the process easier, I'd love to know about it!

Thanks,
Brad
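For anyone who lands here later, a minimal sketch of what that systematic renumbering could look like, written in Python with made-up table and field names rather than against any particular database engine: offset every remote serial key past the primary copy's current maximum, then cascade the new values into the child records that reference them.

```python
# Sketch of the renumbering approach described above: shift every remote
# serial key past the primary copy's current maximum, then rewrite the
# foreign keys that point at them. Table and field names are hypothetical;
# a real import would read and write the actual database tables.

def remap_serials(primary_rows, remote_rows, key_field, fk_refs):
    """Return remote_rows with non-colliding keys plus the old->new map.

    primary_rows / remote_rows: lists of dicts (one dict per record)
    key_field: name of the serialized primary key in this table
    fk_refs: list of (child_rows, fk_field) pairs whose foreign keys
             must be rewritten to follow the renumbered parents
    """
    offset = max((r[key_field] for r in primary_rows), default=0)
    key_map = {}

    renumbered = []
    for row in remote_rows:
        new_row = dict(row)
        key_map[row[key_field]] = row[key_field] + offset
        new_row[key_field] = key_map[row[key_field]]
        renumbered.append(new_row)

    # Cascade the new keys into every child table that references them.
    for child_rows, fk_field in fk_refs:
        for child in child_rows:
            if child[fk_field] in key_map:
                child[fk_field] = key_map[child[fk_field]]

    return renumbered, key_map


if __name__ == "__main__":
    primary_contacts = [{"contact_id": 1, "name": "Ann"},
                        {"contact_id": 2, "name": "Bob"}]
    remote_contacts = [{"contact_id": 1, "name": "Cho"}]    # would collide
    remote_calls = [{"call_id": 1, "contact_id": 1}]        # points at Cho

    fixed, mapping = remap_serials(primary_contacts, remote_contacts,
                                   "contact_id",
                                   [(remote_calls, "contact_id")])
    print(fixed)         # [{'contact_id': 3, 'name': 'Cho'}]
    print(remote_calls)  # [{'call_id': 1, 'contact_id': 3}]
```

The bookkeeping has to be repeated for every table and every relationship, which is exactly the part that makes people's heads swim; the reply below avoids it entirely.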
January 17, 2006

Hi Brad,

I think I'd simply assign each copy of the database a different prefix to distinguish between them, e.g. have the local version's keys start with "A" and the remote version's start with "B". When the records are eventually merged, they will keep their uniqueness and no renumbering will be needed at import.
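A quick illustration of the idea, again in Python for neutrality. The "A"/"B" prefixes come from the reply above; the counter class itself is only a stand-in for however each copy actually generates its serial numbers.

```python
# Each copy of the database stamps its own letter onto the serial number,
# so keys generated independently can never collide at merge time.

class PrefixedSerial:
    """Hypothetical per-site serial number generator."""

    def __init__(self, site_prefix, start=1):
        self.prefix = site_prefix
        self.next_value = start

    def new_key(self):
        key = f"{self.prefix}{self.next_value:06d}"
        self.next_value += 1
        return key


local_ids = PrefixedSerial("A")    # primary copy
remote_ids = PrefixedSerial("B")   # remote copy

local_keys = [local_ids.new_key() for _ in range(3)]
remote_keys = [remote_ids.new_key() for _ in range(3)]

# Merging the two sets needs no renumbering; the prefixes keep them apart.
merged = set(local_keys) | set(remote_keys)
assert len(merged) == len(local_keys) + len(remote_keys)
print(sorted(merged))  # ['A000001', 'A000002', 'A000003', 'B000001', ...]
```

The trade-off is that the keys become text rather than plain numbers, but since they exist only to drive relationships, that usually costs nothing.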
January 18, 2006 · Author · Newbies

Elegant. Just thinking about the process and the problems involved in renumbering everything on import was making my head swim. Why didn't I think of that? Thanks!

Brad