
Posted

I have heard that going back to an old backup, cloning it, and importing records from an export of the recovered file is a reasonable way to avoid corruption problems...

But what if you are constantly updating the functionality of your database (adding new calculation fields, scripts, layouts, reports etc.) and you are not sure when the problems may have started to creep into the file? If you go back to an old enough backup, you may be pretty confident it's clean, but it may not have your recent structural changes.

Is there anything you can do short of taking all the exported data and rebuilding all the scripts (or importing them?), pasting in all the calculations and layouts (and resizing the layout parts manually)?

Bottom line is I'm just looking for opinions on the best way to fix problems... and for the record, I have never had to recover a file, but I have continued to use files that appeared to have been closed improperly. Any opinions on this practice?

Thanks,

Dana

Posted

Hello Dana,

It's not going to help you at this point, but the best practice is to save a clean clone of the file every time you make a program change to it, and always work on a copy of the previous clone (not a copy of the database that has been live to users). That way you won't ever get into the predicament of the last clean copy not matching the current structure.
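
For what it's worth, this is a habit you can largely automate. A one-step script along these lines (the script and file names are just examples) can be run at the end of every development session:

  # Save Developer Clone - run after each round of structural changes
  Save a Copy as [ "MyDatabase_Clone.fp7"; clone (no records) ]

A clone keeps all the program structure (fields, scripts, layouts) but none of the records, so each one is a clean snapshot of the code as it stood that day.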

But I'm afraid that if you've not done that, the news is not good. If you rebuild the file and then import scripts, you may be importing corrupt script data or parameters (which may or may not be visible in the script dialogs) - and if you copy and paste layout objects, you could be pasting a corrupt graphic or other element. So the only sure solutions are either:

1. Repeat all the modifications to bring the last known clean copy into line with the current database structure, or

2. Rebuild from scratch, without taking across any of the program content from the damaged/recovered file.

That way only the data comes across. And even there, I'd recommend that you examine it with an application such as Character Sieve (from Protolight) to look for idiosyncrasies that may not have been addressed by the recovery procedure - before you risk uploading it to your clean clone (or rebuilt file).

Sorry the news is not better.

Meanwhile, the practice of continuing to use files that have been closed improperly is one that does carry some risks. The risks are considerably greater if the file was undergoing structural changes (layouts, field definitions, scripts etc.) during or not long before the improper closure, as this increases the likelihood that corruption will affect the structural integrity of the code as well as the data.

There's no universal answer for this, in my opinion. It's a question of balancing the importance of the data and the continued trouble-free operation of the file against the resources and potential down-time required to maintain a 100% clean solution. Whatever you, as developer, may think about it, the practical realities at the coalface will generally take precedence.

So long as you've kept clean 'developer copy' clones along the way, however, you will always have a secure position to fall back to. ;)

Posted

So if I have a clean developer copy, is the best way to make that solution "live" to take the current live database offline, export all the data, and then import it into the updated clone?

Thanks for the advice,

Dana

Posted

Hi Dana,

Unless you have concerns about data corruption and want to examine the data (manually or via a 3rd-party program) before uploading it to the clean copy, the best and most efficient path is to import the data directly from the production file(s) into the clean developer clone.

Apart from being quicker, this ensures that container field contents as well as alphanumeric data are brought across.
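
As a rough sketch (the file, layout and table names here are only placeholders), the import script in the clean clone might be as simple as:

  # Migrate Data - run in the clean developer clone
  # One pair of steps per table; the field mapping saved in each
  # Import Records step should be set to matching field names.
  # Make sure the production file is showing all records first
  # (or import from a closed copy, which brings in every record).
  Go to Layout [ "Invoices" ]
  Import Records [ No dialog; "Production.fp7"; Add ]
  Go to Layout [ "Customers" ]
  Import Records [ No dialog; "Production.fp7"; Add ]

Once it's built, moving a new version live becomes a matter of minutes rather than an evening of exporting and re-importing by hand.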

Aside from that, the other thing you need to consider is updating any serial number (next serial) values. I find it is generally best to build this into a script (e.g. using the Set Next Serial Value [ ] command) that can be called at will, so you don't have to wade through all the fields each time making sure you haven't missed any auto-entry serials. ;)
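
A sketch of how such a script might look for a single table - the layout and field names are just examples, and it assumes the serial is a plain incrementing number:

  # Update Next Serials - run in the clean clone after the data import
  # Repeat this block for every table with an auto-entry serial field.
  Go to Layout [ "Invoices" ]
  Show All Records
  Sort Records [ Restore; No dialog ]  # sort order: Invoices::InvoiceID, ascending
  Go to Record/Request/Page [ Last ]
  Set Next Serial Value [ Invoices::InvoiceID; Invoices::InvoiceID + 1 ]

The sort-then-go-to-last steps simply make sure the record with the highest existing serial is the current record before the next value is set.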
