February 8, 2004
I have a large central database on a server running FileMaker Server, shared with many clients. Records are entered into this db by clients in batches at irregular intervals. I want the records in the main db to be kept sorted by date and time in, which sounds easy but isn't. If a client runs the sort, it takes a long time and does not permanently change the sequence of records in the main db. Ditto for a slave machine running an automated script. In either case, the client who ran the sort sees the data in the proper order, but the next client will not. The relevant fields in the main db are indexed. How do I write a script that will sort the main db in a way that is "permanent" and will be seen by the next client to connect?
February 9, 2004
You cannot "permanently" sort the database, short of exporting all records in sorted order and importing them back in. That is not trivial: it can screw up your relationships big time. Most solutions I can think of involve taking control of the user interface and strictly limiting what people can do. That will probably mean hiding the status bar, making custom navigation buttons, etc. Any sort gets slower as the record count increases. One technique is to only let users see small sets of records at a time (the last week or month, for example), something that can be sorted quickly; see the sketch below.
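For example, a navigation or startup script along these lines keeps each user in a small found set that sorts fast. It is only a sketch: the field name DateIn is a placeholder, and the exact step and function names vary by version (Status(CurrentDate) in FileMaker 6 vs. Get(CurrentDate) in 7).

  Comment [ "Show only the last 30 days so the sort stays quick" ]
  Enter Find Mode [ ]
  Set Field [ DateIn; ">=" & (Get ( CurrentDate ) - 30) ]
  Perform Find [ ]
  Comment [ "Restore a stored sort on DateIn, then time in" ]
  Sort Records [ Restore; No dialog ]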
February 12, 2004 · Author
Thanks. Not quite the answer I was hoping for, but you confirmed my suspicions. This is quite a pain for a large database (50,000 records or more) that steadily grows.
February 12, 2004
You can permanently resort records relatively easily if you take advantage of the fact that FileMaker sorts duplicates together when you start with a uniquely sorted set of records. The idea:

1. Sort the found set by your date and time fields plus a unique field (a serial number, say), so the order is unique.
2. Loop through the set, duplicating every record and overwriting any fields on the copy that differ because of an auto-enter setting.
3. Unsort, which forces all the duplicated records to the bottom of the list in creation order, i.e. in your sorted order.
4. Omit half of the records, starting from the 'middle' one, so only the old originals remain in the found set.
5. Delete all remaining records.

Of course, you would only want to do this when no other users are browsing your found set. There is an example file I posted in the Samples forum some time ago. It doesn't incorporate the auto-entry portion and assumes a user-defined resort, but I can update it and email you a new copy if you'd like it better suited to your needs. If you have any related questions, let me know.
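In script steps, the whole pass looks roughly like this. Again only a sketch: SerialNo and gSerial are placeholder names for an auto-entered serial field and a global field that carries its value across the duplicate, and whether some steps accept a calculated number varies by version.

  Show All Records
  Comment [ "Stored sort: date in, time in, then SerialNo, so the order is unique" ]
  Sort Records [ Restore; No dialog ]
  Go to Record/Request/Page [ First ]
  Loop
    Comment [ "Stash the original's serial, duplicate, then overwrite the copy's auto-entered value" ]
    Set Field [ gSerial; SerialNo ]
    Duplicate Record/Request
    Set Field [ SerialNo; gSerial ]
    Go to Record/Request/Page [ Next; Exit after last ]
  End Loop
  Comment [ "Unsorted, creation order puts all the new copies after all the originals" ]
  Unsort Records
  Comment [ "Jump to the first copy and omit the copies to protect them" ]
  Go to Record/Request/Page [ By Calculation: Get ( FoundCount ) / 2 + 1 ]
  Omit Multiple Records [ No dialog; Get ( FoundCount ) / 2 ]
  Comment [ "Only the old originals are left in the found set" ]
  Delete All Records [ No dialog ]
  Show All Records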
February 12, 2004
A great solution, Queue, I must say. The only thing is, I'd be too scared of the potential for errors in a live db situation. It's also an example where a little knowledge is a dangerous thing: without due care a novice could well wind up with a mess, lost records, or broken data integrity. But maybe that's just my overly cautious nature rearing its head. Like I said, a great solution in careful hands!
cheers, Wendy
February 12, 2004
You helped me remember another 'gotcha', too, Wendy. If the records being deleted are linked to another file via a relationship with 'When deleting a record in this file, also delete related records' checked, then the key fields would need to be nullified first to prevent unintended deletion of child records. Even more so if those child records have children of their own related with 'When deleting...' checked; it could be a cascading nightmare. This is one reason I don't use that option very often. It's more dangerous than helpful, and scripting the deletion of related records is a much safer way to go. But you're right, if all precautions are taken, it can be a very useful solution. Nullifying the keys is one extra step just before the deletion in the script above, as sketched below.
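Again just a sketch, with CustomerID standing in for whatever the match field actually is. Blanking it on the doomed originals means the 'also delete related records' setting finds nothing to cascade onto, while the kept copies still hold the real keys.

  Comment [ "Blank the match field on the originals so no cascading delete can fire" ]
  Replace Field Contents [ No dialog; CustomerID; "" ]
  Delete All Records [ No dialog ]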