
This topic is 8742 days old. Please don't post here. Open a new topic instead.

Recommended Posts

  • Newbies
Posted

I have a file of 4000 records that I would like to split into 80 files of 50 records each. What is the easiest way to automate this? Thanks.

Posted

That's a good question, mwiedemann.

Notowitz, only break up databases if the data structure requires it. If the data structure of the original file is poor then by all means create a new design and do whatever is required to make it better.

If you don't know what I mean by data structure, then don't change anything until you do some homework.

We get a lot of questions here along the lines of:

Q: My company has 40 divisions, and I have a separate database for each one. It worked really well until now, when I want to get statistics out of all 40 databases and combine them into one report. How do I do it?

A: Join all 40 databases into one big one and work on all the data at once.

So don't split them up just for the sake of splitting them up. It might make doing something else in the future difficult or impossible.

  • Newbies
Posted

The original database is working great and won't be touched! Sorry for the lack of clarity. So here's a better version of the question. I want to create 80 different lists of 50 each. Why? So I can give 80 different people different pieces of the database information. Somehow I would like to export records 1-50, then 51-100, then 101-150, etc.-- each group of 50 to its own text file. There must be a simple way to do this. Can you help?
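The "export each group of 50 to its own text file" step can also be handled outside FileMaker: export all 4000 records once, then cut the result into 50-line pieces. Here is a minimal sketch in Python, assuming the export is a plain text file with one record per line (as a tab-delimited FileMaker export would be); the file names are illustrative.

```python
# Sketch: split one full export into chunks of 50 records.
# Assumes "export.txt" holds one record per line; names are illustrative.

CHUNK = 50

def split_export(path="export.txt", chunk=CHUNK):
    """Write lines of `path` into numbered files of `chunk` lines each."""
    with open(path) as f:
        lines = f.readlines()
    names = []
    for i in range(0, len(lines), chunk):
        name = f"export{i // chunk + 1:02d}.txt"  # export01.txt, export02.txt, ...
        with open(name, "w") as out:
            out.writelines(lines[i:i + chunk])
        names.append(name)
    return names
```

With 4000 records this produces exactly 80 files, and it keeps working unchanged if the record count changes later.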

[This message has been edited by Notowitz (edited December 12, 2000).]

Posted

quote:

Originally posted by Notowitz:

Yes, that I can do. But I wanted it to be automated to save labor in the long run, when I need to do this again in the future.

Wouldn't it be easier to simply have a field saying who each record is assigned to, assign the records in groups of 50, and then have each person find their records in the main database?

They will only be working with 50 records, so performance should not be an issue.

Why export the data in the first place?

------------------

=-=-=-=-=-=-=-=-=-=-=-=-=

Kurt Knippel

Consultant

Database Resources

mailto:[email protected]

http://www.database-resources.com

=-=-=-=-=-=-=-=-=-=-=-=-=

Posted

Will the database always have 4000 records and you will always be performing 80 exports of 50 records each? Will they always be the same records?

If so, it'll take some time to do, but you can automate this process. Basically, take note of how you find each set of 50 records. Perform that find for the first set, and then create a script with a Find [ Restore ] step in it. When you have created this script, it has saved in it the find request you performed, and if you run it again, you'll get the same set.

Now duplicate that script 79 times, naming each one something like Find Set 01, Find Set 02, ... Find Set 80. The script Find Set 01 is the first one you created.

After you have all your scripts made and named, you need to reassign the find requests to the other 79 scripts. Perform the find for the second set of 50, open ScriptMaker, open Find Set 02, and click OK. You'll be asked if you want to keep the saved find criteria or replace it with the current find criteria. Keep is the default. Click Replace and click OK. Do the same thing for the other 78 scripts and found sets.
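If the records carry a serial number field, each of the 80 saved finds is just a fixed range of serials: set k covers records (k-1)*50+1 through k*50. A one-liner makes the arithmetic explicit (the serial field itself is an assumption; adapt to whatever criteria your finds actually use):

```python
# Sketch: which serial numbers each of the 80 find sets covers,
# assuming records are numbered 1 through 4000.

def find_range(set_number, size=50):
    """Return (first, last) serial number for a given find set."""
    first = (set_number - 1) * size + 1
    last = set_number * size
    return first, last
```

So Find Set 01 would restore the request "1...50", Find Set 02 "51...100", and so on up to Find Set 80 with "3951...4000".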

I'm assuming that the export for each one is going to be the same. If you're on a Mac, you can do this with one export script and a little AppleScript. If not, you'll need either 80 export scripts (to give the files 80 different names) or a utility like WinBatch.

For instance, if you're on a Mac you would have a script something like this:

Perform Script [ Find Set 01 ]

Perform Script [ Export Records ]

Perform AppleScript [ tell application "Finder" to set the name of file "export.txt" to "export01.txt" ]

Perform Script [ Find Set 02 ]

Perform Script [ Export Records ]

Perform AppleScript [ tell application "Finder" to set the name of file "export.txt" to "export02.txt" ]

Perform Script [ Find Set 03 ]

Perform Script [ Export Records ]

Perform AppleScript [ tell application "Finder" to set the name of file "export.txt" to "export03.txt" ]

...

Perform Script [ Find Set 80 ]

Perform Script [ Export Records ]

Perform AppleScript [ tell application "Finder" to set the name of file "export.txt" to "export80.txt" ]

If you don't have a Mac or can't automate the renaming of the files, you have two choices: have the user manually set the name of each export file (which means the computer can't do it all for you), or create 80 export scripts. A plug-in such as Troi's File plug-in may also do the trick.
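As a cross-platform stand-in for the Finder rename above, an external helper could move each fresh export to a numbered file between export runs. This is only a sketch of that one step; the file names are illustrative.

```python
# Sketch: rename a fresh "export.txt" to a numbered file, taking the
# place of the AppleScript Finder rename on platforms without it.
import os

def stash_export(set_number, src="export.txt"):
    """Move `src` to exportNN.txt for the given set number."""
    dest = f"export{set_number:02d}.txt"
    os.replace(src, dest)  # overwrites any earlier copy of the same set
    return dest
```

Run it after each Export Records step, passing the set number just exported.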

Chuck

Posted

Why not give each of the 80 users their own custom database file with a script in it that imports their own specific records from the main file? That way, they can get their own records when they need them, and it takes the load off the main database.

1. Run sub-script in main file to search for specific records, using global field or cut/paste to transfer search criteria.

2. Export records to a common temp file

3. Individual's database imports from temp file.

I guess this would be somewhat messy too, but it doesn't look like there is any neat solution here.

[This message has been edited by BobWeaver (edited December 15, 2000).]

