Automatically Updating

This topic is 7031 days old. Please don't post here. Open a new topic instead.

Recommended Posts


This is my first post (but certainly not my first "That's EXACTLY what I was looking for!!"), so, hopefully, I will do it properly.

Here's the situation: I have designed a database for the school at which I teach. The main database only needs to have each student's ID number, and then relates that to a number of other databases.

Obviously, the student population changes quite a few times over the course of a year - new students, students leaving, etc.

The people who use the database need to have it updated about once every two weeks (they probably wouldn't mind more often, but don't like to ask, because, as of now, I have to update "manually"). They especially need new kids to show up.

The data that I import into the FM database comes in a comma-delimited text file. What I currently do is generate a new file from the school's mainframe and import it into a clone of an FM database that already exists, called NAMES. I then do a look-up with the MAIN database, and move into a field called MATCH either the student's ID (if they already exist in MAIN) or "ERROR" (if they don't). I then find all the "ERROR"s in the MATCH field, go to MAIN, and import those records - these are the new students.

I then go to MAIN and do a look-up with NAMES. I move into a field called MATCH either the student's ID (if it exists in NAMES) or "ERROR" (if it doesn't). I search for the "ERROR"s and delete those records - these are the students who have left.
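The matching logic behind those two lookup passes amounts to comparing two sets of student IDs. A minimal sketch in Python (the file layout and the StudentID column name are hypothetical):

```python
import csv

def read_ids(path, id_column="StudentID"):
    """Read the set of student IDs from a comma-delimited export.
    (Column name is an assumption, not the actual field name.)"""
    with open(path, newline="") as f:
        return {row[id_column] for row in csv.DictReader(f)}

def diff_rosters(mainframe_ids, main_ids):
    """Return (new_students, departed_students) as sets of IDs."""
    new = mainframe_ids - main_ids        # would show "ERROR" in NAMES
    departed = main_ids - mainframe_ids   # would show "ERROR" in MAIN
    return new, departed
```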

Obviously, this works - but so do I! So, I was wondering if someone might be able to help me 'automate' the process. Here is what I envision:

The main secretary generates the comma-delimited file from the mainframe, gives it a specific name, and sends it to the two different secretaries who make use of it (we actually have two different databases, for two different purposes, that are based on this file).

They put it in the folder containing their FM databases, start up the MAIN database, click on a button which then performs a script that automatically does all that I now do manually!

Possible? We are working on PCs (so no AppleScript!) and with FileMaker Pro 5.5 (due to licensing agreements).

Any help would be greatly appreciated!

Tim Hommel

Holy Names High School

Windsor, Ontario


This should be possible using FMP scripts.

I think I would use a different set of operations.

1. Copy the current file to a temporary file.
2. Delete all records in the current file.
3. Import the comma-delimited text file into the current file.
4. Import the data from the temporary file using the match option, and do not allow creation of new records.

I think this will produce the results you want with less effort, and each step can be scripted.

Try it manually first on a copy of the file, then make the script.
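The four steps above can be sketched as a merge in Python, assuming each record is a dict keyed by a StudentID field (names hypothetical):

```python
def refresh_roster(new_roster, old_records, key="StudentID"):
    """new_roster: list of dicts from the comma-delimited export.
    old_records: the saved copy of the current file (the temp file).
    Returns the rebuilt record set: one record per student in the new
    roster, with extra fields carried over from old_records when the
    ID matches. No record is created from old data alone, so departed
    students simply drop out."""
    old_by_id = {r[key]: r for r in old_records}
    rebuilt = []
    for rec in new_roster:
        merged = dict(rec)
        if rec[key] in old_by_id:        # "match option" step: update only
            for field, value in old_by_id[rec[key]].items():
                merged.setdefault(field, value)
        rebuilt.append(merged)
    return rebuilt
```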



My first reaction was that this wouldn't work - I can't delete all the records, because they contain info that needs to be 'continued' after the update.

Of course, once I read the entire message, I realized that this couldn't possibly be the solution - it's TOO SIMPLE!

I can't wait to get to work tomorrow and try it out. I think you may have saved me a HUGE amount of time!




2 months later...

Well, this process (slightly modified for unique situations I ran into) works! Thanks a million!

Now, I have another problem. Each record (student) has to have 37 dates associated with it (37 text fields, each containing something like Sep 7, 2004, with a different date in each field). These dates change each semester.

I've figured out how to change them all at the end of the semester - it takes a while, because I type the new dates into one record, and then run a script that does a "Replace" for each of the 37 fields. That's acceptable, because the dates are changing for ALL records (students).

But, when I update DURING a semester, only NEW records (students) are missing the 37 dates (because the 'old' records would have the dates pulled back in when I import from the temp file).

Right now, to make life easy for myself, I simply go to the first record with a date, and do a REPLACE for all records, for all 37 dates (using the script I mentioned above). However, this means replacing 37 fields in over 1300 records - over a network! VERY time consuming. What I'm looking to do is just put the dates into the records that are missing them - the 'new' ones (there may only be 5-10).

I tried using Global fields for the dates, and then using calculation fields for the 37 fields in each record - just make them equal to each of the global fields.

However, this means that my date fields cannot be indexed, which I need, because I use them to create a value list!

I seem to be at an impasse. Once again, any suggestions from the experienced people here would be GREATLY appreciated!



Replace only operates on the current found set. So you just need to isolate that set first.

Make a note of the record number of the first record with a date. Then go to the first record and use Omit Multiple... to omit all the records before that one. Now use your replace script. If your script has a Show All Records, you'll need to remove that step so you can keep the current found set.

There are a couple of ways you could script finding one record with a date plus all the records with no dates. One way would be to search for all dates (e.g., > 1/1/1901); then Omit (this omits one record); then Show Omitted. Then you're ready to Replace.
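The effect of that found-set trick, touching only the records whose date fields are empty, can be sketched like this in Python (field and record layout are hypothetical):

```python
def fill_missing_dates(records, template_dates, date_fields):
    """Copy the semester's dates into only those records whose date
    fields are all empty - the 'new' students. Returns how many
    records were filled, the equivalent of the found set that the
    Replace runs over."""
    filled = 0
    for rec in records:
        if not any(rec.get(f) for f in date_fields):  # a record with no dates
            for f in date_fields:
                rec[f] = template_dates[f]
            filled += 1
    return filled
```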


Thomm -- I have a similar situation where I need to update from a mainframe database and several Excel and CSV files.

I created an import script for each source, using the Import Records script step. I then created a master script that simply calls them one after another. Overnight, I use the XP scheduler to open a file that calls the master import script. It takes about 45 minutes, but it works.

Also, if you have ODBC access into the mainframe database, you can execute an SQL query to pull the student records and write the results out to a CSV file. In that case, updating is as simple as clicking a button.
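A minimal sketch of that export step, using Python's sqlite3 module as a stand-in for the real ODBC connection (the table, columns, and query are made up; against an actual mainframe you would connect through an ODBC driver instead):

```python
import csv
import sqlite3

# Stand-in data source; with ODBC only the connect call would differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (id TEXT, name TEXT)")
conn.executemany("INSERT INTO students VALUES (?, ?)",
                 [("1001", "Ann"), ("1002", "Ben")])

def export_students(conn, path):
    """Run the extract query and write the rows to a CSV file."""
    cur = conn.execute("SELECT id, name FROM students ORDER BY id")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([c[0] for c in cur.description])  # header row
        writer.writerows(cur)
```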

