
This topic is 4792 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

I have been reading and think I am clear on creating paths for export. I have been practicing imports also. I can use a variable to specify the file name, but...

I will have files for import that are identical except for the first three characters of the file name. How can I write a generic import if it insists on my specifying the name? I wish I could say to import ALL files in a directory. I will know all but the first three characters (the ID of the part number) of each file name. I wish I could say, "tell me all the file names that are in that directory," and then have the script loop through them and grab and import them all.

And how can I get the file name that I am importing into the imported records, so I know where they came from? Please share ideas. I don't even know the technical word for what I am asking.

Oh, and I see I had originally put that I was intermediate. I realized I was very wrong after my last post. I will go change it back to ding-dong or whatever the earlier levels are, so please respond as if I'm one.

Posted

I mean I don't want ALL of them, only the ones I specify with a pattern check or wildcard in a script or something.

Posted

FileMaker import is, after all these years, a source of endless frustration. You will need to either use a FileMaker plugin such as Troi File, or set up some process outside of FileMaker that, prior to each import, renames each file to the name FileMaker is expecting.

Posted

Hi Charity,

I know you’ve been working on this and I thought this might do the trick. While technically outside of FileMaker, it is actually very easy to implement (it takes you up through having your list of file names): one script and one calculation. Just modify the string in pink to your path.

It creates a vbs file (using the export script step) and runs it (using Send Event). Send Event creates a dir.tab file that lists the files, and you then import dir.tab as records into an Import table. From there, you can search for *.xls (or run any wildcard search) and perform a group import on those ‘files’, looping through the ‘import’ records. Set them with the date imported and you have an audit trail. It can also be a gatekeeper so that duplicate files aren’t allowed in: just set the String field (which would contain the unique file name) to Unique (see the validation tab) and ‘Validate always’.

You can change the paths and file names however you wish (see the yellow highlights in the example file). For the demo, I only changed the path and I hard-coded the result file (dir.tab). This can be run in Windows\Temp or wherever you wish. I envision the Import folder being a hopper: copies of files are dropped into it and, if not imported previously, they are processed and deleted from the hopper folder (originals kept elsewhere). Clarification: files placed in this folder can be deleted after import (all within the same script), in the same way that it now deletes the test.vbs and dir.tab files. This would keep speed up; a robot or Scheduler could check the folder for new files and process them.
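For reference, the file listing itself only takes a few lines of VBScript. This is a minimal sketch of what such a generated test.vbs could look like, not the exact contents of the example file; the folder and output paths are assumptions, so substitute your own:

```vbscript
' Sketch of a generated test.vbs: write each file name in the hopper
' folder, one per line, to dir.tab for FileMaker to import as records.
' Both paths below are example values -- substitute your own.
Dim fso, folder, f, out
Set fso = CreateObject("Scripting.FileSystemObject")
Set folder = fso.GetFolder("C:\Windows\Temp\Import")
Set out = fso.CreateTextFile("C:\Windows\Temp\dir.tab", True)
For Each f In folder.Files
    out.WriteLine f.Name
Next
out.Close
```

FileMaker would write this text out via the export script step, run it with Send Event (e.g. cscript test.vbs), and then import the resulting dir.tab.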

You can also accept data with different extensions together. You can search for both xls and csv and run the same import script; just set up two different imports and, when looping, branch accordingly based upon the file extension. Anyway, I was working through some similar issues and remembered this post.

BTW, I can do this with DOS also, but I do not know how to achieve the same thing on a Mac. It would be ideal to test the platform and run the appropriate script as needed. If anyone knows and could post those instructions, it would be great. And I kept thinking there were better ways, so if there are, please speak up as well.
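For the Mac side, the listing step could be a one-line shell command rather than VBScript. A sketch, with placeholder paths (on a Mac you would presumably trigger this via Perform AppleScript with a `do shell script` call, rather than Send Event):

```shell
# Sketch of the dir.tab step on Mac/Unix: list file names, one per
# line, into a tab file that FileMaker can import as records.
# Folder paths are example values -- substitute your own hopper folder.
IMPORT_DIR="${TMPDIR:-/tmp}/Import"
mkdir -p "$IMPORT_DIR"                # ensure the demo hopper exists
touch "$IMPORT_DIR/001_parts.csv"     # demo file so the listing is non-empty
ls -1 "$IMPORT_DIR" > "${TMPDIR:-/tmp}/dir.tab"
```

The rest of the workflow (import dir.tab as records, loop, delete processed files) would be the same on either platform.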

GrabFiles.zip

Posted

I asked for help but I didn't expect this.

Thank you, Fitch. It seems they should make it easier for us. Oh well, I appreciate your idea; I wouldn't have thought of that.

Hi LaRetta, I do not know if we have VBScript on all of our computers. I hope so, because this does what I want and seems simple. We only have Excel files right now, but I realized that our parts books come as CSV, so it would be very handy to import our parts books into one database automatically, for new parts and also as revisions are received, by dropping them into a folder. I will let you know if I get stuck on it, and thank you for responding to my PM request.

Posted

I would step back and rethink which system is the "master" and which receives exports. Why involve Excel at all, now that you have a FileMaker database? You could, perhaps, drop Excel from the entire workflow.

  • 2 weeks later...
Posted

Thank you, bcooney. I appreciate your candor, and I'm working on just that. We hope this will be an intermediate solution before a full revamp. Anyway, the Excel files are not company-created but rather are received from outside companies. Maybe some day we can change them, but who knows.

LaRetta, this is working perfectly. Thank you for such a great solution. The script was tricky, since I was working on a data set but wanted to delete the original file immediately in the loop, which requires having no records in the found set before the export can delete the file back off the computer, but I finally figured out a way there and back.

Posted

Hi Charity,

Yes, deleting a file using export requires that you have no records in your found set, so if you want to delete the original file after each import, then this might be easier:

Each layout can hold its own found set, so when you're ready to perform the export: open a new window, run the omit loop on the current table until you have 0 records, export, close the window ... no moving around. The export can be mapped from any layout. :smile:
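Spelled out as script steps, that suggestion might look something like this (a sketch only — the window name and path variable are illustrative, not from the example file):

```
New Window [ "Delete Helper" ]
Show All Records
Loop
  Exit Loop If [ Get ( FoundCount ) = 0 ]
  Omit Record
End Loop
# The found set is now empty, so exporting over the original file
# replaces it with an empty file -- effectively deleting its contents.
Export Records [ With dialog: Off ; "$filePath" ]
Close Window [ Current Window ]
```

Because the omit loop runs in its own window, the found set on your original layout is untouched when the window closes.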

LaRetta

