CHylton Posted December 1, 2005

Hello all! I have looked over this site and could not find anything to give me a clue as to how to tackle this problem. I have an application in Version 6 (not planning on upgrading anytime in the near future).

Here is the story: I am getting a data file from another database with six fields (filename, item code, UPC, description, category, subcategory) to import into my FM6 database. The text file is tab delimited, with a comma delimiter allowed in either the UPC or Item Code field. I have put the UPC and Item Codes into another text field in the FM6 database called Product Code; this field is stored by concatenating each individual code with a CR/LF. So it looks like this:

Data in text file:
Item Code: 12345, 12346, 12347
UPC: 12345678977

Data in Product Code field:
12345
12346
12347
12345678977

That was the easy part! What I need to do now is pad each of those entries with zeros on the left to form a 13-digit GTIN (no check digit stored). So what I would end up with is the Product Code field holding the following data:

0000000012345
0000000012346
0000000012347
0012345678977

The Product Code field can be empty or hold multiple entries; there is no way of telling how many codes are in the field. Any ideas on how to best accomplish this task? TIA Chris
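(For anyone following along, the transformation being asked for is just a left zero-pad of each return-delimited value to 13 characters. A minimal sketch of that logic in Python, not FileMaker syntax; the function name `pad_codes` is made up for illustration:)

```python
def pad_codes(product_code: str, width: int = 13) -> str:
    """Left-pad each return-delimited code with zeros to a fixed width."""
    codes = [c for c in product_code.split("\n") if c]  # skip empty lines
    return "\n".join(c.zfill(width) for c in codes)

print(pad_codes("12345\n12346\n12347\n12345678977"))
```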
comment Posted December 1, 2005

I believe you'll need to break out the values into a repeating calculation field (see here for how). Then add the padding to the extraction formula.
CHylton (Author) Posted December 1, 2005

Thanks, comment. That is kinda sorta what I am looking for. The only thing is that the example assumes only 10 repetitions; what if there are more than 10?
comment Posted December 1, 2005

No, the example assumes UP TO 10 values in the source list. You set the number of repetitions to the maximum expected number of values in the list. Then adjust your calculation to return a result only when a corresponding value actually exists.
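(The per-repetition logic comment describes can be sketched like this, again in Python rather than FileMaker calc syntax; `padded_value` and `MAX_REPS` are illustrative names, not anything from the thread. Each repetition asks for the nth value, pads it if it exists, and returns empty otherwise:)

```python
def padded_value(codes: str, rep: int, width: int = 13) -> str:
    """Mimic one repetition of the calc: pad the rep-th value if present."""
    values = [v for v in codes.split("\n") if v]
    if 1 <= rep <= len(values):
        return values[rep - 1].zfill(width)
    return ""  # no corresponding value in the source list

# Set the repetition count to the maximum expected values; extras stay empty.
MAX_REPS = 10
results = [padded_value("12345\n12345678977", r) for r in range(1, MAX_REPS + 1)]
```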
CHylton (Author) Posted December 1, 2005

Thank you, comment! With the lead you gave me, and adding a looping script into the mix, I now have the data I need. I run this weekly on a 20,000+ record UniData database, which I then put into my FileMaker solution. The old way usually took 4+ hours to run; it is now down to about 2 minutes! Thanks, Chris