
This topic is 7403 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

What I am trying to do is import data from a csv file into an FMP6 file, matching on a sku number. The problem is that the import csv file has multiple items with the same sku, and the FMP6 file also has multiple items with the same sku. Here is an example of the problem: the csv file has 6 items with sku 2056, but in the FMP6 file there are 12 items with sku 2056. I need the csv file to match on sku for only 6 of the items, and then map the data over to only 6 of the items in the FMP6 file, not all 12. It does not matter which 6 of the 12 it maps over. I just can't wrap my mind around the answer. Please help? Thanks for reading.

Posted

I'd suggest importing the data into an intermediate FMP file. Once in this file you can process the records and wrangle the data into the right shape for import into the main system's master and related files.

Posted

Thank you for the post, Vaughan. I am still a bit confused. Once the data is in this intermediate FMP file, how exactly do I wrangle it so that I can match on sku for only 6 of the items, not all 12? Thanks for all your help!

Posted

You're going to have to do something extra to make this work; it's a little outside relational logic, because sku is not unique in the file you're importing into.

One method is to do a "serialize by category" routine on your data. The .csv would need to be imported into a processing file first (for any method), because there's little that can be done with the .csv as is.

Basically this would be a loop that, after sorting by sku, goes through checking for duplicate skus, appending a sub-sku number to duplicates and leaving unique skus alone. They'd look like:

1020

1021_1

1021_2

etc.

1022

If you did this in both files, then an Update Matching Records import, matching on that ID, would work. Show All in the main file before importing.
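Outside FileMaker, the "serialize by category" pass amounts to the following (a Python sketch of the same idea; the field names are hypothetical, and the rows are assumed already sorted by sku):

```python
def serialize_by_category(rows):
    """Build a match ID for each row: duplicates within a sku group get a
    sub-sku suffix (_1, _2, ...); a sku that appears only once passes through.
    rows: list of dicts with a "sku" key, sorted by sku."""
    out = []
    i = 0
    while i < len(rows):
        # Find the run of consecutive rows sharing this sku.
        j = i
        while j < len(rows) and rows[j]["sku"] == rows[i]["sku"]:
            j += 1
        run = rows[i:j]
        if len(run) == 1:
            # Unique sku: the match ID is just the sku itself.
            out.append({**run[0], "match_id": run[0]["sku"]})
        else:
            # Duplicates: append a per-group counter starting at 1.
            for n, row in enumerate(run, start=1):
                out.append({**row, "match_id": f'{row["sku"]}_{n}'})
        i = j
    return out
```

Doing this in both the import file and the main file yields IDs like 1020, 1021_1, 1021_2, 1022, which are unique and therefore safe to match on.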

Alternatively, since you don't care which of the multiple matching records gets updated, you could run a script that sets the data directly when there is only 1 of that sku in the import file, but runs a subscript when there is more than 1.

Sort the records by sku, and loop to process. If there's only 1 of that sku (test with Count( Sku self-relationship::sku )), set the data into the related record in Main via the relationship.

If there's more than 1 of the sku, then run a subscript. The 1st step for each group would be to set the data (as above), then go to the next record and get that record's data into a global. Then Go To Related Record [show, "sku"] to the main file.

You've already set the 1st record in Main, so omit it. Go to the next record and see if it's the same sku. If so, set the data. Back in the Import file, omit the record and go to the next one. If it's still the same sku, get the data, go to the Main file, set the value, omit the record, go back to Import, and omit the record there, repeating until you run into a new sku.

[I may have an omit in there twice. I'd really have to build it to set the loops up right. If you have Debug Scripts, use it. But it would work :-]

As soon as you get to a new sku group, run the process again. It's more or less the way you'd do it manually.

If you're dealing with a small to medium Main file, I'd go for the 1st "serialize by category, update matching" method. If the Main file is really huge and the import file is fairly small, I'd do the "loop ping-pong" method :-), which would be faster. Update Matching on a huge file can be slow.
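To see why the serialize-and-match approach only touches as many Main records as there are import records, here's a sketch of what Update Matching Records effectively does once both files carry the serialized IDs (Python, with hypothetical data and field names):

```python
def update_matching(main_rows, import_rows):
    """Mimic an Update Matching Records import keyed on match_id:
    each Main row whose match_id appears in the import gets the import data.
    Main rows with no matching import row are left untouched."""
    by_id = {r["match_id"]: r for r in import_rows}
    updated = 0
    for row in main_rows:
        src = by_id.get(row["match_id"])
        if src is not None:
            row.update(src)   # copy the import fields onto the Main record
            updated += 1
    return updated
```

With 6 import rows serialized 2056_1 through 2056_6 and 12 Main rows serialized 2056_1 through 2056_12, only the first 6 IDs match, so exactly 6 Main records are updated, which is the behavior the original poster wants.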

Posted

Fenton, thanks for the comments. I am having problems with the serialize by category. I am trying to add a sub-sku number to duplicates, but I have only been able to add a running sub-sku across all skus. I know I need to search for "!" to find duplicates and then add a sub-sku to each of them, but how can I start a fresh set of sub-sku serials for each group of duplicate skus? They should look like:

1020_1

1020_2

1020_3

1021_1

1021_2

1022_1

1022_2

But they actually look like:

1020_1

1021_2

1021_3

1021_4

1022_5

1022_6

1023_7

How can I get the serials to start over on each new duplicate group?
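The desired behavior amounts to resetting the counter whenever the sku changes, instead of keeping one running serial: in the loop, compare the current record's sku against the previous one (held in a global) and set the counter back to 1 on a change. A Python sketch of that logic (assuming the skus are already sorted):

```python
def serialize_per_group(skus):
    """Number records within each sku group, resetting the counter to 1
    whenever the sku changes. Input is assumed sorted by sku."""
    out = []
    prev = None
    n = 0
    for sku in skus:
        n = n + 1 if sku == prev else 1  # reset on a new sku group
        out.append(f"{sku}_{n}")
        prev = sku
    return out
```

The `prev` variable plays the role of the FileMaker global holding the previous record's sku; the bug described above comes from never resetting `n`.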

