
Convert repeating field groups to related records


This topic is 7318 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted

I have an Invoices db with each line item represented by a group of repeating fields, and I want to convert this to Invoices db + LineItems db -- a simple one-to-many relationship (FM6 all the way).

From reading an earlier thread in this forum, it appears that you import directly from the original Invoices file: once for the scalar values that belong in the new Invoices file, and once for the repeating field groups that belong in separate records in the new LineItems file. If this is the best way to go, I have a few questions:

-- There is currently no "Invoice_ID" field to serve as the key for the relationship. Should I add this to the original Invoices file before doing the Imports, or can it be calculated during the Imports? (I can't perform auto-enter options while importing.)

-- When doing the Import into LineItems, should I select "Import values from repeating fields by splitting them into separate records"? I can't seem to find a complete description of this option in the documentation. Can I designate a non-repeating field (i.e. the Invoice_ID) to be copied into each LineItem record?

Thanks for any wisdom,

Chap

Posted

Yes, the first thing to do is to stamp your Invoices file with an InvoiceID, which might mean adding another field to the file. To stamp it, make all records the found set, then do a Replace: click into the field, press Cmd/Ctrl + "=", and choose the serial-numbers option.

Then make a new file called InvoiceItems.fp5 with all the fields that used to be repeating, plus an extra field to record which InvoiceID each row used to belong to. Import into this new file by mapping the repeating fields plus the InvoiceID, and choose the "...splitting..." option.
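In scripting terms, the "...splitting..." import does something like the sketch below: each repetition of the repeating group becomes one LineItems record, and the non-repeating InvoiceID is copied into every record. (This is only an illustration; the field names Item/Qty/Price are made up, since the actual repeating fields aren't shown in the thread.)

```python
def split_repeating(invoices):
    """Turn invoice rows with parallel repeating fields into line-item rows."""
    line_items = []
    for inv in invoices:
        # Parallel repetitions: items[i], qtys[i], prices[i] belong together.
        for item, qty, price in zip(inv["items"], inv["qtys"], inv["prices"]):
            if item is None:  # skip empty repetitions
                continue
            line_items.append({
                "InvoiceID": inv["InvoiceID"],  # copied into each record
                "Item": item,
                "Qty": qty,
                "Price": price,
            })
    return line_items

invoices = [
    {"InvoiceID": "M000001", "items": ["Widget", "Gadget", None],
     "qtys": [2, 1, None], "prices": [9.50, 14.00, None]},
    {"InvoiceID": "M000002", "items": ["Widget"],
     "qtys": [5], "prices": [9.50]},
]
rows = split_repeating(invoices)
```

The two invoices above yield three LineItems records, each carrying its parent's InvoiceID.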

Then delete the fields that used to be repeating in the Invoices file's field definitions, and establish a relationship to the newly created file's contents. Draw a portal with the layout tool, assign it to the new relationship, and paste each line-item field into the portal without touching the borders. The calculation fields from earlier won't import, so you'll need to redefine similar fields in the new file.

Finally: did you previously have sums over the repeating fields? Those have now lost their references, so redefine them as Sum( over the now-related line sums!!!
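For what it's worth, the redefined Sum over the relationship computes something like this sketch (field names are illustrative, not from the thread):

```python
from collections import defaultdict

def invoice_totals(line_items):
    """Sum related line items per invoice, like Sum(LineItems::LineTotal)."""
    totals = defaultdict(float)
    for li in line_items:
        totals[li["InvoiceID"]] += li["Qty"] * li["Price"]
    return dict(totals)

line_items = [
    {"InvoiceID": "M000001", "Qty": 2, "Price": 9.50},
    {"InvoiceID": "M000001", "Qty": 1, "Price": 14.00},
    {"InvoiceID": "M000002", "Qty": 5, "Price": 9.50},
]
totals = invoice_totals(line_items)
```

Each invoice's total is now derived from its related records rather than stored in the repeats.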

--sd

Posted

Excellent writeup. Thanks - you covered all my points. That's exactly how I ended up doing it, except I didn't know I could use Replace for the auto-serializing. Glad to know I was on the right track. :-)

One more question: after the Import, what's the best way to update the Next Serial Value for InvoiceID, given a textual serial number of the form "M000001"? Here's what I did:

- - -

Sort // on InvoiceID, ascending

Go to Record [Last]

# Because of the alpha in the serial number,
# I can't use arithmetic in the next step

Set Next Serial Value [InvoiceID, InvoiceID] // set equal to the highest

New Record // temp record; side effect is to bump Next Serial Value

Delete Record // get rid of the temporary record

Unsort // return to original order

- - -

Chap

Posted

You could create a constant self-relationship (from an auto-enter or calculation of 1 to itself) sorted by InvoiceID descending. Then relationship::InvoiceID refers to the largest InvoiceID. So,

Set Next Serial Value [invoiceID, "M" & Right( "00000" & NumToText(TextToNum(relationship::InvoiceID) + 1), 6 )] should work.
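In case it helps, that calculation translates to roughly the following (the "M" prefix and 6-digit width are assumptions taken from the "M000001" example, not stated rules):

```python
def next_serial(serial, prefix="M", width=6):
    """Compute the next textual serial, like
    "M" & Right("00000" & NumToText(TextToNum(x) + 1), 6)."""
    number = int(serial[len(prefix):])             # TextToNum
    return prefix + str(number + 1).zfill(width)   # Right("00000" & ..., 6)
```

So from the highest existing InvoiceID, the script sets the next serial one past it, zero-padded back to six digits.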

Posted

You could create a constant self-relationship (from an auto-enter or calculation of 1 to itself) sorted by InvoiceID descending. Then relationship::InvoiceID refers to the largest InvoiceID.

That's an interesting approach. Is its chief advantage in not messing up the sort sequence of the current found set?

Posted

I think its chief advantage would be that it can be performed from any record and doesn't require a sort or Go to Record step. So it may be more seamless to the user and possibly faster, though I haven't tested it speed-wise.

