
Is a normalized structure difficult when importing and exporting data?




Posted

Hypothetical here:

If a database is normalized quite a bit - resulting in many tables to capture the bits of information on, say, a person (a table for contacts, a table for phones, a table for addresses, etc.) - how difficult is it to import data that arrives as an Excel spreadsheet where one row holds all of this information, including multiple instances of some entities, like phones or addresses (Phone 1, Phone 2, Address 1, Address 2)?

For example, in data from an Excel spreadsheet, each row contains ALL the information for one contact (first and last name, an address or two or three, multiple phones, a fax, a couple of emails, a web address, etc.).

I am trying to envision how the import would occur - this data needs to be split into separate tables, and the necessary relationships created to associate the records with each other. And what about rows that have multiple phones while other rows don't?
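
To make this concrete, here is the shape of what I mean, sketched in Python with made-up column names: one flat row that would have to fan out into a contact record plus several child records, each carrying the contact's ID.

# One flat spreadsheet row (hypothetical columns) holding everything about a contact.
flat_row = {
    "First Name": "Jane", "Last Name": "Doe",
    "Phone 1": "555-0100", "Phone 2": "555-0101",
    "Address 1": "1 Main St", "Address 2": "PO Box 9",
}

# What it has to become in a normalized structure: one parent record plus
# child records in separate tables, each child carrying the parent's ID.
contact = {"ContactID": 1, "First": "Jane", "Last": "Doe"}
phones = [{"ContactID": 1, "Number": "555-0100"},
          {"ContactID": 1, "Number": "555-0101"}]
addresses = [{"ContactID": 1, "Street": "1 Main St"},
             {"ContactID": 1, "Street": "PO Box 9"}]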

I know this must be something some developers deal with all the time, and I have no doubt FileMaker can accommodate it. I suspect there is a cleansing stage when the data to be imported first arrives. Maybe a utility file is built to align the columns (fields) before actually importing into the database.

Tossing out thoughts - do I create a layout that includes fields from the normalized database structure that already exists and import into that? If any rows have fields with multiple values (like phones), do I place a portal on that layout to accommodate the intake? And lastly, I'm guessing such an operation needs to be scripted to create multiple phone records for one contact?

As for export, where a similar one-row-with-all-the-data layout is expected on the receiving end - that seems challenging as well.

Should I continue to build as normalized as I want, and rest assured that you scripting gurus out there have a sure-fire solution regardless of how normalized my structure is and how denormalized the incoming structure is?

This has been gnawing at the back of my mind for months - so any and all insight is appreciated.

Thanks

Posted (edited)

I suspect there is a cleansing stage when the data to be imported first arrives. Maybe a utility file is built to align the columns (fields) before actually importing into the database.

Indeed yes. I suggest you use an import table (a simple table that holds your flat-file information). To create your relationships and add each piece to your tables, you write to each table individually, assigning the parent IDs as you go. Keep in mind that it is also important to preserve the source data.
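
In FileMaker this is a looping script over the import table; the same logic, sketched in Python just to show the order of operations (all field and table names here are hypothetical), is roughly:

import uuid

def process_import_table(import_rows, contacts, phones, addresses):
    # Walk the staging rows and write each piece to its own table,
    # assigning the parent ContactID as you go.
    for row in import_rows:
        if row.get("Processed"):
            continue                        # never process a source row twice
        contact_id = str(uuid.uuid4())      # in FileMaker: a serial number or Get(UUID)
        contacts.append({"ContactID": contact_id,
                         "First": row.get("First Name", ""),
                         "Last": row.get("Last Name", "")})
        for col in ("Phone 1", "Phone 2"):            # one child record per filled column
            if row.get(col):
                phones.append({"ContactID": contact_id, "Number": row[col]})
        for col in ("Address 1", "Address 2"):
            if row.get(col):
                addresses.append({"ContactID": contact_id, "Street": row[col]})
        row["Processed"] = True             # keep the source row, just flag it as done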

It is very similar to a Transactions file that downloads purchases from an online source, where each line is a line item but you need to create the customer, create the invoice, and add the line items (and sometimes even create the product). Instead of importing directly into your tables, use the transaction file to hold your incoming data and set your fields throughout your tables as needed. Maintaining this relationship protects you from processing the same transactions into your solution twice.
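
A minimal sketch of that protection, assuming each downloaded line carries a unique transaction ID (the ID name below is made up):

def lines_still_to_process(incoming_lines, imported_ids):
    # Keep only downloaded lines whose ID has not already been written into
    # the solution; in FileMaker the same check is a relationship (or a find)
    # from the holding table to the records created from it.
    return [line for line in incoming_lines
            if line["TransactionID"] not in imported_ids]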

I've attached a Transactions file which moves data from flat-file to relational. You would of course need to adjust it as needed (you may not want to create products, and maybe you already have a customer base that you will need to match against), but the idea is the same regardless.

If you have a line which may have two entries, it is a problem but not impossible. As you process each line, you would test the field with ValueCount(), store the second value in a script variable, and remove it when moving the first value into your table. Then duplicate your record in the target table and set that field in the second record from the script variable.
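
The same idea sketched in Python (in FileMaker you would use ValueCount() and GetValue() on the return-separated field, then duplicate the target record); the function and field names here are only illustrative:

def explode_field(record, field_name):
    # A field holding more than one value (one per line) becomes one record
    # per value; every other field is copied unchanged.
    values = [v for v in record.get(field_name, "").split("\n") if v.strip()]
    if not values:
        return [record]
    copies = []
    for value in values:
        copy = dict(record)
        copy[field_name] = value
        copies.append(copy)
    return copies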

This is certainly not beginner stuff, but it can be worked through logically, and you show no lack of intelligence; you are already thinking ahead. The attached file is called Transactions because that test file was created in response to downloading online transactions. Note that the top table occurrence group is for creating records and the second table occurrence group is your standard structure.

UPDATE: Ideally, of course, you would use ProductIDs online as well as CustomerIDs. I used customer name and product name because they were easier to display in the demo AND because the demo was created from sample data with no base information to begin with.

Transactions.zip

Edited by Guest
Added update
Posted

Thanks LaRetta!

That demo was well done - I get the gist of it... very clear. The script was well commented - I'm stoked.

Your response puts a lot in perspective and gives me some hope that getting the data in and out is possible - even for someone like me. Working out the logic will be the area I wrestle with the most, I suspect.

While I do not have a situation I am trying to solve currently, my concern was saddling myself with a burden when moving data in and out.

Once again - thanks!
