FMForums.com

Bringing Tables Together?


Hello Everyone!

I've seen some questions here that address similar concerns, but I can't quite find the answer to this problem:

I have a table listing demographic information about students. Each student has a unique ID, which I am using as a key. This table has several hundred fields. So far, so good.

In addition, I have a series of about 20 additional csv files that contain credit enrollment information for each student. These are also keyed by student ID. Each file has about 150 fields. The first six fields contain data unique to that file (student program and type as of the date the file was created, etc.). The remaining 140 or so fields appear in the same order and contain the same information in every file, so I need to import those 140 fields from the most recent file only.

I would like to pull all the information in the series of 20 credit enrollment files into one table, so that I have a series of records keyed by student id, each containing all the information about any particular student. Conceptually, I think I understand how to do this: Import the files one at a time, being sure that I am matching the appropriate fields for those first six variables at each import.

However, in actual practice, this is a nightmare. With so many hundreds of fields, it's very hard to match the fields. There's no easy way to check for errors, since values can and do jump around (in other words, it's not as easy as it would be if I could tell at a glance whether a last name was matched to a field for address).

How can I streamline this process? Should I import those 20 or so files as separate tables, then try to combine them? If so, how would I go about that? Or should I just slog through the extremely tedious--and error-prone--process of moving those fields around, one at a time...?

Hi Charles!

A comment and some questions:

In a normalized solution, most tables have under 30 fields. Back-end tables might have 10 or fewer fields, whereas hub tables with complex interfaces might have a couple hundred fields. My thought is that there may be some optimization possible by normalizing your tables further.

Now, can you tell us more about those "20 additional csv files"? Are there 20 files per student? What distinguishes them (and why are there 150 fields in them)? Is this a one-time import, or will you need to do this periodically? What's the source of this data?

Hey Ender! Thanks for the quick and helpful response.

I will comb through that first file with all those fields. Some of the data bits are probably not necessary (several slightly different fields for home country, for example).

On to your bigger question: Those additional 20 csv files are a bit odd. There are 20 total, with a record in each for each student. The first 6 fields in each of the 20 files are different from file to file, and represent information current as of a specific point in time. I need to capture that information, but the remaining fields are identical in each file, so I need to capture those only once. Here's a rough illustration:

File #1:
  Field A: needed
  Field B: needed
  Field C: needed
  Field D: needed
  Field E: needed
  Field F: needed
  Fields G-End: need to capture only from File #20.

File #2:
  Field A: needed
  Field B: needed
  Field C: needed
  Field D: needed
  Field E: needed
  Field F: needed
  Fields G-End: need to capture only from File #20.

In theory, I guess I _could_ manually drag all those field definitions around at import, but that would be a monumental pain, and would likely induce errors.

Should I strip the unneeded fields from files 1-19 in Excel, then export them as csv files with just the needed fields? This seems like a very labor-intensive process, too, and I would rather let FM do the work.
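For what it's worth, that pre-processing step doesn't have to happen by hand in Excel. Here's a minimal sketch of the merge in Python; the function name, file contents, and layout are assumptions, not from the thread: the first column is taken to be the student ID, the next six columns are the per-file fields, and the files are supplied in chronological order so the last one contributes the shared fields.

```python
import csv
import io

def merge_enrollment(csv_texts):
    """Merge the enrollment CSVs: keep the six per-file fields from every
    file (suffixed with the file number) and the shared remaining fields
    from the last file only. Assumes column 1 is the student ID."""
    merged = {}  # student ID -> combined record
    last = len(csv_texts)
    for n, text in enumerate(csv_texts, start=1):
        reader = csv.reader(io.StringIO(text))
        header = next(reader)
        for row in reader:
            rec = merged.setdefault(row[0], {header[0]: row[0]})
            # The six per-file fields, renamed with the file number
            for name, value in zip(header[1:7], row[1:7]):
                rec[f"{name}_{n:02d}"] = value
            # The shared fields, taken from the most recent file only
            if n == last:
                rec.update(zip(header[7:], row[7:]))
    return merged

# Two toy "files" with one shared field (G) after the per-file block
file1 = "ID,A,B,C,D,E,F,G\n123,a1,b1,c1,d1,e1,f1,old\n"
file2 = "ID,A,B,C,D,E,F,G\n123,a2,b2,c2,d2,e2,f2,new\n"
result = merge_enrollment([file1, file2])
print(result["123"]["A_01"], result["123"]["A_02"], result["123"]["G"])
# → a1 a2 new
```

For the real files you would read the 20 CSVs from disk in date order, pass their text through this function, and write the merged records out with `csv.DictWriter` as a single combined file, which FileMaker could then import in one pass with one field-mapping to check.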

By the way, this massive import is a one-time process. After this, I'll need to import new files one at a time, every few weeks.
