
Dealing with Special Characters



I'm pretty new to this, but why not set up a looping script that reads each record in turn and concatenates the contents of its fields into a global field? You can use the Substitute() function to deal with the special characters by replacing them, and use the same calculation to insert custom delimiters between fields and records.
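
For example, the export loop might look something like this in outline (the Contacts table, its fields, and the «FLD»/«REC» delimiter strings are only placeholders for illustration):

  Go to Record/Request/Page [ First ]
  Loop
    Set Field [ Contacts::gExport ; Contacts::gExport &
      Substitute ( Contacts::Name ; ["¶" ; "<RET>"] ; ["«" ; "<LTG>"] ) & "«FLD»" &
      Substitute ( Contacts::Notes ; ["¶" ; "<RET>"] ; ["«" ; "<LTG>"] ) & "«REC»" ]
    Go to Record/Request/Page [ Next ; Exit after last ]
  End Loop

The Substitute() calls swap the characters that would break the delimiters (returns, and the « character the delimiters themselves use) for harmless tokens that you can swap back on import.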

On import, you can pull the entire contents of your export file into a single global field, then use a calculation that walks through the data and routes it to the right places, using your custom delimiters as the map.

Use a step-by-step method that tells FM to read the contents of the global field starting at one custom delimiter and ending at the next. Between those start and stop delimiters is all of the data you want for one new record in the new database, with further custom delimiters marking each type of data. The script reads that chunk, extracts each piece of data in turn, and populates the fields in each table according to how you tell it to treat each type of data based on its delimiters. Using pattern searches and other FileMaker text functions to parse the data, you should be able to use the custom delimiters to separate every data type in the export so that FM7 can pipe it into the right tables and fields.
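
A parsing calculation for pulling out one record's chunk might look something like this, using a global counter field to track which record you are on (the Imports table and its fields are invented for the example):

  Let ( [
    end = Position ( Imports::gImport ; "«REC»" ; 1 ; Imports::gCounter ) ;
    start = If ( Imports::gCounter = 1 ; 0 ;
      Position ( Imports::gImport ; "«REC»" ; 1 ; Imports::gCounter - 1 ) + Length ( "«REC»" ) - 1 )
  ] ;
    Middle ( Imports::gImport ; start + 1 ; end - start - 1 )
  )

This returns everything between the previous «REC» delimiter and the next one; the same Position()/Middle() pattern, applied to «FLD», then splits the chunk into individual field values.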

You can use multiple scripts: have one script read the imported data and then hand it to another script, passing the data it read as a script parameter. The second script, the subscript, would then parse it, use Substitute() to restore any special characters that might have conflicted with your export or import, and then use Set Field to place the restored data into the target field.
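
In outline, the pair of scripts might look like this (script, table, and field names are made up for the example):

  Script "Read Import":
    Perform Script [ "Restore and Set" ; Parameter: Imports::gChunk ]

  Script "Restore and Set":
    Set Field [ Contacts::Notes ;
      Substitute ( Get ( ScriptParameter ) ; ["<RET>" ; "¶"] ; ["<LTG>" ; "«"] ) ]

The subscript's Substitute() simply reverses the replacements made during export before the data lands in its final field.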

I would recommend that you draw a diagram of the data flow for your import, and that you think through how you develop your relational keys.

One method that might be easier to manage from a visual point of view would be to create a dedicated table for the import. Populate it with global fields that correspond to fields in your various tables. Have your script parse the entries in your import and fill in those globals, then have another script generate the relational keys, transfer the field contents to their proper tables, set the relational keys, reset the global fields, increment the relational key for the next set of data, and loop until it is finished.
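
As a rough outline of that loop (all names are placeholders):

  Set Field [ Staging::gCounter ; 1 ]
  Loop
    Perform Script [ "Parse One Record" ]
    Set Field [ Staging::gKey ; Staging::gNextKey ]
    Perform Script [ "Create Target Record" ]
    Set Field [ Staging::gNextKey ; Staging::gNextKey + 1 ]
    Set Field [ Staging::gCounter ; Staging::gCounter + 1 ]
    Exit Loop If [ Staging::gCounter > Staging::gRecordCount ]
  End Loop

Here "Parse One Record" would fill the globals from the import data and "Create Target Record" would copy them into the real tables.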

Then create a set of custom reports to check the bottom lines, sums, and other financial calculations, starting with annual, semi-annual, quarterly, monthly, and weekly reports, and make sure they match your previous data under the old system. If they match, do a random sampling of the daily data. Better yet, plan ahead and include in the import all of the final calculated data you want to use to cross-check the new database; then use calculation fields to flag differences, and you can simply search for the records that don't match.
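
If you bring the old system's totals in as an OldTotal field next to the new system's NewTotal (both names hypothetical), the check field can be as simple as:

  If ( Invoices::NewTotal <> Invoices::OldTotal ; "MISMATCH" ; "" )

A find on MISMATCH then turns up exactly the records that need attention.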


"I'm pretty new to this, but why don't you just set up a looping script to read each record in turn and then concatenate the contents of the fields into a global field. You can use Substitute() functions to deal with the special characters by replacing them, as well as calculations to create a custom delimiter."

If you stop at the point of making the concatenated global field, you can just use Export Field Contents to get a usable import file. You do not need to replace the special characters or create another script to parse the data on import.
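
That is, once the concatenation loop finishes, a single script step along these lines writes the file (the field name and file name are illustrative):

  Export Field Contents [ Contacts::gExport ; "export.txt" ]

Export Field Contents writes the field's text out verbatim, so the custom delimiters survive intact and the special characters never have to be touched.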
