

Recommended Posts

Posted

Yes, two successively performed imports: into an auxiliary table still inside the solution, and back again into the source table, while remembering to delete all records in the auxiliary table first.
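
A minimal sketch of that two-import round trip, assuming a layout "Tasks" on the source table and a layout "Tasks_Aux" on the auxiliary table (both layout names, and the matching-names import mapping, are only assumptions):

    # empty the auxiliary table before reusing it
    Go to Layout [ "Tasks_Aux" ]
    Show All Records
    Delete All Records [ No dialog ]
    # pull the current found set of tasks into the auxiliary table
    Import Records [ No dialog ; this file ; Tasks -> Tasks_Aux ; matching names ]
    # push the copies back into the source table
    Go to Layout [ "Tasks" ]
    Import Records [ No dialog ; this file ; Tasks_Aux -> Tasks ; matching names ]

The second import leaves the new copies as the found set, so a Replace Field Contents on the project key could then point them all at the new project.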

But by and large it is structurally wrong unless it's historical data; instead, the same data can be shown from portal to portal by stacking all the foreign keys as a pilcrow-delimited list, with the foreign key made a text field.
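
To make the stacking idea concrete (the key values below are invented): the child record's text foreign key simply holds one parent key per line,

    Tasks::ProjectID_fk = "P001¶P014¶P023"

and an ordinary relationship Tasks::ProjectID_fk = Projects::ProjectID then matches the task to each of the three projects, because FileMaker treats every line of a return-delimited key field as a separate match value.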

--sd

Posted

Hi Tom, and welcome to the Forum.

What you want to do can probably be accomplished with a script, but it might be better accomplished using a different approach. Why not tell us what you are trying to accomplish, i.e., what you have now and what you want as your end result? It might also be helpful, to us and to you, if you attached a copy of the file involved.

HTH

Lee

  • Newbies
Posted

Thanks, Lee. I'm building a simple task management file. The projects we do often have all the same tasks. So, I'd like to copy all of the tasks in one swoop instead of duplicating each task.

Posted

I would use a script. The number of fields, and the number of records whose data needs to move, would dictate which steps I'd use and how I would move through the record or records (a rough sketch follows the questions below).

1) Is it only the current record being duplicated?

2) Is it all fields on the record, or just a few? (e.g., 10 fields in the record, but you only need the data from fields 1, 2, 5, etc.)

3) Is it the same data for all records? (i.e., duplicate one record several times)
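
If, say, the answer to 1) is yes and only a couple of fields need to travel, the script can stay very small. A sketch under those assumptions (the Tasks field names are made up):

    # carry the values of the fields you care about
    Set Variable [ $name ; Value: Tasks::TaskName ]
    Set Variable [ $due ; Value: Tasks::DueDate ]
    # create the copy and write the values back
    New Record/Request
    Set Field [ Tasks::TaskName ; $name ]
    Set Field [ Tasks::DueDate ; $due ]
    Commit Records/Requests [ No dialog ]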

HTH

Lee

Posted (edited)

"copy all of the tasks in one swoop instead of duplicating each task."

Unless this is pages of data, there is nothing wrong with looping; it hardly ever becomes a daunting task to wait for.
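
For what it's worth, a looping sketch along those lines might first gather the template tasks into a variable and then write them back under the new project's key (the layout, field, and variable names here are only assumptions):

    # first pass: collect the task names from the found set of template tasks
    Go to Record/Request/Page [ First ]
    Loop
       Set Variable [ $list ; Value: If ( IsEmpty ( $list ) ; Tasks::TaskName ; $list & ¶ & Tasks::TaskName ) ]
       Go to Record/Request/Page [ Next ; Exit after last ]
    End Loop
    # second pass: create one new task per collected name, pointed at the new project
    Set Variable [ $i ; Value: 0 ]
    Loop
       Set Variable [ $i ; Value: $i + 1 ]
       Exit Loop If [ $i > ValueCount ( $list ) ]
       New Record/Request
       Set Field [ Tasks::ProjectID_fk ; $newProjectID ]
       Set Field [ Tasks::TaskName ; GetValue ( $list ; $i ) ]
    End Loop
    Commit Records/Requests [ No dialog ]

If more than one field has to travel, the same pattern works with one list variable per field, or with the two-import approach described earlier in the thread.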

Take a look at the Renewal template I've made in this thread:

http://www.fmforums.com/forum/showtopic.php?tid/175845/post/228545/hl//

But Lee's questions are still utterly relevant, since we wish to teach normalization as often as possible. If you deal with historic data, real duplication is king, but often several keys could be stacked in the foreign key field ...to obtain better integrity.

So if you think that you can learn to normalize later, and your records are not part of any portal or the like, you could follow the algorithm in the image attached to this post.

--sd

[Attached image: Billede_1.jpg]

Edited by Guest

