
This topic is 6769 days old. Please don't post here. Open a new topic instead.

Recommended Posts

Posted (edited)

Hi,

Just a simple question:

How can I create a primary key composed of several fields?

My goal is to avoid duplicate records in the table. The user should not be able to enter the same record twice, but the value in any one field can be reused in another record. The table is composed of 5 or more fields:

Tab1

Id_tab

Year

StartSession

Duration

Description

The primary key is supposed to be composed of Year + StartSession + Duration – the combination is unique, but individually each field's value can be reused in another record:

2006 01/01/2006 10 Regular

2006 05/01/2006 15 Summer

……

I set each of the fields that make up the pk to unique, but when I enter a new record and populate the fields one after another, FM warns me on each one – for example: "this field should be unique, do you want to override?" I say yes, since the year 2006 can be reused, and move on to the next field. In the end, the record as a whole isn't unique at all. Achieving this is easy with Access/SQL Server or Oracle, but how can I get it done in FM8? FM doesn't have events like Access that are triggered and can be used to control this kind of thing, and a layout is always tied to its table.
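For comparison, here is what that "easy" SQL-side solution looks like – a multi-column UNIQUE constraint – sketched with Python's built-in sqlite3 (the schema names are illustrative, adapted from the table above):

```python
import sqlite3

# Hypothetical schema mirroring the poster's Tab1. In SQL engines a
# composite key is declared as a multi-column UNIQUE constraint.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE tab1 (
        id_tab      INTEGER PRIMARY KEY AUTOINCREMENT,
        year        INTEGER,
        start_sess  TEXT,
        duration    INTEGER,
        description TEXT,
        UNIQUE (year, start_sess, duration)
    )
""")

conn.execute("INSERT INTO tab1 (year, start_sess, duration, description) "
             "VALUES (2006, '01/01/2006', 10, 'Regular')")
conn.execute("INSERT INTO tab1 (year, start_sess, duration, description) "
             "VALUES (2006, '05/01/2006', 15, 'Summer')")

# Re-entering the same combination is rejected, even though each value
# on its own already appears in other records.
try:
    conn.execute("INSERT INTO tab1 (year, start_sess, duration, description) "
                 "VALUES (2006, '05/01/2006', 15, 'Summer')")
except sqlite3.IntegrityError:
    print("duplicate combination rejected")
```

Note that the engine checks the whole tuple at commit time, which is exactly what FM8's per-field "unique" validation cannot express.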

Maybe there’s another way to prevent duplicate records at entry time?

Any ideas?

Edited by Guest
Posted

You do not need to have your primary key do double-duty like this; in fact, it's a recipe for you to lose your hair (or turn it grey).

Rather than using a complex formula to generate your key field, instead set up a separate serial number primary key that is auto-entered and never changed or seen.

You can present the user with a calculated field that looks like you've got it--just don't make it the key field or build relationships on it.

If you need to be sure that the combination of entries doesn't repeat, use a validation routine to check for duplicate entries, either scripted or via self-join relationships.
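The scripted variant of that validation can be sketched in plain Python – a hypothetical stand-in for a FileMaker script, with illustrative field names:

```python
# Before committing a new entry, check whether the same combination of
# key fields already exists in the table. The field names are
# illustrative, based on the poster's table definition.
KEY_FIELDS = ("year", "start_session", "duration")

def is_duplicate(records, candidate):
    """True if the candidate's key-field combination already exists."""
    key = tuple(candidate[f] for f in KEY_FIELDS)
    return any(tuple(r[f] for f in KEY_FIELDS) == key for r in records)

records = [
    {"year": 2006, "start_session": "01/01/2006", "duration": 10, "desc": "Regular"},
    {"year": 2006, "start_session": "05/01/2006", "duration": 15, "desc": "Summer"},
]

# Same combination as record 2 -> reject before the record is written.
print(is_duplicate(records, {"year": 2006, "start_session": "05/01/2006",
                             "duration": 15, "desc": "Summer"}))  # True
```

The self-join approach does the same comparison declaratively: the relationship matches on the combined key, and a non-empty related set means a duplicate exists.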

HTH,

David

Posted (edited)

Thanks, but … I do have a serial-number pk in this table (not shown here), but it doesn’t protect me from duplicate records. Let’s say I have this table with records:

Tab1

Id_tab (serial number – auto-entered)

Year (date)

StartSession (date)

Duration (number)

Description (txt)

….

and the records:

Id Year StartSession Duration Desc

1 2006 01/01/2006 10 Regular

2 2006 05/01/2006 15 Summer

Now there is nothing to stop me from entering the same data a second time, even if the fields Year, StartSession, Duration, and Desc are each set to unique in the field definitions. It’s the combination that must be unique – Year + StartSession + Duration + Desc.

Example 1 (this is not a correct table – record 3 duplicates record 2):

Id Year StartSession Duration Desc

1 2006 01/01/2006 10 Regular

2 2006 05/01/2006 15 Summer

3 2006 05/01/2006 15 Summer

but this is allowed (a correct table – no duplicate records):

Example 2:

Id Year StartSession Duration Desc

1 2006 01/01/2006 10 Regular

2 2006 05/01/2006 15 Summer

3 2006 05/03/2006 15 Summer

Record 3 is different from record 2.

Maybe there’s something I’m missing here, since I’m new to the world of FM. How can self-join relationships protect me from duplicate records?

Either way, the problem is how to prevent duplicate records at the time of entry.
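The distinction the two examples draw can be sketched in Python (a hypothetical illustration, not FileMaker code; field names shortened):

```python
# Why per-field "unique" validation fails here: each value of record 3 in
# example 2 already appears somewhere in the table, yet the combination
# of values is new and the record should be accepted.
def value_already_used(records, candidate, field):
    """FM8-style per-field check: is this single value already present?"""
    return candidate[field] in {r[field] for r in records}

def combination_exists(records, candidate, fields):
    """What the poster wants: is the whole tuple of values a repeat?"""
    key = tuple(candidate[f] for f in fields)
    return any(tuple(r[f] for f in fields) == key for r in records)

records = [
    {"year": 2006, "start": "01/01/2006", "dur": 10, "desc": "Regular"},
    {"year": 2006, "start": "05/01/2006", "dur": 15, "desc": "Summer"},
]
rec3 = {"year": 2006, "start": "05/03/2006", "dur": 15, "desc": "Summer"}
fields = ("year", "start", "dur", "desc")

# Per-field checks flag three of the four values as reused …
print([value_already_used(records, rec3, f) for f in fields])  # [True, False, True, True]
# … but the combination as a whole is new, so rec3 is a valid record.
print(combination_exists(records, rec3, fields))  # False
```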

Edited by Guest
Posted

David gave you the answer already:

If you set the id field to auto-enter a serial number (with modification prohibited), FileMaker will take care of the uniqueness issue for you, even for 'simultaneous' record creation.

You will also need to be aware of the issues involved with importing and record duplication.

-Raz

Posted (edited)

Wrong. The ID field in this table is indeed unique, but what I’m looking for is the uniqueness of the DATA in the record, not of the ID field itself. One field alone cannot guarantee that; I need several fields composing the primary key to be sure that the record entered is unique. In this case, the ID field isn’t really necessary at all. With Access you can select more than one field to compose the pk.

Please, have a look at the examples 1 and 2.

Example 1, records 2 and 3 – the ID is unique but the DATA is NOT.

Example 2, records 2 and 3 – the ID is unique and so is the DATA.

Edited by Guest
Posted (edited)

sotiris_2--

I'd just like to point out that your terminology here is confusing. Your subject line says "Primary Key composed"--but you state that this calculation you're doing is NOT the primary key. Some of the confusion here is due to that fact.

What you're really referring to here is a question of data validation, which is a different question. Your business rules state that a certain combination of data elements must in aggregate be unique. That's not the same as having a unique primary key field (although it is similar).

Comment has referred you to one way of handling the sort of validation you're seeking; another way that is commonly offered is to use a script to validate your data, either with or without the use of global fields as interim data stores. [In this approach, you create a set of globals to hold the data, check them using a script, and write the record only after the data checks out. Given the amount of work involved, I prefer the self-join approach, not that I've used it.]

Comment--In looking through that other thread, I remembered I had wanted to ask you whether you had observed any negative performance issues with using the self-join method to validate? Does this degrade in large data sets? I ask because I have seen slower performance for example when I use a GTRR instead of scripting a find.

Also--I noticed that when I use the example you provided in a portal situation, the validation trigger didn't occur until the parent record got committed. In other words, I open a client record with a portal for appointments. I click in the portal and add a particular date and time, and then go to the next portal line, enter the same appointment info, and repeat this as many times as I want. The validation doesn't trigger (in my test, anyway) until I commit the person record--at which point it triggers for each duplicate in the portal. What has to be different?

David

validate.fp7.zip

Edited by Guest
Posted

Any test of uniqueness will get slower as the data set grows - but I have a reason to believe (and no good way to prove) that testing for uniqueness in a large number of small data sets (many parents with few children each) remains fast.

Yes, sadly, children in a portal are committed/reverted together with the parent, and I don't know of a good way around that - other than opening a new window for entering the child data (and also committing the parent).

Another caveat of this method (as described in the other thread) is that a child is related to itself, so re-entering the same data triggers the validation. This can be solved by adding Child::ChildID ≠ Duplicate::ChildID to the relationship's criteria.
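That self-exclusion rule can be sketched as follows (hypothetical Python, not the FileMaker relationship itself; names illustrative):

```python
# When re-validating an existing record, ignore matches against the
# record's own id -- the analogue of adding
# Child::ChildID != Duplicate::ChildID to the relationship's criteria.
def has_duplicate(records, candidate, key_fields):
    key = tuple(candidate[f] for f in key_fields)
    return any(
        r["id"] != candidate["id"]
        and tuple(r[f] for f in key_fields) == key
        for r in records
    )

records = [
    {"id": 1, "year": 2006, "start": "01/01/2006", "dur": 10},
    {"id": 2, "year": 2006, "start": "05/01/2006", "dur": 15},
]

# Record 2, re-saved unchanged, no longer matches itself:
print(has_duplicate(records, records[1], ("year", "start", "dur")))  # False
# A different record with the same data still triggers the check:
print(has_duplicate(records, {"id": 3, "year": 2006,
                              "start": "05/01/2006", "dur": 15},
                    ("year", "start", "dur")))  # True
```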

Posted

Thank you guys,

I’ll try something – maybe the approach with the popup window. It really sucks not having events to control what you’re doing. This problem is easy to solve with Access/Oracle/MS SQL, but that’s not the case here.

I just started with FM, so I’ll have to make do.

Ps

David – your file is corrupted

