Posted

Hi there

I have a central data set which consists of similar data on a repeating basis for all customers. Data can be grouped by customer, group, status and type. What I am looking to do is flag the first record of each unique 'batch' of data (where customerID, group, status and type are all the same), and then do the same at the first change in one of those fields. This can then be used in a portal filtered by 'Unique=1' (I've sketched that filter below the sample data). The data is in the format set out below, with the 'Unique' column showing what I'm after:

Hope that makes sense - thanks!

CustID  Code  Date        Status   Type        Unique
1010    1     31/12/2014  Primary  High Level  1
1010    2     31/12/2014  Primary  High Level
1010    3     31/12/2014  Primary  High Level
1010    1     31/01/2015  Primary  Low Level   1
1010    2     31/01/2015  Primary  Low Level
1010    3     31/01/2015  Primary  Low Level
1012    1     31/12/2014  Primary  High Level  1
1012    2     31/12/2014  Primary  High Level
1012    3     31/12/2014  Primary  High Level
1013    1     31/01/2015  Primary  Low Level   1
1013    2     31/01/2015  Primary  Low Level
1013    3     31/01/2015  Primary  Low Level
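
For reference, the portal filter I have in mind would be something along these lines ('Data' is just a placeholder for the table occurrence the portal is based on):

    // Portal filter calculation on the customer layout (sketch only)
    Data::Unique = 1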

 

Posted (edited)

Hope that makes sense

Not entirely - because you say "customerID, group, status and type" but there is no group field in the table you show us.

In any case, if your goal is to show only unique records in a portal, you should be asking how to do that, not how to flag records. The question then becomes how many records you expect to have in this table overall (to determine whether this can be done by portal filtering alone). And which table will be the "parent" table for this portal?

Edited by comment
Posted

Comment, 

Sorry for bad terminology - it should have read customerID, date, status, type.  No group!

I've got around 170,000 records in the table at the moment, and this will always be increasing. The parent table will be the customer table, with a layout showing the data portal; each portal row should return just the top row of a given data set.

The issue is that the data might not always be consistent from customer to customer - for example, a record for Code 1 may exist on one customer, but not on the next. 

The logic would be to search for a change in customerID, mark the first row, then, within any further records for that customerID, search for changes in the combination of date, status and type and mark the first row each time one occurs.

Glancing at a sample dataset on a spreadsheet makes it seem easy(!) but I'm struggling somewhat to automate it.
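
For illustration, here is a minimal sketch of that flag as an unstored calculation over a self-join, rather than a stored mark - the relationship and field names are only placeholders:

    // Unstored calculation for the 'Unique' flag (sketch only)
    // Data_sameBatch = assumed self-join of the data table on
    //   CustID = CustID, Date = Date, Status = Status, Type = Type
    // SerialID = assumed auto-enter serial number field
    If ( SerialID = Min ( Data_sameBatch::SerialID ) ; 1 ; "" )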

Thanks in advance

 

Posted

I've got around 170,000 records in the table at the moment, and this will always be increasing. The parent table will be the customer table, with a layout showing the data portal; each portal row should return just the top row of a given data set.

Well, the real question in such a case would be how many records a single customer will have - but I guess with 170k and growing as the overall total, the answer would be too large for portal filtering in any case.

 

The logic would be to search for a change in customerID, mark the first row, then, within any further records for that customerID, search for changes in the combination of date, status and type and mark the first row each time one occurs.

No, I don't think that's a good method, for several reasons. If you want to perform a search, then show the results as a found set. If you want to show records in a portal, then the records should be selected by a relationship, not by performing a find. In either case, no marking of records should be necessary. Marking records is bad practice: it requires un-marking and re-marking as your data keeps changing, it depends on records not being locked by another user, and it modifies records unnecessarily when there has been no change in the actual data that the record represents.

With very large amounts of data (I don't think you're there yet), you may consider adding a summary table where each group has a single entry, and populating this table periodically, adding entries for groups that aren't going to change anymore.
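
As a very rough sketch of that periodic population (layout, table and field names below are assumptions, and it ignores both duplicate-checking on re-runs and the "only groups that won't change anymore" refinement), a looping script could walk the sorted data table and create one summary record per batch:

    # Sketch only - assumes the data layout's records are sorted by
    # CustID, Date, Status, Type
    Go to Layout [ "Data" ]
    Show All Records
    Sort Records [ Restore ; With dialog: Off ]
    Go to Record/Request/Page [ First ]
    Set Variable [ $lastKey ; Value: "" ]
    Loop
      Set Variable [ $key ; Value: Data::CustID & "|" & Data::Date & "|" & Data::Status & "|" & Data::Type ]
      If [ $key ≠ $lastKey ]
        # first row of a new batch - add an entry to the summary table
        Go to Layout [ "Summary" ]
        New Record/Request
        Set Field [ Summary::BatchKey ; $key ]
        Go to Layout [ "Data" ]
        Set Variable [ $lastKey ; Value: $key ]
      End If
      Go to Record/Request/Page [ Next ; Exit after last: On ]
    End Loop

The sort matters: the loop only detects a batch boundary when all records of a batch are adjacent.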

