Leb i Sol Posted July 2, 2007 Hello everyone, I am compounding 2 fields (date, time) in order to create a duplicate-records flag. Question: by just doing a simple compound of c_flag = date & time, the result is e.g. 6/25/20079:43:13PM. Would it be better (or possible) to have the calculation field hold 625200794313PM? I guess I would be looking at a 'DateToText' conversion. Again, this compound is to be indexed, so my line of thought was that a text field without the "/" and ":" characters would 'index better'. Thoughts? Many thanks!
comment Posted July 2, 2007 Why don't you use the Timestamp() function, with the result type set to Timestamp?
Leb i Sol (Author) Posted July 2, 2007 Hi Comment, This gives the same result as the basic compound... not sure if it matters, but the 'date' and 'time' fields are stored as calculated text. (BTW, thanks for post #257075.) Thanks a bunch!
comment Posted July 2, 2007 "This gives the same result as the basic compound" Not really - the result, if kept as type Timestamp, is basically a number (the number of seconds elapsed since 1/1/0001 0:00:00). It is only DISPLAYED as a date & time concatenation. This should not only address your indexing concerns, it will also sort correctly. "not sure if it matters but 'date' and 'time' fields are stored as calc text." Yes, it matters. Why is this a good thing? The best way, I think, would be to change the calcs to result in date and time, respectively. In a pinch you could use: Timestamp ( GetAsDate ( PseudoDate ) ; GetAsTime ( PseudoTime ) ) but this relies on the file's date & time formats.
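The point about a timestamp being "basically a number" can be sketched outside FileMaker. This Python snippet (illustrative only; Python counts seconds since the Unix epoch, whereas FileMaker counts from 1/1/0001, but the principle is identical) shows why concatenated date/time text sorts and indexes poorly while a numeric timestamp compares chronologically:

```python
from datetime import datetime

# Three events in February, June, and October of the same year.
events = [
    datetime(2007, 10, 1, 9, 0, 0),
    datetime(2007, 2, 1, 9, 0, 0),
    datetime(2007, 6, 25, 21, 43, 13),
]

# As M/D/YYYY text, comparison is character-by-character, so
# "10/..." sorts BEFORE "2/..." — not chronological order:
as_text = sorted(["10/1/2007", "2/1/2007", "6/25/2007"])
print(as_text)  # ['10/1/2007', '2/1/2007', '6/25/2007']

# Stored as numbers (seconds since an epoch), order IS chronological:
as_numbers = sorted(e.timestamp() for e in events)
print([datetime.fromtimestamp(t).month for t in as_numbers])  # [2, 6, 10]
```

This is why a true Timestamp-type field both indexes and sorts correctly without stripping out the "/" and ":" characters by hand.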
Leb i Sol (Author) Posted July 2, 2007 "but this relies on the file's date & time formats." Now it makes sense, thank you. After I ran a few tests I realised that my compound would not be unique enough, so I may have to cram in a few more fields. Knowing this definitely helps. Thanks again!
Leb i Sol (Author) Posted July 2, 2007 Hi Comment, Hopefully I can get your attention to this post again. After I added a few more fields and essentially 'broke' the date/time format, the timestamp no longer applies. However, I can still index the fields... but something tells me that this is not a strong index, especially when dealing with 100,000s of records. File attached. Any ideas? Thanks everyone for your attention to this post. All the best! TimeCompound.zip
Raybaudi Posted July 2, 2007 Hi, what about "comp_UniqueID" = RawString & " " & Get ( RecordID ) ?
Leb i Sol (Author) Posted July 2, 2007 Still the same issue of format and the ':' and '/' characters. Maybe I am just looking at it too deeply, as either way it can be indexed. Thanks!
comment Posted July 2, 2007 I am guessing you want to flag duplicates based on the exact time, like this? TimeCompound.fp7.zip
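The attached file's approach, flagging every row that shares an exact date and time with another row, can be sketched outside FileMaker. This Python snippet (illustrative only; the sample rows are made up) counts occurrences of each (date, time) pair, which has the same effect as a self-join on a timestamp matchfield:

```python
from collections import Counter

# Hypothetical log rows reduced to their (date, time) pair.
rows = [
    ("6/25/2007", "9:43:13 PM"),
    ("6/25/2007", "9:43:13 PM"),
    ("6/25/2007", "9:43:14 PM"),
]

# Any pair occurring more than once marks ALL of its rows as duplicates.
counts = Counter(rows)
flags = ["duplicate" if counts[r] > 1 else "" for r in rows]
print(flags)  # ['duplicate', 'duplicate', '']
```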
Leb i Sol (Author) Posted July 2, 2007 Hi comment, That is exactly what I was trying to accomplish. After running a few examples I was amazed that the log file (RawString) has multiple entries down to the millisecond. May have to cram yet another field in there to make a 'unique id' out of it. This is great! Thanks again!!!
comment Posted July 2, 2007 Ultimately, the question of what constitutes a duplicate is not a FileMaker question - it's a decision you need to make, based on your understanding of the meaning of the data. If you want to force a unique ID per record, use an auto-entered serial ID - this will be "unique" even if ALL other data is duplicated in another record. BTW, there's no need to concatenate fields - you can simply add more matchfield pairs to the relationship.
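The advice to match on multiple field pairs rather than a concatenated text field can be illustrated with a quick sketch (Python tuples standing in for matchfield pairs; the values are made up): concatenated text can collide even when the underlying fields differ, because the boundary between the fields is lost.

```python
# Two DIFFERENT (field1, field2) pairs that concatenate to identical text:
a = ("6/2", "52007")
b = ("6/25", "2007")

assert "".join(a) == "".join(b)  # concatenation collides: both give "6/252007"
assert a != b                    # separate matchfields keep them distinct
print("collision on:", "".join(a))
```

Multiple matchfield pairs compare each field independently, so this class of false duplicate cannot occur.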
Leb i Sol (Author) Posted July 2, 2007 "BTW, there's no need to concatenate fields - you can simply add more matchfield pairs to the relationship." Yes, I do have to determine what constitutes dups... and FM opened a whole other discussion. Strangely enough, the manufacturer of the device that produced these logs (RawStrings) was not even aware (or not admitting) that their 'reporting' was completely misleading. After I added the measure of milliseconds, I was still able to find skipped heartbeats in their logs and unjustified duplicates. Sort of re-inventing the wheel. Much obliged for your patience!