
Any Logging Features In 7?


This topic is 7211 days old. Please don't post here. Open a new topic instead.

Recommended Posts


I know they beefed up the security features in 7, but I was wondering if there's any logging.

Example:

User      Record    Field they edited    From        To
--------------------------------------------------------
Me        1         Name                 Myself      Me

Is anything like that going to be in FM7 or Server?
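For readers outside FileMaker, the kind of audit trail being asked about boils down to appending a (user, record, field, old value, new value) row whenever a field is edited. A minimal sketch in Python (the class and method names are invented for illustration, not any FileMaker API):

```python
from dataclasses import dataclass

@dataclass
class AuditEntry:
    user: str
    record_id: int
    field_name: str
    old_value: str
    new_value: str

class AuditedRecord:
    """Toy record that appends an audit entry on every field edit."""

    def __init__(self, record_id, data, user):
        self.record_id = record_id
        self.data = dict(data)
        self.user = user
        self.log = []

    def set(self, field_name, new_value):
        old = self.data.get(field_name, "")
        if new_value != old:
            # Record who changed what, from which value to which.
            self.log.append(AuditEntry(self.user, self.record_id,
                                       field_name, old, new_value))
            self.data[field_name] = new_value

# The example from the post: user "Me" edits record 1's Name field.
rec = AuditedRecord(1, {"Name": "Myself"}, user="Me")
rec.set("Name", "Me")
```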


AvrioTech said:

Yeah, I built one in FM6, but it should be easier to do now in FM7. I was hoping I wouldn't have to do it at all, though.

It is, as you say, easier to do in FMP v7. The example that Bruce referred to shows a way that an unlimited number of input fields can be logged in v7 with the addition of only two fields per table.


  • 4 months later...

I implemented logging on my db today, based on Cobalt's audit example (Nightwing Enterprises). Just a quick follow-up comment on Cobalt's approach, in case you are thinking about implementing one.

Cobalt uses 2 fields for logging: log_txt & logFeed_txt. The logFeed_txt field is only for holding the previous value of the field.

If your table has a large number of fields (over 30), layouts based on that table will slow down significantly because of the calculation in logFeed_txt. In a table with 30 fields, after changing one value, it takes 3 seconds for the log entry to finish processing.

It is faster and more efficient to take out logFeed_txt and modify the calculation in log_txt to exclude it. It is not necessary to keep track of the previous entry, because the log itself already holds that information.
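The trade-off described here can be sketched in language-neutral terms (Python standing in for FileMaker calcs; the function and variable names are made up for illustration). The feed-based variant rebuilds a snapshot of every field on each edit, so its cost grows with table width, while the direct variant reads only the edited field's old value:

```python
def log_with_feed(fields, feed, edited, new_value, log):
    """Two-field style: a 'feed' snapshot of every field is kept so the
    previous value can be read from it, then refreshed on each edit."""
    old = feed.get(edited, "")
    log.append((edited, old, new_value))
    fields[edited] = new_value
    feed.clear()
    feed.update(fields)   # O(n) over all fields on every edit

def log_direct(fields, edited, new_value, log):
    """Single-field style: the previous value is read directly from the
    edited field, so cost per edit is independent of table width."""
    old = fields.get(edited, "")
    log.append((edited, old, new_value))
    fields[edited] = new_value
```

Both produce the same (field, old, new) log rows; the difference is purely in how much work each edit triggers.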

Link to comment
Share on other sites

Interesting comments, H.P., but that is not my experience.

The demo file itself is set up to log 80 fields and, as you will have seen, it updates without any perceptible delay. In another implementation here, 240 fields are logged in a single layout and the logs update in well under half a second; nothing remotely approaching the three-second delay you are reporting.

I suspect that there are other things about your implementation that are causing the delay you are seeing. At a wild guess, perhaps you included some unstored calcs or summary fields among those referenced in the formula for the log feed calculation (only data entry fields should be included). If not this, then some other anomaly will account for the extraordinary delay you were seeing.

As an aside, the two-field architecture has a natural limit of around 250 input fields (depending on the data types of the fields and the length of the field names) before the 30,000-character limit on the length of a calc expression is reached. An alternative architecture is therefore required for implementations which have more than around 250 fields per table. However, this is a structural issue and is unrelated to performance.
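As a rough back-of-the-envelope check of that ceiling (the per-field character figure below is an assumption for illustration, not a measured value):

```python
# FileMaker 7's limit on a single calc expression, per the post above.
CALC_EXPR_LIMIT = 30_000

# Assumed average expression text consumed per logged field (field name
# plus surrounding formula boilerplate) -- an illustrative guess.
CHARS_PER_FIELD = 120

print(CALC_EXPR_LIMIT // CHARS_PER_FIELD)  # → 250
```

Longer field names or more elaborate per-field formulas would push the ceiling lower, which matches the "depending on the data types and the length of the field names" caveat.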


You are right that calculation fields are the main cause of the problem. The table I'm logging has quite a number of unstored calculation fields, and I was testing with over 10,000 records. It was much faster when I excluded those fields.

Because I have to log everything on the table, I am forced to include those fields in the log. In cases where it is necessary to include calculation fields, or if the solution is slow with the log, I think it is still better to exclude the logFeed field.

Also, as a side note: if anyone needs to log a large table with a huge number of fields, instead of manually typing the field names into the calculation fields, you can generate them in the required format with a script that loops through all the field names on the table or layout (using the FieldNames function) and builds the expression for the log_txt or logFeed_txt fields. This should save you a lot of time.
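That generation step can be sketched like this (Python standing in for a FileMaker script; the Case(...) expression template is invented for illustration and is not Cobalt's actual formula):

```python
def build_log_calc(field_names):
    """Build a calc-style expression that lists each non-empty field
    and its value, one clause per field, joined with '&'."""
    clauses = []
    for name in field_names:
        # "¶" is FileMaker's return character; the template is illustrative.
        clauses.append(
            f'Case(not IsEmpty({name}); "{name}: " & {name} & "¶")'
        )
    return " &\n".join(clauses)

# In FileMaker the list would come from the FieldNames function;
# here it is hard-coded for the sketch.
expr = build_log_calc(["Name", "Address", "Phone"])
```

The generated text would then be pasted into the log_txt (or logFeed_txt) calculation definition, rather than typing 80-odd clauses by hand.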

Anyway, thank you for creating a great example for us to copy.

Best regards,

H.P.


Hi H.P.,

The AuditTracker method (and others like it) is constructed to log user inputs only; by definition, it captures only data relating to the field the cursor is in when an edit occurs.

Since calc and summary fields cannot be edited directly, the cursor cannot *ever* be in a calc or summary field when an edit takes place. Ipso facto it is not possible to 'log' calc or summary fields by this method. If you include them in the log calcs they will slow the process down to no benefit, since they won't ever appear in the log output.

In fact, since calcs and summary fields are derived data, if you capture the user input values you should always be able to reconstruct the corresponding derived values.

If you really need to 'log' the derived values with each edit, then a logging mechanism could be built to do that (but it should draw the values off directly, not via a feeder field). However, there will inevitably be some performance penalty if there are complex unstored calcs being logged, since the logging process will require that they be evaluated more than once with each edit. Some care with optimisation (of the original calcs and the structure, as well as the logging mechanism) could make a considerable difference to performance with such a set-up.


