
Suggestions for restructuring DB


This topic is 4475 days old. Please don't post here. Open a new topic instead.

Recommended Posts

We have a "Daily Log" database at work that's a little buggy, and I'm thinking of restructuring it. I have a few ideas about how, but I'd like to get some suggestions before I start.

The database is used to log events in an industrial plant. Log entries are made for different pieces of equipment when they are started/stopped or maintenance is performed. The main table looks something like:

Daily Log

- Date
- Time
- User
- Log Comment
- Comment #

It seems pretty straightforward, but where it gets confusing is how to handle the navigation, viewing, and sorting to provide a user experience similar to making entries in a book of paper logs. The main layout should only show records for one day at a time (like turning the page in a log book). Buttons that trigger scripts switch the day being viewed (previous date/next date/first/last), jumping to the group of records for the requested date.
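In spirit, those buttons page through the distinct dates that have entries. A rough Python sketch of that logic (the function and action names are mine, not anything in FileMaker):

```python
def jump_to_date(log_dates, current, action):
    """Navigate the distinct dates that have log entries, the way the
    previous/next/first/last buttons page through a log book."""
    dates = sorted(set(log_dates))  # one "page" per distinct date
    i = dates.index(current)
    if action == "first":
        return dates[0]
    if action == "last":
        return dates[-1]
    if action == "previous":
        return dates[max(i - 1, 0)]               # stay on the first page
    if action == "next":
        return dates[min(i + 1, len(dates) - 1)]  # stay on the last page
    raise ValueError(action)
```

One consequence of paging over distinct dates is that days with no entries are skipped, which matches how a paper log book reads.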

Each set of logs for a given day should be sorted in the order they were entered, but I'm not sure how to handle the sorting and filtering of dates. I don't want the user to have to enter the current date 50 times a day, so it should obviously be auto-entered on creation. But what if the user wants to go back and add a log entry to a previous date? Then the auto-entered date would be set to the current date, and that entry would disappear from the date they are viewing (a date some time in the past) and reappear in the current date's set of log entries the next time the records are re-sorted or filtered via a find (or whatever other method ends up being used to show only one day's worth of logs at a time).

That's the major issue we have with the way it's currently structured. Without going into detail, the other issues relate to having to sort all records upon opening the DB, and slow performance. I'm open to any suggestions since I will be starting from scratch with the structure and importing the old records.


I don't quite see the issue here.

If you set the date and time fields to be auto-entered, but allow user to modify their values, does that solve your problem?

Would it work to use a portal to view log entries? Have a global date field hold the date of entries to view, then use that in a relationship to view the entries for that particular day. You could also sort the relationship by date and time fields.
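The matching and sorting that relationship would do can be sketched outside FileMaker like this (Python, with made-up sample rows; in FileMaker it's the global-date match field plus the relationship's sort order doing the work):

```python
from datetime import date, time

# Hypothetical log rows: (log_date, log_time, user, comment)
logs = [
    (date(2012, 5, 1), time(14, 5), "bob",   "Pump 3 stopped"),
    (date(2012, 5, 2), time(7, 50), "carol", "Filter changed"),
    (date(2012, 5, 1), time(8, 15), "alice", "Pump 3 started"),
]

def day_portal(logs, g_view_date):
    """Match rows against the global date field, then apply the
    relationship's sort order (here: by time of entry)."""
    matched = [row for row in logs if row[0] == g_view_date]
    return sorted(matched, key=lambda row: row[1])
```

Changing the global date then re-evaluates the match, which is exactly the "turn the page" behaviour.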


I don't quite see the issue here.

If you set the date and time fields to be auto-entered, but allow user to modify their values, does that solve your problem?

It does, unless the user forgets to adjust the date when making an entry for a previous date, which is easy to do if the field is auto-entered. Then the record "disappears" from that day's set of logs and ends up sorted into the current date's.

What I have done for now is use a script (triggered when the date field is saved). There's a global date field for the date of logs being viewed, and the auto-entered date for a log entry uses that instead of the current date. Changing the date of logs to view also changes the global. It works well, but since I'm restructuring the database I'm interested in hearing about alternate methods. I described the problem as it was when I started, not as it currently works in the database, because I'll be starting from scratch and would like to consider another approach. Although this part doesn't concern me as much as the sorting issue below.
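The rule that workaround implements, sketched as a tiny Python function (names are hypothetical, not FileMaker fields):

```python
from datetime import date

def default_entry_date(g_view_date, today):
    """Auto-enter rule used by the save-trigger approach: stamp a new
    log entry with the date currently being viewed, not today's date,
    so a back-dated entry stays on the page it was written on."""
    return g_view_date if g_view_date is not None else today
```

With this rule, an entry added while viewing last Tuesday lands on last Tuesday; the user only has to think about the date when it differs from the page on screen.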

Would it work to use a portal to view log entries? Have a global date field hold the date of entries to view, then use that in a relationship to view the entries for that particular day. You could also sort the relationship by date and time fields.

That's how I have it set up currently. It does work and does exactly what I need; the global field solves the date issue. But it is extremely slow to open. It was taking about 30-45 seconds to sort records before I archived a bunch of old records, and over wifi it took 10 minutes or more! I've got it down to about 5 seconds now by archiving all but the last two years of records (about 4,000 records instead of 30,000). But the problem will just come back in a few years and I'd like to find a better solution.


It does, unless the user forgets to adjust the date when making an entry on a previous date.

The only other suggestion I have is to not auto-enter the date, but set validation to require a value, then put two buttons next to the date field:

1) insert current date

2) insert date being viewed

...it is extremely slow to open. It was taking about 30-45 seconds to sort records before I archived a bunch of old records. And over wifi it took 10 minutes or more! I've got it down to about 5 seconds now by archiving all but the last two years of records (about 4000 records instead of 30,000). But the problem will just come back in a few years and I'd like to find a better solution.

I'm surprised to hear about your performance issues here.

Approx. how many records are there per day?

Is the date field specified as a date field (rather than text), and is it indexed?

Does your relationship have any match fields other than global date field = date of log entry?

Does the portal display any unstored calculation fields?

Does it work any quicker when the relationship isn't sorted? (I know you need it sorted, I'm just trying to determine which aspect of it takes so much time)

You're not using portal filtering, are you?

With the old data set (30,000 records), perform a find for a particular date - how fast is that? If it's faster than the portal, you may want to use a scripted perform find rather than the portal.


I was curious to see how quickly the portal would update, so I created a sample file with 30,000 records (100 records per day). When I change the global date field, the portal updates instantly.

The only other suggestion I have is to not auto-enter the date, but set validation to require a value, then put two buttons next to the date field:

1) insert current date

2) insert date being viewed

I'm surprised to hear about your performance issues here.

Approx. how many records are there per day?

Is the date field specified as a date field (rather than text), and is it indexed?

Does your relationship have any match fields other than global date field = date of log entry?

Does the portal display any unstored calculation fields?

Does it work any quicker when the relationship isn't sorted? (I know you need it sorted, I'm just trying to determine which aspect of it takes so much time)

You're not using portal filtering, are you?

With the old data set (30,000 records), perform a find for a particular date - how fast is that? If it's faster than the portal, you may want to use a scripted perform find rather than the portal.

I'll have to get back to you tomorrow since I don't know most of the answers offhand and I'm not at work. But to answer a few:

There are about 20 entries a day on average.

The global date is the only match field in the relationship used to filter the entries.

I don't believe there are any fields that are calculations.

I did test it without sorting once and it was able to change page (date) views almost instantly.

What exactly is portal filtering? I am using a portal to view one day's worth of logs, and that portal is on a TO that is related to the main table via the global date field. So the portal IS filtered, but I'm not sure if that answers your question.

Thanks for looking into this though, and for the sample file. I'll check it out in the morning.

I did have one thought, though. I'm going to try sorting only the one day's worth of logs via the sort feature in the portal properties, rather than sorting the whole database. I don't think it will matter as long as the portal is filtered to provide only that day's records, and then those records are sorted as a much smaller set.
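The win there is mostly about how much data gets sorted. A rough Python illustration of filter-then-sort versus sort-everything, using synthetic numbers (~20 entries/day, ~30,000 rows, like the unarchived DB):

```python
import random
from datetime import date, timedelta

# ~20 entries/day over ~4 years = 30,000 rows of (date, entry_order)
start = date(2008, 1, 1)
rows = [(start + timedelta(days=d), random.random())
        for d in range(1500) for _ in range(20)]

view = date(2010, 6, 1)

# Sort-everything: orders all 30,000 rows just to display one page
whole = sorted(rows)

# Filter-then-sort: isolate the viewed day first, then sort ~20 rows
page = sorted(r for r in rows if r[0] == view)
```

Both approaches produce the same page of records; the second only ever sorts the handful of rows the portal will actually show, so it shouldn't degrade as the table grows.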


The only other suggestion I have is to not auto-enter the date, but set validation to require a value, then put two buttons next to the date field:

1) insert current date

2) insert date being viewed

I'm surprised to hear about your performance issues here.

Approx. how many records are there per day?

Is the date field specified as a date field (rather than text), and is it indexed?

Does your relationship have any match fields other than global date field = date of log entry?

Does the portal display any unstored calculation fields?

Does it work any quicker when the relationship isn't sorted? (I know you need it sorted, I'm just trying to determine which aspect of it takes so much time)

You're not using portal filtering, are you?

With the old data set (30,000 records), perform a find for a particular date - how fast is that? If it's faster than the portal, you may want to use a scripted perform find rather than the portal.

OK, for the ones I couldn't answer last night:

The date is a date field, and is indexed.

There IS one calculation: the day of the week is displayed as a calculation field based on the date. I realize it should just be an auto-entered (one-time calculation) text or date field instead of a calculation field, but I never noticed it before.

And yes, I am still using FM10, so I guess I'm not using portal filtering. I just got in, so I haven't had a chance to look at the example file you sent yet.


I was curious to see how quickly the portal would update, so I created a sample file with 30,000 records (100 records per day). When I change the global date field, the portal updates instantly.

I put this on our server and it took about 6 seconds to re-sort records after updating the global field. Maybe since our DB has more indexed fields, more relationships, and calculated fields, this could be a normal response time. One weird thing I noticed: your DB didn't sort any slower over wifi, though.


The day of the week is displayed as a caculation field based on the date. I realize that it should just be an auto-entered (one time calculation) text or date field instead of a calculation field

Actually, a calculated field for day of week is fine as long as it's a stored calculation (the recalculate when needed option is not checked).
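For what it's worth, the derivation itself is trivial and depends only on the stored date, which is why a stored (non-recalculating) result is safe, e.g. the equivalent in Python:

```python
from datetime import date

def day_of_week(d):
    """Stored-calculation equivalent: derive the weekday name once
    from the entry date; the date never changes after entry, so the
    result never needs re-evaluation."""
    return d.strftime("%A")
```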

I just noticed that I forgot to add sorting to the relationship on that test file.

In the test file, go to the portalPerformance layout and perform a find for any single date (with the file hosted); how long does the find take?

