
Doug Gardner

Members
  • Content Count

    6
  • Joined

  • Last visited

Community Reputation

0 Neutral

About Doug Gardner

  • Rank
    member
  1. Thanks for the ideas, Dan. If I hit a snag, I might end up going the route of an OS-level script, but I'm trying to avoid it because processing the results would mean bringing in other technologies and external connections, and I think it's best to have the smallest possible set of dependencies: the simpler it is, the easier it is to troubleshoot, and the less it relies on being really clever.
  2. Hi Claus, I appreciate the response. Actually, the behaviour is a little different from what you describe: under a specific set of circumstances there is no prompt to select another route to the missing file. For the internal check (that is, the script running server-side, checking which of its files are open), as long as you're running the script with Set Error Capture [On], there is no pause and no dialogue when a file is not open or not found; the attempt to run the script in a missing file just returns error 100. For the external check (that is, the script running on t
  3. Hi Claus, That's very close to the solution I've arrived at, though without the XML part (because it's not an option). If you've got a minute, I'd appreciate any critical comment on the basic method outlined in the previous post. Thanks!
  4. I was thinking that the function could be rewritten so that when it's run from FMS, it observes the purview of the administrator group from which the scheduled script runs. That could be a non-trivial rewrite, though, because functions are designed to operate within the client and might not be able to reach into the server environment without significant work. Regarding the initial post... I agree that monitoring cannot be done with FMS if you want a positive result—something outside the target FMS must actively perform the checking, at least ultimately. Here's what I think I'm
  5. Hey Wim, thanks for responding. Yes, I agree that the challenge is collecting the data. Ideally, the data would somehow be gathered at a central location, a routine would check each set of files against the canonical list for each location, and the results would be summarized for each server, possibly in an email. So, once or twice a day the admin would get a simple report. One thing that I just remembered, though, is that I have seen cases where an FM Server reports that a file is open, its status is "Normal", and the file isn't really available on the network. I can't remember where
  6. I'm looking for the best way to test that files are open and "normal" on a number of FM Server 13 Windows deployments that a client has scattered all over the place. Note that this is testing for a positive result, not looking for a negative one (such as a file not being open because of a problem). A couple of approaches come to mind: write an OS-level script that gets the filenames and statuses from fmsadmin at the command line, sends them to a text file, and processes that appropriately. Another way to do it is to have an FM script do something like check that each file is open on the ho
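The fmsadmin route from the posts above can be sketched as a small shell script: compare the files a server reports against a canonical list, and flag anything missing or not "Normal". The fmsadmin invocation and its output format (one "filename status" pair per line) are assumptions here, so check them against your own FMS 13 installation; the sample listing is canned so the sketch is self-contained.

```shell
#!/bin/sh
# Sketch of the OS-level check described above. check_files() takes a
# canonical list of filenames and a "filename status" listing, and prints
# only the files that are missing or whose status is not "Normal".

check_files() {
    # $1: canonical list of filenames (one per line)
    # $2: server listing as "filename status" lines
    expected="$1"
    listing="$2"
    for f in $expected; do
        # Look up this file's status in the listing (assumed format).
        status=$(printf '%s\n' "$listing" | awk -v f="$f" '$1 == f {print $2}')
        if [ -z "$status" ]; then
            printf '%s: MISSING\n' "$f"
        elif [ "$status" != "Normal" ]; then
            printf '%s: %s\n' "$f" "$status"
        fi
    done
}

# In practice the listing would come from the command line, e.g. something
# like: listing=$(fmsadmin list files -s -u admin -p pass)
# (flags are a guess -- verify against your fmsadmin version). Here we use
# a canned sample instead.
canonical="Invoices.fmp12
Contacts.fmp12
Archive.fmp12"

sample="Invoices.fmp12 Normal
Contacts.fmp12 Paused"

check_files "$canonical" "$sample"
```

An empty report means every canonical file is open and Normal, which keeps the once-or-twice-a-day email trivially short: only problems get lines.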
