_ian

Members
  • Content count: 55
  • Joined
  • Last visited
  • Days Won: 1

_ian last won the day on October 6 2016

_ian had the most liked content!

Community Reputation: 4 Neutral

About _ian
  • Rank: member

Profile Information
  • Title: Software Architect
  • Gender: Not Telling
  1. Given what you want to do, you might consider looking at MirrorSync by 360Works. It's a synchronisation product that lets you sync record creations, deletions and updates between two databases. In your scenario you could configure it to sync record changes from the MySQL tables into your FileMaker system. We use it to read from our FileMaker system into a data warehouse in SQL Server.
  2. I'm using Let(_path = Substitute(Get(DocumentsPath);"/";"\\") & "Manifest_Detail_Imports\\"; Right(_path;Length(_path)-1) ) and then BE_ListFilesInFolder ( $_import_directory;"csv" ) to list the contents of \Documents\Manifest_Detail_Imports on my server. Doing it server-side is massively faster and less troublesome than using a client to run imports.
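For anyone following along outside FileMaker, the same path manipulation can be sketched in Python (the folder name comes from the calc above; the helper names are mine, and BE_ListFilesInFolder is stood in for with a plain directory listing):

```python
import os

def import_directory(documents_path: str, subfolder: str = "Manifest_Detail_Imports") -> str:
    # Mirror the Let() calc: swap "/" for "\", append the subfolder,
    # then drop the leading slash that Get(DocumentsPath) returns.
    path = documents_path.replace("/", "\\") + subfolder + "\\"
    return path[1:]

def list_csv_files(directory: str) -> list:
    # Stand-in for BE_ListFilesInFolder(directory; "csv"): csv files only.
    return sorted(f for f in os.listdir(directory) if f.lower().endswith(".csv"))
```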
  3. Feature request

    Thanks Jesse. That sounds like it will sort that issue very nicely. No need to post that version. I'm off to Berlin for dotFMP at silly o'clock tomorrow morning and have a load of other stuff queued up for when I'm back. I'll just download and install when you do the next release. I've deleted all the old log files, so not in a massive rush.
  4. WebDirect stability

    I agree with Brent, we've been running WebDirect for about a year now and it's been absolutely rock solid. Slow, but rock solid. If it was my files sitting on a server that was being restarted every day, I'd be asking some questions of the hosting company.
  5. Feature request

    and... while I'm on the topic of features... I've just investigated last night's sync failure. I'm pretty sure it's down to running out of disk space. A way to restrict the maximum size of the log files would be nice. Or, what might be simpler: provide a sync warning (maybe hourly) when disk space is getting low. Do you prefer receiving feature requests directly via email or through this forum? thanks
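The hourly check I have in mind could be sketched like this (the threshold, wording and function name are my assumptions, not MirrorSync behaviour):

```python
import shutil
from typing import Optional

def low_disk_warning(path: str = "/", min_free_bytes: int = 5 * 1024**3) -> Optional[str]:
    # Hypothetical hourly check: warn when free space on the volume holding
    # the sync logs drops below a configurable threshold (default 5 GiB).
    free = shutil.disk_usage(path).free
    if free < min_free_bytes:
        return f"Warning: only {free / 1024**3:.1f} GiB free on {path}"
    return None
```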
  6. Feature request

    Hi Jesse It happened again this morning. Here's a screenshot. While I'm at it, another feature suggestion: perhaps the ability to set a user-defined limit on the number of error emails? I had a sync problem at around 0457 this morning. By the time I got into the office I found close to 100 emails letting me know there was a problem. If I could limit it, it would still cover the situation of a temporary glitch which resolves itself (e.g. the error message if someone is deleting while the sync happens) but also let me know when it's a more major issue (e.g. this morning's "Last sync failed: java.sql.SQLException: Connections could not be acquired from the underlying database!"). thanks ian
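The cap I'm imagining would behave something like this sketch (class name and defaults are made up, not an existing MirrorSync setting):

```python
import time
from typing import List, Optional

class ErrorEmailLimiter:
    """Hypothetical cap on error notifications: allow at most `max_emails`
    per rolling `window_seconds`, so a transient glitch sends a handful of
    emails while a sustained failure still gets through once per window."""

    def __init__(self, max_emails: int = 3, window_seconds: int = 3600):
        self.max_emails = max_emails
        self.window = window_seconds
        self.sent: List[float] = []

    def should_send(self, now: Optional[float] = None) -> bool:
        now = time.time() if now is None else now
        # Drop send timestamps that have aged out of the rolling window.
        self.sent = [t for t in self.sent if now - t < self.window]
        if len(self.sent) < self.max_emails:
            self.sent.append(now)
            return True
        return False
```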
  7. Feature request

    I'll send you an example the next time this happens.
  8. Hi guys Sync is running beautifully, really fast. We're able to sync every 120 seconds, so our Financial Data Warehouse is effectively real-time. Now it's been running a while I do have one suggestion that I would find helpful. When I have data type conversion errors - typically converting string to decimal - it would be really useful if the error message in the MirrorSync app window told me which table and primary key were involved. As it stands, when these errors happen I know that there's a data type issue, but I need to dig through the log files to find which table and record. I'm tightening up data types throughout the system, but it's not a common problem - once every few weeks is typical. I suppose the other alternative would be for me to write an import format so Logstash and Kibana can deal with it. Then I could just take advantage of all that ElasticSearch goodness. But I don't really have the time...
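What I'm after could look roughly like this (table and field names are invented for illustration):

```python
from decimal import Decimal, InvalidOperation

def convert_rows(table: str, rows: dict, field: str):
    """Convert a string field to Decimal, collecting error messages that
    name the table and primary key instead of just the bad value
    (hypothetical helper; not how MirrorSync reports errors today)."""
    converted, errors = {}, []
    for pk, record in rows.items():
        try:
            converted[pk] = Decimal(record[field])
        except InvalidOperation:
            errors.append(f"{table}: primary key {pk}: "
                          f"cannot convert {record[field]!r} to decimal")
    return converted, errors
```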
  9. Integration with Dropbox is definitely possible using their APIs. I've done it using ScriptMaster, but am not at liberty to share the code. Google Drive would also be possible, though I've not done it.
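As a rough, network-free illustration of the Dropbox v2 upload endpoint (the token and path are placeholders, and actually sending the request is left to whatever HTTP mechanism you prefer, e.g. a ScriptMaster function):

```python
import json

def dropbox_upload_request(token: str, dropbox_path: str, data: bytes):
    # Build (url, headers, body) for a Dropbox API v2 file upload.
    # The bearer token is an assumption; nothing is sent here.
    url = "https://content.dropboxapi.com/2/files/upload"
    headers = {
        "Authorization": f"Bearer {token}",
        "Dropbox-API-Arg": json.dumps({"path": dropbox_path, "mode": "add"}),
        "Content-Type": "application/octet-stream",
    }
    return url, headers, data
```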
  10. I've not tried Selenium with a ScriptMaster function so I don't have enough experience to form an opinion. By coincidence I was doing a fair bit with PhantomJS last week, though with PHP not FileMaker.
  11. I'm curious why you're using Selenium here. If you use the PhantomJS sample script rasterize.js then you can call PhantomJS from the shell and it will work nicely. I've just tested with the shell script example in ScriptMaster and it works correctly. At least on OS X - I don't see why there'd be a problem on Windows, though.
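Driving it from code is just a matter of building the same command line; a minimal sketch (it assumes a phantomjs binary and rasterize.js are reachable from the working directory):

```python
import subprocess

def rasterize_command(url: str, output_file: str, phantomjs: str = "phantomjs") -> list:
    # Command line for PhantomJS's bundled rasterize.js example, which loads
    # `url` and writes the rendered page to `output_file` (png, pdf, ...).
    return [phantomjs, "rasterize.js", url, output_file]

def rasterize(url: str, output_file: str) -> None:
    # Requires phantomjs and rasterize.js to be installed (assumption).
    subprocess.run(rasterize_command(url, output_file), check=True)
```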
  12. Well, Steven and Wim have already explained, but here goes... The built-in FileMaker security system is the most secure way to control who gets access to data within a FileMaker database. Home-made security systems sit on top of that layer and are predicated on the user having access to certain data that controls what they can see. They usually rely on scripting and on trying to prevent access by not displaying layouts - two approaches that are doomed to fail. They're almost always vulnerable to an attacker altering the data that controls what is visible. Over the years I've been asked to look at several of these systems and they were all vulnerable. In most cases a few minutes with the Data Viewer and the design functions gives you everything you need to bypass the system.
  13. That is not true. While the attack vector may be the same, one approach has poor defences and the other has relatively strong defences. The probability of success is not the same.
  14. Has anyone used Nagios to monitor the FileMaker xDBC service? We've got it set up and monitoring the basic FileMaker service over 5003, the XML publishing and the web publishing. Trivially easy to set up and it runs really nicely. However, we're running our Nagios on a CentOS server, so there's no ODBC driver for it, so I suppose we're going to need to use a JDBC connection to monitor the xDBC service. Naturally it's the service we're most interested in monitoring, as it seems to be the least reliable of the FileMaker services and tends to fall over once in a while - though it's infinitely better than it was back in the Server 11 world. So has anyone set that up before? Or is there a better way? thanks ian
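In the meantime, a plain TCP probe of the xDBC listener (port 2399) works as a crude Nagios-style check. A JDBC-level query would be the real test; this sketch only proves the port is open:

```python
import socket

# Nagios plugin exit codes.
OK, CRITICAL = 0, 2

def check_xdbc(host: str, port: int = 2399, timeout: float = 5.0) -> int:
    # FileMaker's xDBC listener sits on TCP 2399; a refused or timed-out
    # connection is reported as CRITICAL. This does not confirm the
    # listener can actually serve queries.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return OK
    except OSError:
        return CRITICAL
```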
  15. ah, gotcha, that makes sense. cheers ian