Wim Decorte

Wim Decorte last won the day on March 21

Wim Decorte had the most liked content!

Community Reputation

493 Excellent


About Wim Decorte

  • Birthday 12/17/1968

Profile Information

  • Title
    Sr. Technical Architect

FileMaker Partner

  • Membership
    FileMaker Platinum Member


  1. Hi Colin, to renew an SSL cert you do NOT need to generate the certificate request again; normally your vendor will just issue a new cert, and you use that plus the original serverKey.pem file to import the new cert. What you are doing is called 'rekeying' a cert, which is different from renewing. The end result is the same, except that you're giving yourself more work. The name on your FM license does not matter at all: FM does not do any kind of check except to verify that you have the proper passkey that was set when generating the signing request. Name ownership is validated by the SSL vendor, not by FM; FM doesn't care what name you put on the SSL cert. Since you had a cert already, did you try the fmsadmin command line to delete the previous cert before importing the new one?
  2. Use the fastest drive, since this will potentially generate a fair bit of disk reads and writes. As to the ideal size: we cannot answer that, since there are a million variables at play; it heavily depends on the nature and design of your solution and anything in it that requires server-side memory (schedules, PSoS, WebD, Data API, ...). Also carefully consider how high to set the FMS cache: don't set it too high or you'll crowd out what the OS can use, and that will cause more frequent swap-outs of its memory to disk. Use the available monitoring tools: set the FMS cache to 1GB and check the FMS stats.log, and use Windows Perfmon (or an extended monitoring tool like Zabbix) to figure out how much memory the system is using and how it uses the swap files. Until you have a clear picture, I wouldn't try to set a fixed swap file, or if you do, then definitely check it almost constantly for the first week or so to make sure you have it right.
  3. I don't think it would be faster by default. Also keep in mind that the Data API script engine is different from the PSoS script engine; each supports a different subset of script steps. So don't assume that they can both do what is possible client-side.
  4. The FMS18 dedicated GET call to run a script won't help you here: since a GET puts everything in a URL, you are limited by how long a URL can get. I would continue to do what you do, but with a twist: don't use a FIND request, but do a CREATE request to a scratch table, and add a script to that call that will read from your scratch table and create the proper records where they belong (or punt that part to a server-side script if it doesn't have to be real-time). Letting a server-side script take care of creating the actual records may get around your timing issue; the Data API call would be fast since it wouldn't have to wait for the script to run. If it has to be real-time, then find out where the delay comes from by timing other Data API calls, to see whether all the delay is in running the script. If it is not, and just the calls themselves are slow, then you need to look at internet speeds and latency between you and the server. If the script is slow, then that's an FM optimization task.
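A minimal sketch of what that CREATE-with-script payload could look like. The "fieldData", "script", and "script.param" keys are the documented Data API ones; the layout field name "RawData" and the script name "ProcessScratchRecord" are placeholders, and note that "script.param" has to be a string:

```python
import json

# Hypothetical body for POST .../layouts/Scratch/records on a scratch table.
payload = {
    "fieldData": {
        # the bulk data goes into a text field as one JSON string
        "RawData": json.dumps({"orders": [101, 102, 103]})
    },
    "script": "ProcessScratchRecord",                 # runs after the record is created
    "script.param": json.dumps({"mode": "deferred"})  # must be a string, not an object
}

body = json.dumps(payload)
print(body)
```

The server-side script can then read RawData back out of the scratch record and create the real records, so the client call returns as soon as the single record is created.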
  5. Pretty sure that the BE plugin adds it automatically, same as Postman does...
  6. And a blog post that provides some background into modern auth and the various resources we've been producing: https://www.soliantconsulting.com/blog/onelogin-filemaker-authentication/
  7. There are many SSL shops around; as long as you pick one from the supported list of issuers you'll be fine. My situation is not typical, since I usually deal with multi-server deployments, so I almost always work with wildcard certs or SAN certs; I typically use GoDaddy for those.
  8. I would think that the mention of POST is just an error in their documentation. As to your headers: you will have to add a Content-Length one, apparently. You also don't have to send a User-Agent, so you can skip that.
  9. As mentioned in the other thread: there are very few motions involved in installing a cert renewal, so if you think there are, let us know what your understanding of it is so that we can help you. As to 18, a couple of thoughts:
     • FMS17 will be EOL in September and the Admin API has already expired; for that reason alone I would upgrade.
     • Make sure to turn off the data restoration feature in FMS18; there are a fair number of reports of it causing crashes. There is no setting in the admin console for it, it's CLI or Admin API.
     • Follow the proper uninstall, rename the left-over folder, reboot, install procedure, otherwise you may run into issues.
     • If you keep the FQDN of your server the same, you don't have to go through the whole CSR/cert issue; just re-use the serverKey.pem and the cert already issued by your vendor.
  10. There are many cheaper certificates that are supported; you can get them for around $10. It also shouldn't take hours, so do let us know what parts you are struggling with, especially around renewals. The vendor just gives you a new cert: you issue an fmsadmin certificate delete on your server and use the admin console UI to import the new cert using the old serverKey.pem. I just did two of them yesterday; it took me less than 10 minutes to do both. A common misunderstanding is that people think you need to start the whole process again when your cert expires: generate a CSR, rekey the cert. You don't. Just save the serverKey.pem from your original CSR generation and import the new cert that the vendor automatically gives you. As long as you don't change the name on the cert, you don't need a new CSR.
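A sketch of that renewal from the command line, for those who prefer it over the admin console UI. Paths and the admin credentials are placeholders, and flag names should be verified against your version with `fmsadmin certificate --help`:

```shell
# Remove the expiring certificate from FileMaker Server
fmsadmin certificate delete -u admin -p adminpass

# Import the renewed cert from your vendor, re-using the original private key
# (the serverKey.pem saved from the first CSR generation)
fmsadmin certificate import /path/to/renewed_cert.pem \
    --keyfile /path/to/serverKey.pem \
    --intermediateCA /path/to/intermediate.pem \
    -u admin -p adminpass

# Restart the FileMaker Server service so the new certificate is picked up
```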
  11. I'm liking this idea!! InfluxDB has a great REST API, so you can look into that one. FM is very good at working with these kinds of APIs, so sending data from FM to InfluxDB could be done this way without involving exports and FTPs, or considering ODBC. No moving pieces or drivers, just the two systems talking to each other directly. https://docs.influxdata.com/influxdb/v1.7/tools/api/
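To make that concrete, here is a small sketch of the InfluxDB 1.x line protocol that you would POST to the /write?db=... endpoint (from FM, via Insert from URL with cURL options). The measurement, tag, and field names are made up for illustration:

```python
# Build one line of InfluxDB 1.x line protocol:
#   measurement,tag1=v1 field1=v1,field2=v2
measurement = "solution_stats"
tags = {"server": "fms01"}
fields = {"open_files": 12, "cache_hit_pct": 98.5}

tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
# integers get an "i" suffix in line protocol; floats are bare
field_str = ",".join(
    f"{k}={v}i" if isinstance(v, int) else f"{k}={v}"
    for k, v in sorted(fields.items())
)
line = f"{measurement},{tag_str} {field_str}"
print(line)  # solution_stats,server=fms01 cache_hit_pct=98.5,open_files=12i
```

That single string is the entire request body; no driver, no export file, just an HTTP POST from FM to InfluxDB.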
  12. There's a 17 collection out there too; a few things were changed in the 18 Data API vs. the 17 one. You're trying to send a JSON object to FM, but FM expects a string. So you need to stringify your JSON object before you add it to the "script.param" key. Try it with just a dummy string to verify that it works, and then you'll need to find a way to turn your JSON into a string.
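A tiny sketch of the difference, using Python's json module as a stand-in for whatever client you're using; "MyScript" and the param contents are placeholders:

```python
import json

inner = {"customerID": 42, "action": "sync"}

# Wrong: nesting the object directly makes "script.param" a JSON object,
# which the Data API rejects -- it expects a string.
bad = {"script": "MyScript", "script.param": inner}

# Right: stringify the object first, so the value is a JSON-encoded string
# that the FM script can parse back out with the native JSON functions.
good = {"script": "MyScript", "script.param": json.dumps(inner)}

print(type(bad["script.param"]).__name__)   # dict
print(type(good["script.param"]).__name__)  # str
```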
  13. Since FM16, FM is really good at integrating with REST APIs (or anything you can touch with cURL, really). "Insert from URL" supports a great many of the cURL options, plus there are native JSON functions. If you are somewhat familiar with REST APIs you'll enjoy it.
  14. I take it that you're using 17 then? The script endpoint is new in 18; I don't think 17 had it. But in 17 you can tack the execution of a script onto some of the other calls, like the find. I have released Postman collections for both 17 and 18; do a quick Google search and they should turn up. If not, post back here and I'll dig them out. FMS18's script endpoint is GET-only, and I'm confident that we'll get a POST one at some point in the future, since Claris is aware of this request. Not sure if it will be in 19 though, so don't hold out for that. After 19 is released, Claris has indicated an accelerated version release cadence, so hopefully we won't have to wait too long if it turns out not to be in 19.
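For 17, the sketch below shows the shape of a find payload with a script tacked on. The "query", "script", and "script.param" keys are the documented Data API ones; the field name, search value, and script name are placeholders:

```python
import json

# Hypothetical body for POST .../layouts/<layout>/_find on a v17 server,
# which runs a script as part of the find since 17 has no script endpoint.
find_payload = {
    "query": [{"Status": "Open"}],       # the actual find request
    "script": "AfterFindScript",         # runs after the find completes
    "script.param": "optional string",   # must be a string
}

print(json.dumps(find_payload))
```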
  15. There is very little information about it, and you've identified the big gap between the Top Call Stats log and what we know as developers about our solution. We don't know how actions (script steps, calc executions, manual finds, ...) translate into the calls as reported in the log. Furthermore, the log only records the 25 most expensive calls in the logging timeframe, so you do not have a complete picture of a user's actions anyway. But it's the best we've got, and often you can surmise what an action does, but not always. One of the things you can do is host the solution on a dev server, log on as the only user, enable the log, and then run some of the routines that are reported as slow. That will paint a better picture of how those actions translate into calls.
