naio

Members
  • Content Count

    66
  • Joined

  • Last visited

  • Days Won

    1

naio last won the day on March 2

naio had the most liked content!

Community Reputation

1 Neutral

About naio

  • Rank
    always learning

FileMaker Experience

  • Skill Level
    Intermediate
  • FM Application
    19

Platform Environment

  • OS Platform
    Mac
  • OS Version
    Catalina

FileMaker Partner

  • Certification
    Not Certified

  1. Again, another picky API: https://developers.sendinblue.com/reference#updatecontact With this PUT command, I can't find a way to send the --data parameter to the server. These are the cURL options I'm using: --request PUT \ --header \"accept: application/json\" \ --header \"api-key: " & $$my_api_key & "\" \ --header \"content-type: application/json\" \ --data @$parametre I've tried setting the $parametre variable to: JSONSetElement ( "{}" ; "emailBlacklisted" ; True ; JSONBoolean ) JSONFormatElements ( "{\"emailBlacklisted\":true}" ) an
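One way to sanity-check what --data should carry, outside FileMaker: the endpoint expects a JSON object with a real boolean, not the string "True" or the number 1. A minimal sketch in Python (the field name comes from the API docs above; everything else is illustrative):

```python
import json

# The body the "update contact" endpoint expects: a JSON object
# whose emailBlacklisted member is a JSON boolean (true), not a
# quoted string or a number.
body = json.dumps({"emailBlacklisted": True})
print(body)  # {"emailBlacklisted": true}
```

If the data FileMaker sends doesn't match this shape exactly, a strict parser on the server side will reject it.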
  2. I have a master table that needs to show the total value of many fields in a related child table. The total must be shown (not necessarily stored) on the master table layout. I wonder which is the best option: create a non-stored calculation field in the master table with Sum ( child::value ), or just display a summary field from the child table with the total of child::value. Both options seem to work, but I would like to know if there's any difference in performance. Is option 2 better than option 1?
  3. I didn't mean importing into another file, but importing between tables in the same file, so shortening the export/import process. Exporting to a temporary folder and importing afterwards is fine, at least in CSV format.
  4. Although I haven't read it in any documentation, it seems that FileMaker Server cannot handle self-imports, so I had to export to and import from a temporary file.
  5. I have a file with 2 tables: updates on table A are imported into table B using the update matching records option, based on a common id. I made the script and tested it without problems. Then I uploaded the file and scheduled the script to run every x minutes. The script executes with full privileges. The scheduled execution seems to skip the import update, so table B never gets updated. The most surprising thing is that the script works as expected when executed manually, but it doesn't do the import update when run as a schedule. I can't see any error code. Is this normal? I'm u
  6. I've tried with my unformatted JSON data (as in my first post), but now excluding the single quotes around it, and it doesn't work; maybe the API's parser just isn't lenient, I can't tell. For debugging this I used the very useful tool offered by Mike Duncan here: https://www.soliantconsulting.com/blog/filemaker-rest/ and I can see FM is not sending the unformatted data.
  7. I've found the problem: the JSON data must be formatted with JSONFormatElements, so the cURL options in FM must be: curl -v -u <user>:<password> --header "Content-Type: application/json" --data @$json_parameters -X POST where $json_parameters is an FM variable that stores the result of JSONFormatElements.
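For anyone debugging a similar case: pretty-printed and compact JSON are semantically identical, so a server that accepts one and not the other is really objecting to how the bytes arrive, not to the JSON itself. A quick illustration with two equivalent encodings of the same ticket payload:

```python
import json

payload = {"type": "typetest", "priority": 1}

# Compact form (no whitespace) vs formatted form (indented)
compact = json.dumps(payload, separators=(",", ":"))
pretty = json.dumps(payload, indent=4)

# Both decode to exactly the same object
assert json.loads(compact) == json.loads(pretty)
print(compact)  # {"type":"typetest","priority":1}
```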
  8. I'm trying to call an API using Insert from URL with cURL options; this is what I have in the cURL options: curl -v -u <user>:<password> --header "Content-Type: application/json" -d '{"type": "typetest", "priority": 1, "description": "desctest", "subject": "subjecttest", "email": "my@email.com", "email_config_id":67 }' -X POST 'https://subdomain.freshdesk.com/api/v2/tickets/outbound_email' as described in the API documentation: https://developer.freshdesk.com/api/#create_outbound_email The command works well on the command line but not when executed through Inser
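To separate "the request is wrong" from "FileMaker sends it differently", it can help to rebuild the same request outside both curl and FM. A sketch with Python's stdlib (the subdomain, credentials and email_config_id are placeholders from the post, not real values; the request is constructed but deliberately not sent):

```python
import json
import urllib.request

# Placeholder endpoint from the Freshdesk API docs
url = "https://subdomain.freshdesk.com/api/v2/tickets/outbound_email"

ticket = {
    "type": "typetest",
    "priority": 1,
    "description": "desctest",
    "subject": "subjecttest",
    "email": "my@email.com",
    "email_config_id": 67,
}

# Same shape as the curl call: JSON body, content-type header, POST
req = urllib.request.Request(
    url,
    data=json.dumps(ticket).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would actually send it (with auth added)
print(req.get_method())  # POST
```

If this succeeds where Insert from URL fails, the difference is almost certainly in how FM quotes or encodes the -d payload.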
  9. I've found the bug: the JavaScript code started with a single-line comment (//). When the code was made into a calculation, line feeds were removed, and because of the double slash all of the JS code was read as one comment; this is why it wasn't executed.
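The failure mode is easy to reproduce with plain strings (simulated here in Python rather than a JS engine): once line feeds collapse, everything after // belongs to the comment, while /* ... */ block comments survive the collapse.

```python
# Simulate FileMaker stripping line feeds from a calculation.
js = "// set up the button\nalert('clicked');"
collapsed = js.replace("\n", " ")
# collapsed == "// set up the button alert('clicked');"
# The whole line is now one comment; alert() would never run.

# Block comments keep their closing delimiter, so the code survives:
safe = "/* set up the button */\nalert('clicked');"
print(safe.replace("\n", " "))  # /* set up the button */ alert('clicked');
```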
  10. I want to dynamically create a set of buttons to filter a portal, so that when a button is clicked an FM script is executed. Here you can see the pen: https://codepen.io/naio/pen/NWxJExP, where alert() is called; in FM I call the FM script with FileMaker.PerformScript() and pass the inner content as a parameter. When I try this with static code in the web viewer it runs normally, but when I substitute the set of buttons with a List ( option_as_html_button ) calculation (the 'dynamic' part), FileMaker.PerformScript() doesn't work, even though the HTML code is rendered correctly and there seems to be no e
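A common culprit with generated onclick handlers is unescaped quoting: one stray quote in a button label silently breaks the attribute. A hypothetical sketch of generating such a button with the quoting handled (option_as_html_button and the FilterPortal script name mirror the post; FileMaker.PerformScript is assumed to be injected by the web viewer):

```python
import html
import json

def option_as_html_button(label):
    """Build one filter button whose onclick calls a FM script."""
    # json.dumps produces a safely quoted JS string literal
    param = json.dumps(label)
    onclick = html.escape(
        "FileMaker.PerformScript('FilterPortal', %s)" % param, quote=True
    )
    return '<button onclick="%s">%s</button>' % (onclick, html.escape(label))

print(option_as_html_button("Open"))
```

The same escaping logic can be replicated in the FM calculation that feeds List(); comparing the static HTML that works against the generated HTML character by character usually exposes the difference.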
  11. So I discarded separating container files from the solution, and also keeping many full copies of the solution folder. Just in case someone is interested: syncing the whole FMS Backups folder, with many copies of the solution, to S3 or Dropbox using the NAS sync utility proved impossible. The software disregarded hard links, and syncing >1M files seemed too much to handle; the app crashed after every attempt. So I sync only the latest backup to the NAS via rsync (that's the only backup I keep locally) and then use a backup utility to store it in S3. The backup utility takes adv
  12. Indeed I use rsync with the NAS, but with the -a option, which I think neglects hard links; I should test it without it. My NAS has an app to sync with S3, but I'm not sure it would preserve hard links; actually I use Dropbox, which I don't think does either. I've never used the S3 CLI and I'm not sure if it's possible from my device. How can I tell if a file uses hard links or if it's fully saved? The file size is not an indicator.
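On the detection question: the file system tracks this as the link count, which os.stat exposes as st_nlink. A count above 1 means other directory entries share the same data blocks, which is why the reported size tells you nothing. (For rsync itself, -a alone does not preserve hard links; the separate -H flag does.) A minimal sketch:

```python
import os
import tempfile

# st_nlink is the number of directory entries pointing at the data:
# 1 = a normal standalone file, >1 = hard-linked.
with tempfile.TemporaryDirectory() as d:
    original = os.path.join(d, "backup.fmp12")  # hypothetical file name
    open(original, "w").close()
    print(os.stat(original).st_nlink)  # 1

    os.link(original, os.path.join(d, "backup_copy.fmp12"))
    print(os.stat(original).st_nlink)  # 2
```

On macOS, `stat -f %l <file>` or `ls -l` (second column) shows the same count from the shell.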
  13. So I understand that hard links are only useful in the first instance of the backup, that is, within the server's Backups folder. In my system I first copy from the server to a local NAS and then sync with Dropbox, so the files take their full disk space (and transfer time) as soon as they leave the host. I'll follow your advice and keep container data managed by FM. I'm a bit lost about the solution you suggest, using S3 to create a new backup set with only the new PDF files; if you can shed some light on it I'd appreciate it. Thanks in any case.
  14. You are right regarding server resources, but my concern is maintaining the integrity of those hard links across the backup chain: first to a local server and then to the cloud. I'll need to do some testing before trusting it completely. Thanks for your help.
  15. My solution uses a lot of PDF files stored externally in container fields, in open (non-secure) storage but managed by FM. The stored files never change their content; it's just their number that increases day by day. I have more than 100,000 of them, about 12 GB. The automatic backup option in FMS copies the whole Database folder where my PDFs are, which makes backups slow, less secure and space-demanding: 12 GB per copy. I'm thinking of storing the container data in an external folder and converting the container into a calculated field. I know this requires extra control over file and folder locations, b