By Bailey Kessing
I am running a server script periodically (every hour, let's say) which does some database maintenance, uploads some files, etc. I do this on a number of FileMaker Servers (latest server version running on Windows machines) without problems…except for one server. On this server I use ScriptMaster to upload some large files into containers as they are generated from scientific instruments. The problem is that the server seems to run out of RAM after a "few" days. I think this is the only thing that this server does differently from the other 4 servers, so I feel this is the problem. My question is…is there a way to "flush" the memory used by the plugin or by the FM server? Has anyone else experienced something similar, or am I barking up the wrong tree? Thanks.
I suspect this has been answered before but my searches aren't finding what I need.
I have two databases and I'm linking fields from DB2 to the interface in DB1. I have external authentication set up. When a user connects to DB1, they're asked to authenticate to DB2. I don't want them to have access to DB2, and I don't want them to see DB2 at all, as would happen if I gave them even read-only access.
I've set up the External Data Source between the two FileMaker databases. What might I be missing? I should be able to display data from other databases without giving the user read access to all of them.
I hope you understand what I'm trying to say and thank you for any help.
Where's my Boot-Camp,
Hi, I've purchased a CloudMail license. I'm going to use it in an environment with 1 FileMaker Server v14 and 12 FileMaker Pro v14 clients.
The database is already on the server, and the plugin is activated there, but I'm not able to use CloudMail functions from a client that doesn't have the plugin installed.
How can I register the license and use the database from a client?
By Gianluca D'Aquino
I'm investigating 2 issues I have syncing 2 DBs, one on the server and one on the iPad, which are set to use a spoke DB in the middle tier.
Basically, the sync is set to copy ALL the data from some tables on the server to the local DB, and to send some data from the local DB back to the server DB. For some reason, on a large table called Articles, with more than 130k records, the sync does not copy all the records for some users. There is no per-user 'filter' on the server DB, and I ran a pre-populate sync on the spoke DB that copied all 130k records. When the sync is run for that specific user, the final result on the iPad is that the sync DELETES 12k records for no apparent reason.
In addition, there is a Global table that always has 1 record and is pulled from the server to the client. But I have sometimes found several records (more than 10) in that table on the server, even though the sync is set to write to the spoke and not vice versa for that table. I suspect they are being written by the sync. Is that possible?
Thanks in advance,