Highest Reputation Content


#386923 enter into valuelist

Posted by LaRetta on 16 April 2012 - 07:25 PM

Ya know, when you rate people with negatives, they will stop assisting you. Wouldn't you in their place? Why should we risk responding just to get slapped for it if you simply don't care for our response? You are judging whether our answers are right or wrong and you know very little about FileMaker. Would you let a podiatrist take out your gall bladder? No - you listen to those that know what you want to know.

Soon, at this rate, you will have nobody left to help you. By the way, I am not responding ... I'm afraid to. :cry:

Also, don't tell people to read the post again ... that is insulting. People here READ the posts. Your posts have been almost impossible to understand at all. That is not our fault. It is yours.
  • 6


#387207 interface check

Posted by BruceR on 19 April 2012 - 07:38 AM

Don't touch FileMaker again. Hire somebody who knows what they are doing.
  • 5


#426995 Grey background in layout mode

Posted by Raybaudi on 23 August 2014 - 11:05 AM

Do you have an idea how I can change to the normal view?


Your "Major Grid Spacing" is set to 1 point.
You need to change it to something like 10 or more.
Or you can uncheck the box: View >> Grid >> Show Grid
  • 4


#423790 Display theme in Manage Layouts

Posted by LaRetta on 19 June 2014 - 07:05 PM

I know this is lower priority than some of the other wishes but I can't help but just mention it ...

 

If we go to Manage > Themes, we see a list of themes loaded in this file and it also indicates how many layouts are assigned to that theme.  But it would be nice if we knew which layouts were assigned to a theme and even had the ability to ... yeah I know ... probably never ... but to GTRR (so-to-speak) to those layouts or see a list of them at least.  Manage > Layouts certainly has room to list its theme along with the table and menu set.  If nothing else, this would make life easier for us.

 

When a solution has 150 layouts, it can take quite a bit of time to go through them all just to find the few layouts assigned to a theme.  And each theme (even an unused one) loads all of its definitions and takes up quite a bit of space.

 

For instance, a recent example: a file had 13 themes but only 3 were used.  By converting them all to a single theme, I saved 1.7 MB right off the top, and it took 15 minutes.  So a way to quickly jump to layouts according to their theme would be helpful.

 

Or ... am I missing something here?  Are there ways of seeing a list of layout names per theme?

 

Thanks everyone for listening and hopefully providing a solution of which I was unaware.  Otherwise I'll provide FileMaker feedback about it.   :laugh2:

 

Also, there is a Custom Themes forum for FM12 but there isn't one for FM13.  It might be nice to remove the 'FM12' portion from the forum title so we can put FM13 questions there as well ... or ... create another forum for Custom Themes in FM13.  I prefer the former.  

 

Thank you!!


  • 4


#414575 Script Management Best Practice

Posted by jbante on 02 December 2013 - 06:12 PM

It's perfectly fine, even preferable, to keep your process split out into separate scripts. There are plenty of arguments for this from different sources, but I personally think that the best reason is that human working memory is one of the biggest bottlenecks in our ability to write software. It's dangerous to put more functionality in one script (or calculation, including custom functions) than you can keep track of in your head. If you find yourself writing several comments in your scripts that outline what different sections of the script do to help you navigate the script, consider splitting those sections of the script into separate sub-scripts, using the outline comments for the sub-script names. If your parent scripts start to read less like computer code and more like plain English because you're encapsulating functionality into well-named sub-scripts, you're doing something right. (The sub-scripts don't have to be generalized or reusable for this to be a useful practice. Those are good qualities for scripts, but do the encapsulation first.)

 

I think you'd be better off if the scripts shared information with script parameters and results instead of using global variables, if you can help it. When you set a global variable, you have to understand what all your other scripts will do with it; and when you use the value in a global variable, you have to understand all the other scripts that may have set it. This has the potential to run up against the human working memory bottleneck very fast. With script parameters and results, you only ever have to think about what two scripts do with any given piece of information being shared: the sub-script and its parent script. Global variables can also lead to unintended consequences if the developer is sloppy. If one script doesn't clear a variable after the variable is no longer needed, another script might do the wrong thing based on the value remaining in that global variable. With script parameters, results, and local variables, the domain of possible consequences of a programming mistake is contained to one script. I might call it a meta best practice to use practices that limit the consequences of developer error. Globals are necessary for some things; just avoid globals if there are viable alternative approaches.
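The hazard is language-agnostic; as a toy Python sketch (all names invented for illustration, nothing FileMaker-specific), compare sharing through a global with passing a parameter:

```python
# A leftover value in shared "global" state leaks into the next call,
# while the parameter version has no hidden inputs.
state = {"customer_id": None}  # stands in for a FileMaker global variable

def ship_order_global():
    # Depends on whatever some earlier "script" left in the global.
    return f"shipping to customer {state['customer_id']}"

def ship_order_param(customer_id):
    # The whole input is visible at the call site.
    return f"shipping to customer {customer_id}"

state["customer_id"] = 42      # set by one script ...
print(ship_order_global())     # ... and consumed correctly here
# A later script forgets to clear the global:
print(ship_order_global())     # still customer 42 -- possibly the wrong order
print(ship_order_param(7))     # no hidden dependency to reason about
```

The parameter version can be read and debugged one call site at a time, which is exactly the working-memory argument above.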

 

There is a portability argument for packing more functionality into fewer code units — fewer big scripts might be easier to copy from one solution into another than more small scripts. Big scripts are also easier to copy correctly, since there are fewer dependencies to worry about than with several interdependent smaller scripts that achieve the same functionality. For scripts, this is often easy to solve with organization, such as by putting any interdependent scripts in the same folder with each other. Custom functions don't have folders, though. For custom functions, another argument for putting more in fewer functions is that one complicated single function can be made tail recursive, and therefore can handle more recursive calls than if any helper sub-functions are called. This ValueSort function might be much easier to work with if it called separate helper functions, but I decided that for this particular function, performance is more important.


  • 4


#408620 is this ironic or not?

Posted by Wim Decorte on 15 July 2013 - 08:12 AM

My 2 cents:

 

The place where you state your payment conditions and other important legal info is not the place to try to be funny.

Irony is an interpretation and you cannot expect everyone to "get it".  Even if they do "get it", you do not want to leave wiggle room for someone to take it literally and pay you in worthless currency.


  • 4


#408606 Server script on computer startup

Posted by Wim Decorte on 14 July 2013 - 12:03 PM

The trick is to use the fmsadmin command line, and that syntax is the same on Windows and Mac.  So the only challenge is calling the fmsadmin command line from your favourite OS scripting language: shell script / AppleScript on OS X, and batch file / VBScript / PowerShell on Windows.

 

Below is a sample VBScript that automates shutting down FMS in a safe way.  It should contain enough pointers to do the reverse.

' Author: Wim Decorte
' Version: 2.0
' Description: Uses the FileMaker Server command line to disconnect
'               all users And close all hosted files
' 
' This is a basic example.  This script is not meant as a finished product,
' its only purpose is as a learning & demo tool.
'
' This script does not have full Error handling.
' For instance, it will break if there are spaces in the FM file names.
' The script also does not handle infinite loops in disconnecting clients
' or closing files.
'
' This script is provided as is, without any implied warranty or support.

Const WshFinished = 1
q = Chr(34)  ' the " character, needed to wrap around paths with spaces


'--------------------------------------------------------------------------------------------
' Change these variables to match your setup

theAdminUser = ""
theAdminPW = ""
pathToSAtool = "C:\Program Files\FileMaker\FileMaker Server\Database Server\fmsadmin.exe"

'--------------------------------------------------------------------------------------------


SAT = "cmd /c " & q & pathToSAtool & q & " " ' watch the trailing space

callFMS = SAT
If Len(theAdminUser) > 0 Then
	callFMS = callFMS & " -u " & theAdminUser
End If
If Len(theAdminPW) > 0 Then
	callFMS = callFMS & " -p " & theAdminPW
End If

listClients = callFMS & " list clients"
disconnectClients = callFMS & " disconnect client -y"
listfiles = callFMS & " list files -s"
closeFiles = callFMS & " close file "
stopServer = callFMS & " stop server -y -t 15"

' hook into the Windows shell
Set sh = WScript.CreateObject("wscript.shell")

' get a list of all clients and force kick them off
clientIDs = getCurrentClients()
clientCount = UBound(clientIDs)

' loop through the clients and kick them off
If clientCount > 0 Then
		fullCommand = disconnectClients
		Set oExec = sh.Exec(fullCommand)
		' give FMS some time and then requery the list of clients
		Do Until oExec.Status = WshFinished
			WScript.Sleep 50
		Loop
		Do Until clientCount = 0
			WScript.Sleep 1000
			WScript.Echo "Waiting for clients to disconnect..."
			clientIDs = getCurrentClients()
			clientCount = UBound(clientIDs)
		Loop
End If

' get list of files and close them
fileIDs = getCurrentFiles()
fileCount = UBound(fileIDs)

' loop through the files and close them
If fileCount > 0 Then
	Do Until fileCount = 0
		fullCommand = closeFiles & fileIDs(0) & " -y"
		Set oExec = sh.Exec(fullCommand)
		' give FMS some time and then requery the list of files
		Do Until oExec.Status = WshFinished
			WScript.Sleep 50
		Loop
		fileIDs = getCurrentFiles()
		fileCount = UBound(fileIDs)

	Loop
End If

' all clients and files stopped
' shut down the database server (does not stop the FMS service!)
fullCommand = stopServer
Set oExec = sh.Exec(fullCommand)
Do Until oExec.Status = WshFinished
	WScript.Sleep 50
Loop

' done, exit the script
Set sh = Nothing
WScript.Quit
' ------------------------------------------------------------------------------

Function getCurrentClients()

	tempCount = 0
	Dim tempArray()
	Set oExec = sh.Exec(listClients)
	
	' in case there are no clients...
	If oexec.StdOut.AtEndOfStream Then Redim temparray(0)
	
	' read the output of the command
	Do While Not oExec.StdOut.AtEndOfStream
		strText = oExec.StdOut.ReadLine()
		strText = Replace(strtext, vbTab, "")
		Do Until InStr(strtext, "  ") = 0
			strText = Replace (strtext, "  ", " ")
		Loop
		If InStr(strText, "Client ID User Name Computer Name Ext Privilege") > 0 OR _
			InStr(strText, "ommiORB") > 0 OR _
			InStr(strText, "IP Address Is invalid Or inaccessible") > 0 Then
			' do nothing
			Redim temparray(0)
		Else
			tempClient = Split(strtext, " ")
			tempCount = tempCount + 1
			Redim Preserve tempArray(tempCount)
			tempArray(tempCount-1) = tempClient(0)
		End If
	Loop
	
	getCurrentClients = tempArray

End Function

Function getCurrentFiles()

	tempCount = 0
	Dim tempArray()
	Set oExec = sh.Exec(listfiles)
	
	' in case there are no files...
	If oexec.StdOut.AtEndOfStream Then Redim temparray(0)
	
	' read the output of the command
	Do While Not oExec.StdOut.AtEndOfStream
		strText = oExec.StdOut.ReadLine()
		strText = Replace(strtext, vbTab, "")
		Do Until InStr(strtext, "  ") = 0
			strText = Replace (strtext, "  ", " ")
		Loop
		If InStr(strText, "ID File Clients Size Status Enabled Extended Privileges") > 0 OR _
			InStr(strText, "ommiORB") > 0 OR _
			InStr(strText, "IP Address Is invalid Or inaccessible") > 0 OR _
			Left(strtext, 2) = "ID" Then
			' do nothing
			Redim temparray(0)
		Else
			tempFile = Split(strtext, " ")
			status = LCase(tempFile(4))
			If status = "normal" Then
				tempCount = tempCount + 1
				Redim Preserve tempArray(tempCount)
				tempArray(tempCount - 1) = tempFile(1) & ".fp7"
			End If
		End If
	Loop
	
	getCurrentFiles = tempArray

End Function
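For the "reverse", or for scripting the Mac side, the work is mostly assembling the same command lines. A rough Python sketch of that assembly (the fmsadmin path, credentials, and exact subcommand spellings here are assumptions mirroring the script above; check `fmsadmin help` on your version before relying on them):

```python
import shlex

def fms_commands(fmsadmin_path, user="", password=""):
    """Build the safe-shutdown sequence used by the VBScript above:
    disconnect all clients, close all hosted files, stop the Database
    Server. Path and credentials are placeholders for your own setup."""
    base = [fmsadmin_path]
    if user:
        base += ["-u", user]
    if password:
        base += ["-p", password]
    return [
        base + ["disconnect", "client", "-y"],
        base + ["close", "-y"],                 # close all hosted files
        base + ["stop", "server", "-y", "-t", "15"],
    ]

# Inspect the commands before handing each one to subprocess.run(cmd)
for cmd in fms_commands("/usr/bin/fmsadmin", user="admin"):
    print(shlex.join(cmd))
```

Like the VBScript, a real version should poll `list clients` / `list files` between steps rather than assume each command finished instantly.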




  • 4


#392001 "Strongest" way to pass multiple parameters.

Posted by jbante on 06 July 2012 - 06:25 AM

The two main schools of thought for passing multiple script parameters are return-delimited values and name-value pairs — the choice is a matter of taste, whether you prefer array or dictionary data structures. Both approaches have solutions for dealing with delimiters in the data. For return-delimited values, you can use the Quote function to escape return characters, and Evaluate to extract the original value:

Quote ( "parameter 1") & ¶ & Quote ( "parameter 2¶line 2" )

will give you the parameter:

"parameter 1"
"parameter 2¶line 2"

and you can retrieve the parameters:

Evaluate ( GetValue ( Get ( ScriptParameter ) ; 2 ) )

with the result:

parameter 2
line 2

Every name-value pair approach worth using has its own way of escaping delimiters. In my own opinion, XML-style syntax is unnecessarily verbose. My personal preference when working with name-value pairs these days is "FSON" syntax (think "FileMaker-native JSON"), which is where the parameters are formatted according to the variable declaration syntax of a Let function:

$parameter1 = "value 1";
$parameter2 = "value 2¶line2";

I only consider name-value pair functions practical when the actual syntax is abstracted away with custom functions. Once you do that, the mechanics of working with the different types should be interchangeable, such as any name-value pair function set that matches this standard. As long as you have a function to define a name-value pair in a parameter (# ( name ; value)) and another function to pull specific values back out again (#Get ( parameters ; name )), it works, and all variations should behave the same way. Any other supporting functions are purely for convenience, and probably can be re-implemented for whatever representation syntax you want. So I'd write:

# ( "parameter1" ; "value 1" ) & # ( "parameter2" ; "value 2¶line 2" )

and:

#Get ( Get ( ScriptParameter ) ; "parameter2" )

I only have 2 requirements for any parameter-passing syntax: 1. It can encode its own delimiter syntax as data in a parameter, and therefore any text value (already mentioned in this thread); and 2. It can arbitrarily nest parameters, i.e., I can include multiple sub-parameters within a single parameter. The requirements tend to work hand-in-hand with each other. If I can do those two things, I can construct everything else I will ever need within that framework.
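Those two requirements can be sketched outside FileMaker, too. A minimal Python analogue (function names invented for illustration; a CR character stands in for FileMaker's ¶) escapes the delimiter so any text round-trips, and nesting comes free because an encoded list is itself just text that can be escaped again:

```python
def encode(values):
    """Pack values into one CR-delimited parameter, escaping the delimiter."""
    return "\r".join(
        v.replace("\\", "\\\\").replace("\r", "\\r") for v in values
    )

def decode(packed):
    """Recover the original values by scanning escapes left to right."""
    values, buf, i = [], [], 0
    while i < len(packed):
        c = packed[i]
        if c == "\\" and i + 1 < len(packed):
            nxt = packed[i + 1]
            buf.append("\r" if nxt == "r" else nxt)
            i += 2
        elif c == "\r":
            values.append("".join(buf))
            buf = []
            i += 1
        else:
            buf.append(c)
            i += 1
    values.append("".join(buf))
    return values

# Requirement 1: any value survives, even one containing the delimiter ...
assert decode(encode(["a", "b\rc"])) == ["a", "b\rc"]
# Requirement 2: parameters nest -- an encoded list rides inside a parameter.
inner = encode(["x", "y"])
assert decode(decode(encode([inner, "meta"]))[0]) == ["x", "y"]
```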
  • 4


#384411 Creating a separation model from an existing solution

Posted by LaRetta on 12 March 2012 - 07:49 AM

No, you can't have pure separation. And yes, you might have to make more changes in the data file if your solution is still young (and occasionally throughout) but that is easily offset by time saved not migrating. I've designed some large systems using both separation and not. Here are the reasons that *I* prefer using separation and these reasons probably aren't the top reasons at all - but they are important to ME:

Transfer of Data (the best solution is NOT moving data)
If you use single file and design in another version (so Users can continue working in the served version) you have to migrate the data from the served file to your file and that can take hours even when scripted. Every time you move the data, you risk something going wrong and the likelihood increases each time you migrate.

With separation, the data file remains untouched.

Import Mapping (unsafe and time consuming)
Scripting the automatic export and import on every table in the single-file solution takes time as well. Every export and import script must be opened and checked because you have added (and probably deleted) fields. You can make a mistake here. And even if you don’t, the 'spontaneous import map' bug might get you. Dragging a field around in the import maps is very time-consuming and difficult (FM's poor import-mapping design doesn't help).

No data export/import or mapping with separation.

Data Integrity (and User Confidence at risk)
When working in a single served solution, you risk records not being created when scripts run (if you are in the field options calculation dialog), triggers firing based upon incorrect tab order that you just changed on a layout, and a slew of insidious underground breaks, most of which you will never see until much later. By then, all you will know is that it broke but not why. You risk jangled nerves and the lost faith of your Users who might experience these breaks, data losses, screen flashes and other oddities. A User’s faith in your solution is pure gold and it should not be risked needlessly.

External Source Sharing (provide ‘fairly’ clean tables for integration needs)
It has been discussed many times … why we have to wade through a hundred table occurrence names (particularly if using separation model anchor-buoy of which I'm not a fan anyway) and also have to view hundreds of calculations (which external sources don’t want). It can be a nightmare.

Much better with a data file – where calculations are kept to minimum and the table occurrences are easily identifiable base relationships and easy to exchange with MySQL, Oracle and others. If FM made calculations another layer (separate from data) then FileMaker would go to the top instantly … rapid development front-end and true data back end. But I digress …

Crashing (potential increases when Developing in live system)
Working in developer mode (scripts with loops, recursive custom functions), means that there is higher likelihood of crashing the solution when designing - admit it. If you crash the solution while in the schema, it is (usually) instant toast but regardless, you SHOULD ALWAYS replace it immediately (run Recover, export the data and import it into a new clean clone).

With separation model, you never design live so it is moot. No trashed file means no recover and no migration. You are designing as we all should … on a Development box.

Transferring Changes (even if careful)
Working in live single file system, a change you think is small might have unseen and unexpected side-effects. That potential danger increases as the complexity of your solution grows. This is why we all should beta test changes before going live. But even if careful and even if you design on Developer copy and test it, document your changes and then re-create those changes into the served file (eliminating the data migration portion completely), you risk mistyping a field name, script variable, not setting a checkbox and so forth.

Separation can also risk error, because it can mean adding a field or calculation to the Data file, but that is a very small percentage of the changes we make (except very early); most changes are layout objects, triggers, conditional formatting … UI changes. It is more difficult to replicate layout work between files. And most changes in the data file occur before the solution is even made live the first time.

Same Developer experience (Developer works in single file regardless)
In separation, all layouts and scripts reside in the UI file no matter how many Data files are linked as data sources. The only time you have to even open the Data file is to work in Manage Database or to set permissions.

Fewer Calculations (means faster solution?)
If one decides to go with Separation, the goal is no calculations or table occurrences in the data file, but that goal is not attainable. Still, since this is your goal, you try to find alternate methods to achieve the same thing to avoid adding calculations (which you should do anyway). You will use more conditional formatting, Let() to declare merge variables, rearranged script logic to handle some of it … there are many creative options. And once you start down that path, you will find that you need fewer calculations than you ever imagined.

Separation model means I can work safely on the UI aside and quickly replace it while leaving the data intact with the least risk. Some calcs need a nudge to update and privileges must be established in the data file as well – those are the only drawbacks I have experienced with it. I have never placed the UI on local stations. I have heard (from trusted sources) that it increases speed. There was a good discussion on separation model here: https://fmdev.filema...age/65167#65167

Everyone has their opinion. I did not want to go with separation and I was forced into it the first time (by a top Developer working with me, LOL). I am glad I was. Once I understood its simplicity and that everything happens in the UI, I was hooked. Of course if it is a very small solution for one person then I won't separate but for a growing business with data being served, I will certainly use it.

Next time your single-file solution goes through a series of crashes because of network (hardware) problems and you have to migrate every night, think about separation. Next time you need to work in field definitions but can't because Users are in the system, think about separation. Most businesses are constantly changing, and that means almost constantly making improvements to a solution. You can NOT guarantee that changes in Manage Database or scripts will not affect Users in the system.

Risk is simply not in my dictionary when it comes to data. Enough can go wrong even when we take all precautions. I don't dance with the devil nor do I run with scissors. :laugh:

Joseph, I think it would be easier to start from scratch because it makes you re-think each piece. And, as you learn, you will find better ways of re-creating each portion of it. I would also wait until the next version arrives to take full advantage of its new features. It won't be long now.
  • 4


#382527 Conditional format drop-down calendar?

Posted by LaRetta on 14 February 2012 - 11:17 AM

Hi Charity,

We all deal with wanting to eliminate the display of buttons, checkboxes, drop-down calendars and such on that last row when Allow Creation is in effect. Instead of having to eliminate the objects individually, you can accomplish it all with conditional formatting AND make it clear to your User that they click the last row to add a new record. Here is how:
  • Create a text box with the words [ Add new record here ... ]
  • Make the background transparent. Color of text does not matter
  • Resize the text box to same size as single portal row
  • Format the text box as follows (using conditional formatting)
  • First entry: Formula = not IsEmpty ( the primary key from the portal table occurrence ). Then below select ‘more formatting’ and set the font size to custom and 500.
  • Second entry: Formula = IsEmpty ( the primary key from the portal table occurrence ). And below set the text to white and the background to black as example. You can set it anything you wish.
Select this text box and Arrange > Bring to Front and place it over your top portal row. This one text box hides checkboxes, drop-down calendars, buttons, lines on empty fields, etc only on the empty row and also provides a clean row in which to provide your message.

I use it on all Allow Creation portals - just by changing the ID referenced within the conditional format portion.
  • 4


#426834 how to start with server if stupid

Posted by Ocean West on 19 August 2014 - 05:49 PM

Charity,

 

Since the best experience is when the end user has all the data they need local to their iPad, without relying on connectivity, you may try to do this:

 

Create your solution in ONE file with multiple tables: one table for your catalog, another for your customer data entry.

 

If your catalog only changes every so often that is fine you can push out a new version to the end user to replace their database with all the catalog data.

 

If they only need to add new customers and not look up historical customers, then you can essentially have them email data (export as TAB, CSV, or Excel) from the customer table for you to import into your own file every night.

(Not ideal, but it's a workaround.)

 

Since cost is a concern for you, please look at http://fmeasysync.com; it's a free, open-source framework for syncing your database. It may be a bit overwhelming at first, but I am confident that you can get there!

 

-

 

Charity, every year at the developer conference we (old salts) wonder where the "next generation" of FileMaker developers is coming from. If you have a passion for this and accept the challenges head on, you will most likely be reciprocating your learnt knowledge and wisdom in short time to other newbies jumping into the fray.

 

-

 

What goes into External Data Sources really depends on where the files will ultimately be deployed. If two files are hosted, then they would use relative paths:

file:MyDatabase.fmp12

If one file is local to a device, then it needs the fully qualified URL path with either the PUBLIC IP address or a domain name. If using a 10.0.x.x or 192.168.x.x address, this is an internal IP address and wouldn't be accessible outside your environment.

fmnet:/192.168.10.10/MyDatabase.fmp12
fmnet:/fms.somehostingcompany.com/MyDatabase.fmp12

If you would like to see how it can be accessed when hosted, please send me a private message and I could host a sample file for you that you can test with.

 

Stephen


  • 3


#421376 Global storage fields not keeping their data and driving me crazy...

Posted by Wim Decorte on 24 April 2014 - 03:52 PM

What Bruce is saying.

 

Global fields, just like global variables, are not meant for persistent data storage; their content basically dies at the end of your session.

Global fields can retain a value between sessions under some circumstances, but it is not considered best practice to rely on that.


  • 3


#419044 'Live, As-You-Type Search' With a twist?

Posted by comment on 05 March 2014 - 05:27 AM

What I would like is for just words/strings that start with 'Ma'

 

IIUC, you want to find only records where the field starts with [...]. For this, set the searched field to:

"==" & $thisterm & "*"

  • 3


#418304 All entry points

Posted by Steven H. Blackwell on 14 February 2014 - 05:44 AM

OK, first things first.

 

1. Someone needs to make an assessment of the level of sensitivity of the data that will be in these files and the level of adverse impact to the organization if a breach occurs.  That assessment can guide your decisions about the level of security needed.

 

2.  Create in each file a Privilege Set for each role that will be using the system.  Assign that role the privileges it needs.

 

3.  When #2 is completed--and it can be an on-going process--you can create Accounts for each person who will access the files.  Those Accounts can be internal to the files, or they can be externally authenticated by Active Directory, Open Directory, or local server groups.  Each Account is assigned to one of the Privilege Sets you created in #2.  This is done directly in the file for internal Accounts, or through matching Groups for external authentication.

 

4.  This system then propagates through to all client types:  FileMaker Pro, FileMaker Go, and WebDirect.™ When a user is challenged for credentials, the user enters his/her Account name and password.  If authenticated, the user has access to the files with the privileges granted in Step #2.

 

 

Finally, two other items.

 

If your server is not robust, you won't be using WebDirect for more than 4 or 5 users. Larger WebDirect deployments require very robust servers.

 

I would also recommend that you get someone who knows FileMaker Server and FileMaker security to come into your organization and give you some assistance before you deploy all of this. It sounds as if your situation is complex, and if you don't do this correctly at the outset you're going to be plagued with issues.

 

Steven


  • 3


#414646 ListFoundSet kinda...

Posted by Mark Scott on 03 December 2013 - 09:30 PM

Hi John,

 

I tested this using Todd Geist's Hyperlist file, which is preloaded with 200K records.  Up until now, the brilliant hyperlist technique has been the fastest way (that doesn't mess with the user's clipboard) to gather IDs.  It remains a wondrous piece of code optimization.

 

I created a new field for gathering IDs: "list_id" (the new "list of" summary field type, referencing the UUID-based "ID" primary key field).  Then, I wrote a benchmarking script to gather the 200,000 IDs into a global variable either by hyperlist or by GetSummary ( TestData::list_id ; TestData::list_id ) and report the delta time between start and end of ID-gathering step (using the new Get ( CurrentTimeUTCMilliseconds ) function).

 

The speed bump was impressive:  

 

     hyperlist:  6.8 sec (averaged over several runs*)

 

     GetSummary (list_id):  2.1 sec (averaged over several runs)

 

I then repeated it, but this time gathering values from an unstored field that grabs FM's internal record IDs ("Get ( RecordID )"), and a "list of" summary field:

 

     hyperlist:  8.4 sec (averaged over several runs; presumably higher than the UUID primary key field because the ids are now being gathered from an unstored field)

 

     GetSummary (list_id):  1.7 sec (averaged over several runs; despite IDs now coming from an unstored field, it performed even faster than the UUID test above, possibly because the values were all numeric and smaller in size than the alphanumeric UUID values)

 

Finally, I repeated the second set of tests, but with a stored field that grabs FM's internal record IDs:

 

     hyperlist:  6.0 sec (averaged over several runs)

 

     GetSummary (list_id):  1.0 sec!!!!!   (wow!)

 

(*Note that when I say "averaged," the variance was small.  In the first test, for example, the 3 times were 6.813, 6.781, and 6.801 sec.)

 

So, yes, I agree that this new feature will likely prove quite useful and is one of the cool "sleeper" features slipped quietly into this very marquee-feature-rich release.

 

Mind you, these tests were all just local; over the LAN/WAN a whole different set of kinetics obviously come into play.

 

Best,

 

Mark


  • 3


#414639 ListFoundSet kinda...

Posted by mr_vodka on 03 December 2013 - 08:39 PM

One of the features in FMP13 is a new option for a summary field that will gather the records into a list ( List Of ). Though it isn't exactly the calculation or function that we were looking for, it is great that we finally get something that has been needed for a while.

 

I haven't tried benchmarking it against copy all though.


  • 3


#414600 Version 12 icon for Mac users

Posted by rivet on 03 December 2013 - 11:06 AM

Well, version 13 has been released, but it has the same icon as v12.  Since you can run multiple versions on the Mac, a visual distinction always helps.

 

I have attached a new 'FM12DApp.icns' file to put into the app's Contents/Resources folder.

 

After you replace it, you can 'Get Info' on the app, select the icon in the top-left corner, then copy/paste to refresh the new icon.

Attached Files


  • 3


#413680 ExecuteSQL Custom Function

Posted by Wim Decorte on 10 November 2013 - 09:22 AM

FileMaker Server will try to execute the SQL query on the server so you will only get the result.  If that result is a record set of thousands of records then obviously that is what FMS will have to send you.

 

There is one exception to FMS doing the query for you: if you have an open record in your session for the target table then FMS will send you the whole table so that your client can perform the query including the results of your open record.  That is usually not a problem unless there are more than say 10,000 records in the target table.  And the wait becomes exponentially longer the more records in the target table.

 

An example:

 

Table with 1,000,000 records.  You construct a SQL query that results in 1 record.  You have no open records in the target table (if other people have open records, that does not matter) --> FMS will do the query and return you the data for the one record.  And it will be fast

 

Table with 1,000,000 records.  You construct a SQL query that results in 1 record.  You DO have open records in the target table (if other people have open records, that does not matter) --> FMS will send you the data for ALL 1,000,000 records and your local copy of FileMaker Pro will execute the SQL query.  It will be very slow.


  • 3


#412944 Why does ScriptMaster call home?

Posted by Jesse Barnum on 25 October 2013 - 12:38 PM

360Works plugins talk to a licensing server when they are first launched to validate the license key, and to ensure that the product is not exceeding the maximum number of licensed users.
 
In addition, when the plugin is shut down, it sends a signal to the license server asking it to decrement the count of connected users.
 
However, if that's all we did, then any unexpected shutdown would leave an orphan record showing a connected user, which would cause the count of connected users to be incorrectly higher than it should be. That's why we send a heartbeat signal every 30 seconds - if the license server does not receive this signal on a regular basis, it knows that FileMaker exited unexpectedly, and it decrements the user count.
 
Here is the data payload that we receive when the plugin connects. There is no personal data in here. This particular example is for ScriptMaster:
 
Section 1:
<LicenseCheck><RegisteredTo>[changed]</RegisteredTo><LicenseKey>[changed]</LicenseKey><ProductCode>48</ProductCode><MajorVersion>1</MajorVersion><MajorReleaseDate>1279166400000</MajorReleaseDate><VersionString>4.201</VersionString><ExtraInfo></ExtraInfo><FmEnvironment><Platform>Windows 2003</Platform><JavaVersion>1.7.0_21-b11</JavaVersion><Architecture>x86</Architecture><Language>en</Language><Country>US</Country><ApiVersion>52</ApiVersion><AppType>3</AppType></FmEnvironment></LicenseCheck>

Section 2:

<ApplicationVersion>ProAdvanced 11.0v2</ApplicationVersion><HostApplicationVersion>Server 10.0v2</HostApplicationVersion><HostIPAddress>[changed]</HostIPAddress><MultiUserState>2</MultiUserState><SystemIPAddress>[changed]</SystemIPAddress><SystemNICAddress>[changed]</SystemNICAddress><SystemPlatform>1</SystemPlatform><UserCount>0</UserCount>
 
We carefully engineered this to use minimal resources. The heartbeat signal is sent out by a background thread that only wakes up once every 30 seconds and uses just a few milliseconds of CPU time before going back to sleep. Since it's on a separate thread, it will not block the main thread from running, even if it is unable to communicate with the server for some reason. It also uses an extended HTTP keepalive socket so that it's not having to re-connect to the server for every request.

  • 3


#411610 How to prevent "scripting error (401)" in server log?

Posted by Wim Decorte on 26 September 2013 - 03:51 AM

Do an ExecuteSQL to do a quick select; if that comes up empty, you can skip the real FM find.


It's important to understand though that "Set Error Capture On" does not prevent the error from *happening*.  The only thing it does is hide the error from the user so that you can handle it silently.

If you run the same script in your debugger you will see that FM also reports the error.


  • 3

