Wim Decorte

Members
  • Content count: 5,333
  • Joined
  • Last visited
  • Days Won: 149

Wim Decorte last won the day on September 21

Wim Decorte had the most liked content!

Community Reputation

456 Excellent

4 Followers

About Wim Decorte

  • Rank
    member
  • Birthday 12/17/1968

Profile Information

  • Title
    Sr. Technical Architect
  • Gender
    Male
  • Location
    Toronto

Contact Methods

  • Website URL
    www.soliantconsulting.com

FileMaker Experience

  • Skill Level
    Expert
  • FM Application
    16 Advanced

Platform Environment

  • OS Platform
    Mac
  • OS Version
    10.11
  1. better way to go through data

    my guys
  2. better way to go through data

    We differ on that point. Plenty of people are already proficient in parsing JSON. Heck, most of our guys are, and some of them presented on it at DevCon. And I would probably direct people towards learning JSON parsing over XSLT in any place where the two formats are available as outputs. That is obviously just a generalization; the task at hand and the skills at hand dictate what we use in which scenario. We try to have all the tools in our tool belt.
  3. better way to go through data

    What specifically would you want to be convinced of? Perhaps I can point you to some demos. "Easier" is often largely a matter of proficiency. Joost indicated that he is proficient at XSLT, so importing from XML is a good fit for him. For many others it is not. Few things are as fast as importing, but we do have to consider the whole mechanics. If Insert from URL can give you the JSON and you can parse it right there from a variable, instead of having to output something like XML to disk (and potentially the XSLT to disk as well) before importing, then just parsing the JSON could be faster. Just highlighting options.
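    A minimal sketch of that no-files-on-disk pattern in FileMaker 16 script steps; the URL and the JSON paths are hypothetical, not from the original posts:

    ```
    # Fetch the JSON straight into a variable -- nothing written to disk
    Insert from URL [ Select ; With dialog: Off ; Target: $json ;
        "https://example.com/api/orders" ]

    # Pull values out of the variable with the FM16 JSON functions
    Set Variable [ $firstOrderId ; Value: JSONGetElement ( $json ; "orders[0].id" ) ]
    Set Variable [ $orderCount ; Value:
        ValueCount ( JSONListKeys ( $json ; "orders" ) ) ]
    ```

    Compared to the XML route, there is no intermediate export and no XSLT to maintain; the trade-off is that you loop and set fields yourself instead of using the import engine.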
  4. better way to go through data

    Or JSON, if you are using FM 16 and that is an available format. Parsing JSON is fast and easy, and probably easier to pick up than XSLT if you are not already proficient in XSLT.
  5. Import XML to file from container?

    But it's still a valid approach. Not everyone is good at writing an XSLT to import the XML; some prefer to use the FM text parsing functions to get through the XML. I've done both, depending on the complexity of the XML document.
  6. Import XML to file from container?

    That's the basic mechanism: export field contents to a known location (you set that, variable or otherwise), import from that same known location.
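    In script steps, that basic mechanism might look like the following sketch; the field, file, and table names are hypothetical:

    ```
    # Build a known location in the temp folder
    Set Variable [ $path ; Value: Get ( TemporaryPath ) & "data.xml" ]

    # Write the container/text field out to that exact location...
    Export Field Contents [ Staging::XMLSource ; "$path" ]

    # ...then import from the same known location
    Import Records [ With dialog: Off ; "$path" ; Add ; XML ]
    ```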
  7. IIS 8.5 reverse proxy to FMPS 16

    Can you define the "this"? That would help us know what you need to achieve. For most deployments you don't need anything as fancy as a reverse proxy. You'd need one if you want to hide the actual resource, or if you have multiple servers inside your LAN that you need to route to, those kinds of things.
  8. IIS 8.5 reverse proxy to FMPS 16

    Did you have it working before 16 and it just now fails? Why the reverse proxy? Perhaps there is another way to achieve the same thing and work around the problem?
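    For reference, a reverse-proxy rule in IIS is typically an ARR/URL Rewrite entry in web.config, roughly along these lines. This assumes the URL Rewrite and Application Request Routing (ARR) modules are installed, and the internal host name is hypothetical:

    ```
    <system.webServer>
      <rewrite>
        <rules>
          <rule name="FMS reverse proxy" stopProcessing="true">
            <match url="(.*)" />
            <action type="Rewrite" url="https://fms.internal.example.com/{R:1}" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>
    ```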
  9. DDR - where to focus?

    That's why I recommended starting with the FMS logs; they keep good track of the four traditional machine bottlenecks: processing power, memory, disk I/O, and network I/O. If those stats show strain on any of them, the consideration becomes: how much strain, and is there a possibility of solving it with more horsepower?
  10. DDR - where to focus?

    Messy file references: any file reference that has multiple entries, some of which lead nowhere; multiple file reference entries that really point to the same file; and so on. Portals: understanding how a portal forces FMP and FMS to exchange data is crucial. If you have multiple portals (and worse: if the portals are filtered, sorted, or use a sorted relationship), then you have multiple of those interactions going on just to draw the layout. A lot of the performance problems I have had to deal with over the years come from too much unneeded data being displayed on screen.
  11. DDR - where to focus?

    Collect info from the users about what exactly is sluggish; otherwise you may be fixing things that don't need fixing. Also use the FMS stats log and the top call stats log to help put some numbers to it. While you wait for that, try to map out some of the usual suspects:
    - messy file references
    - layouts with more than 1 portal on them
    - sorted relationships
    - layouts with lots of unstored calcs and/or summary fields on them
    - ...
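    The top call stats log is off by default; if I recall correctly, on FMS 15/16 it can be toggled from the server's command line, roughly like this:

    ```
    # Enable / disable top call statistics logging (FileMaker Server CLI)
    fmsadmin enable topcallstats
    fmsadmin disable topcallstats
    ```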
  12. Problem when I use OnLayoutKeyStroke.

    Cross-post from FM Community: https://community.filemaker.com/thread/177497
  13. need windows 10 to store filemaker 16 password

    From the menu: File > File Options > the "open" tab
  14. need windows 10 to store filemaker 16 password

    The ability to store the credentials in Windows' Credential Manager is an option set on the FM file. If the developer turned that option off then you can't use it.
  15. Does FQL actually work?

    As a rule, FQL is executed on the server. It is executed on the client if that client has an open (locked, uncommitted) record in the target table, meaning any table that is part of the SQL query. Only the executing client's open-record state counts; if other users have open records, that does not matter. Obviously their data will not be in the result set, since neither the server nor the executing client knows about that data yet; it has not been committed.

    When the server executes the FQL, nothing is cached on the client, because no actual record data was sent down. If the client executes the query, the server sends *ALL THE DATA FOR ALL THE RECORDS* in the target table, and that is cached on the client as much as the client's cache allows. It is that sending of all the data from the server to the client that is responsible for the slowdown. You can actually see this in action by looking at the FMS stats log, in the "Network KB Out" counter.

    If there are not a lot of records in that target table, the penalty is not high, but it scales linearly with the record count. In the demo file I linked to earlier, I have two tables to test with: one with about 100,000 records and one with 1.5 million.
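    One way to see the two execution paths side by side; the table name is hypothetical, and the second call only runs client-side because of the open record:

    ```
    # No open record in Invoices: the query executes server-side
    Set Variable [ $total ; Value:
        ExecuteSQL ( "SELECT COUNT(*) FROM Invoices" ; "" ; "" ) ]

    # Open a record first and the same call runs client-side;
    # the server streams the whole Invoices table down to the client,
    # visible as a spike in the stats log's "Network KB Out" counter
    Open Record/Request
    Set Variable [ $total ; Value:
        ExecuteSQL ( "SELECT COUNT(*) FROM Invoices" ; "" ; "" ) ]
    ```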