OlgerDiekstra last won the day on February 14

OlgerDiekstra had the most liked content!

Community Reputation

26 Excellent

About OlgerDiekstra

Profile Information

  • Title
    FileMaker Developer
  • Location
    Gold Coast, Australia

FileMaker Experience

  • FM Application
    16 Advanced

Recent Profile Visitors

8,539 profile views
  1. If I may ask, what's the purpose of using such IDs? How do you plan to use them?
  2. Rather than trying to solve this with technical measures, I'd suggest getting their manager/HR to sort it out. What makes you think they wouldn't steal whilst in the office? It's a bit riskier, but no doubt there are plenty of opportunities to save data unseen and move it out of your network, and connecting to Tor is quite trivial, especially as it doesn't require installation. Rather than attempt to fight this with technical solutions, get rid of the problem: collect proof, then hand it over to HR or their manager.
  3. Have a look at the Brian Dunning website: https://www.briandunning.com/cf/87 That's just one of them; searching for 'Luhn' will return a few more hits.
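The Luhn check itself is short enough to sketch. Here's a minimal Python version of the algorithm those custom functions implement (a generic illustration, not taken from any of the linked functions):

```python
def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    total = 0
    # Double every second digit from the right; subtract 9 if the result exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# "4532015112830366" is a commonly used valid test number.
```

The FileMaker custom functions do the same digit-doubling and mod-10 test in calculation syntax.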
  4. I don't use the 360Works plugins so can't tell you which script commands to use. The problem you're (most likely) encountering is that both your server desktop and terminal session are running in the userspace of the user you're logged in as. FileMaker Server doesn't run in that userspace, so the drive letters (and credentials) that you use in your userspace are not available to FileMaker Server. To make this work, you first have to connect to the remote shares with valid credentials from the environment the script runs in. So basically you need a 360Works plugin script step that lets you invoke the net use command (or a command shell from which you can run it): net use F: \\Server\share password /user:username Then you should be able to access the share and get a listing. I think Send Event should also work, though I usually use BaseElements for stuff like this.
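If whatever plugin you use only gives you a way to shell out, the call amounts to building and running that command. A minimal Python sketch, purely for illustration (server name, share, drive letter, and credentials are placeholders):

```python
import subprocess

def map_share(drive: str, unc_path: str, user: str, password: str) -> list:
    """Build the Windows `net use` command to map a UNC share to a drive letter."""
    return ["net", "use", drive, unc_path, password, f"/user:{user}"]

def run_map(drive: str, unc_path: str, user: str, password: str) -> None:
    """Run the mapping. Windows-only; shown for illustration, not invoked here."""
    subprocess.run(map_share(drive, unc_path, user, password), check=True)
```

The key point stands regardless of language: the mapping must be made in the same session the server-side script runs in, not in your interactive desktop session.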
  5. This https://www.gizmodo.com.au/2019/07/7-elevens-bad-app-design-let-criminals-steal-more-than-500000/ could well be an excellent example of technical debt and its cost. Possibly the developers meant to go back and fix the reset process, but pressure to deliver might have prevented that, or it may not have been documented and they simply forgot. A very expensive exercise.
  6. It's called Technical Debt. https://en.wikipedia.org/wiki/Technical_debt https://samuelmullen.com/articles/the-high-cost-of-technical-debt/ And like financial debt, it accrues interest: the longer you leave it in place, the more expensive it becomes to get rid of. You pay the interest in time (which is essentially money). I have inherited a DB as well that accumulated a lot of technical debt and interest (10 years' worth). It's a slow and sometimes painful process to turn it around. The best approach is to draw a line in the sand and start turning things around, little by little. You don't have to create a new database per se; you can work within what's there now and create new tables and relationships fit for purpose. 125k records isn't large. My database has almost 1 million records just for payments, 350k+ customers, double that in invoices, almost 600 layouts (probably more than half redundant) and 193 tables. It's a giant web, but one that I understand (mostly).
  7. If this is a hobby project, why not post your file and let people here tweak it a bit? It's much easier to see what you're trying to do when the actual database is available.
  8. Your diagram looks like an inverted anchor-buoy; for readability I'd flip it: the Survey TO is the anchor, and all other TOs hang off it. I agree that SurveyMoment is unnecessary based on the info provided. The Response table would ideally have Created (date) and CreatedBy fields and Answered (date) and AnsweredBy fields (along with Modified (date) and ModifiedBy fields). That automatically gives you the info SurveyMoment would store. Each Response record would also need to link to a Question (i.e. store the id of the question), as that is what it relates to; it doesn't really relate to the survey so much. Looking at the Survey and the Responses tells you nothing; you have to look at the Survey and Questions to make sense of the Responses. The only purpose a SurveyMoment table could serve is recording random, changing information about the time the survey is taken. I.e., sometimes recording the time of day might be useful, other times whether the survey was taken right after a meal. Or something else.
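As a sketch of the structure described above, here's the Response table and its join path expressed as SQL via Python's sqlite3 module. The table and field names are my own assumptions for illustration, not taken from the original solution:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Survey   (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE Question (id INTEGER PRIMARY KEY,
                       survey_id INTEGER REFERENCES Survey(id), text TEXT);
CREATE TABLE Response (
    id          INTEGER PRIMARY KEY,
    question_id INTEGER REFERENCES Question(id), -- relates to a Question, not the Survey
    answer      TEXT,
    created     TEXT, created_by  TEXT,          -- when/by whom the record was created
    answered    TEXT, answered_by TEXT,          -- when/by whom it was answered
    modified    TEXT, modified_by TEXT           -- last change
);
INSERT INTO Survey   VALUES (1, 'Onboarding');
INSERT INTO Question VALUES (1, 1, 'How was day one?');
INSERT INTO Response (id, question_id, answer) VALUES (1, 1, 'Great');
""")

# To read a Response meaningfully you join through Question, not Survey:
rows = conn.execute("""
    SELECT s.name, q.text, r.answer
    FROM Response r
    JOIN Question q ON q.id = r.question_id
    JOIN Survey   s ON s.id = q.survey_id
""").fetchall()
```

The created/answered/modified pairs make a separate SurveyMoment table redundant, which is the point made above.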
  9. With typed or computer-printed material, scanning and OCR is fairly accurate, though it still needs reviewing and correcting. Handwritten material is way harder, because handwriting can be very hard to discern, even for humans. Humans can deduce illegible parts by looking at the content around them, which is much harder for computers. Even if you did get reasonably correct imports, you'd have to check everything anyway for accuracy. I'd do some trials scanning the material and OCRing it. With ledgers you can't really have mistakes, as that throws all the figures out. OCRing stories is much more forgiving, as people can deduce incorrect words, but incorrect numbers throw your entire ledger out. If it really must be converted into a DB, the way I'd probably do it is to scan each page into a container and accompany it with some fields that specify the totals of each page (or other significant info). That will allow you to do calculations and review the scanned copies if need be.
  10. This calculation:

     Case (
       customer::Transpo 1 = "Rail" ; "Rail" ;
       customer::Transpo 1 = "Globus Garage" ; "Globus Garage" ;
       Transpo 1 = "Gambrinus" ; "Gambrinus" ;
       customer::Transpo 1 = "Tefra" ; "Tefra" ;
       customer::Transpo 1 = "Sprinter" ; "Sprinter" ;
       customer::Transpo 1 = "Individual" ; "Individual"
     )

     does nothing useful that I can see. It returns whatever value is already in customer::Transpo 1. You might as well return customer::Transpo 1 directly.
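To see why the Case() is redundant, here's the same logic translated to Python (values taken from the calculation above): every branch returns its own test value, so for any listed value the result equals the input.

```python
def transpo_case(value: str) -> str:
    """Direct translation of the Case() above: each test returns its own input."""
    for v in ["Rail", "Globus Garage", "Gambrinus", "Tefra", "Sprinter", "Individual"]:
        if value == v:
            return v
    return ""  # Case() with no default result returns empty

# For every listed value the function is the identity, so it adds nothing
# over simply referencing the field itself.
```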
  11. I think what the OP is after is a re-validation for certain (sensitive) areas in the solution as extra security (presumably to ensure the logged-in user hasn't left the workstation and someone else is accessing it using their account). This can be done by grabbing the logged-in user (using Get ( AccountName )) and using the Re-Login script step to re-login the same user. You can then capture the result and cancel the action if the password was not correct.
  12. CPUs can be swapped out easily too. Create a virtualised environment. Depending on your requirements, you then also have the option of creating a cluster of virtual hosts and load-balancing servers across the cluster. If one dies, performance might drop, but you're still up and running. Adding CPU (or any other) resources to VMs is trivial, and you can create backups of your entire VM and load it elsewhere quite easily. Upgrading your FMS then also becomes much less of a problem. E.g. I've got a Windows 2k8R2 server running FMS, which needs to be upgraded to Win2016 for me to be able to run v18. Using a VM I can simply install a new server within my environment, build it when it's convenient, transfer DBs for testing, and when I'm happy, replace the old server with the new one on a cold winter night. If I had a physical FMS server, I would need to purchase new hardware or work very hard over a weekend to ensure everything was back up and running come Monday. I could do that because the business I work for is closed over the weekend, but that's not true for a lot of businesses. In a clustered virtual environment, you could add a new host with heaps of resources, migrate all your VMs across, and no one would be the wiser. Then you decommission your old host(s).
  13. For sure you could. How exactly would depend on your requirements.
  14. I have an iPad app that uses two databases. One is mostly for the UI but has some tables with data; the other is predominantly an image repository used by the first database. The reason for this separation is that I don't have to redownload all the images (400 or so at the moment) every time I update the app. The image repository only has a couple of tables and barely any scripts or UI components. The UI part is only 1.5MB or so, as opposed to 150MB for the images. When loading images (done from the UI app, which has the image repository set up as a data source), I use auto-enter calcs to extract some data from the images (resolution, image name, etc.) for later use in the app. However, I found that the auto-enter calc fields don't fire when I insert data via the external data source (I have the image tables set up in my UI app's schema with relationships etc.). Does anyone know if this is expected behaviour? I would have thought the auto-enter calcs would fire regardless, but this doesn't appear to be the case when the table is in a different database. Forgot to mention, I'm on FM16.
  15. You're not creating a list of your CCs. Assuming a CC line of:

     "Person 1" <p.one@here.com>, "Person 2" <p.two@here.com>,p.three@here.com,"Person 4" <p.four@here.com>

     First turn the CC line into a value list, keeping the closing ">" intact so it can still be found:

     Substitute ( $CC ; [ ">," ; ">¶" ] ; [ "," ; "¶" ] )

     This will give you a list like this:

     "Person 1" <p.one@here.com>
     "Person 2" <p.two@here.com>
     p.three@here.com
     "Person 4" <p.four@here.com>

     Then loop over the values to extract just the email addresses (values without angle brackets, like p.three@here.com, are kept as-is):

     $count = 1
     $newCC = ""
     Loop
       $email = Trim ( GetValue ( $CC ; $count ) )
       If ( Position ( $email ; "<" ; 1 ; 1 ) > 0 )
         $email = Middle ( $email ; Position ( $email ; "<" ; 1 ; 1 ) + 1 ; Position ( $email ; ">" ; 1 ; 1 ) - Position ( $email ; "<" ; 1 ; 1 ) - 1 )
       End If
       $newCC = List ( $newCC ; $email )
       $count = $count + 1
       Exit Loop If ( $count > ValueCount ( $CC ) )
     End Loop

     $newCC will end up with a list of just the email addresses:

     p.one@here.com
     p.two@here.com
     p.three@here.com
     p.four@here.com
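For comparison, languages with a standard library address parser can do this in a couple of lines. Here's a Python sketch using email.utils.getaddresses on the same example CC line:

```python
from email.utils import getaddresses

cc = ('"Person 1" <p.one@here.com>, "Person 2" <p.two@here.com>,'
      'p.three@here.com,"Person 4" <p.four@here.com>')

# getaddresses returns (display-name, address) pairs; keep only the addresses.
addresses = [addr for _name, addr in getaddresses([cc])]
# addresses == ['p.one@here.com', 'p.two@here.com', 'p.three@here.com', 'p.four@here.com']
```

A real parser also copes with edge cases (commas inside quoted names, extra whitespace) that the simple Substitute approach above would mishandle.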