jkluchnik last won the day on February 27 2012

jkluchnik had the most liked content!



  1. I'm looking for the ability to extract all of the words in a Word document that are wrapped in "<< >>", such as: Hello <<Name>>, Please respond by <<Date>>... I would like this to return: <<Name>> <<Date>>. I know the pattern match function in Scribe should do this, but I can't figure out what to put in the regex part. Is there a reference that explains this?
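[Editor's note: the pattern itself is independent of Scribe. A token wrapped in double angle brackets can be matched with something like <<[^<>]+>>. The Python sketch below shows the same idea; Scribe's own regex dialect may differ slightly, and the function name here is illustrative, not a Scribe API.]

```python
import re

def extract_placeholders(text):
    # Return every <<...>> token in the text, in order of appearance.
    # [^<>]+ matches one or more characters that are not angle brackets.
    return re.findall(r"<<[^<>]+>>", text)

sample = "Hello <<Name>>, Please respond by <<Date>>."
```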
  2. Hi Wim. I did that before posting; it doesn't seem to be the issue. Although I still have not found the answer to what I posted, I remembered that I encountered a similar problem some time ago and have just tried the same solution. It did work. The fix is to change the Perform Find [Restore] to Enter Find Mode, Set Field..., Perform Find. This works without a hitch; I had just forgotten about it. I would still be interested if someone has an answer as to why the other approach does not work, but I can live with the workaround. Thanks for taking the time to respond.
  3. I have the following script:

     Set Error Capture [ On ]
     Set Variable [ $filterDate; Value: Get ( CurrentDate ) - 2 ]
     Go to Layout [ "SecPrices" (SecPrices) ]
     Perform Find [ Specified Find Requests:
         Find Records; Criteria: SecPrices::date: "≤$filterDate" AND SecPrices::secPriceValid: "Invalid Price"
         Find Records; Criteria: SecPrices::date: "≤$filterDate" AND SecPrices::secPriceValid: "Ccy Mismatch" ] [ Restore ]
     Set Variable [ $lastError; Value: Get ( LastError ) ]
     Set Variable [ $foundcount; Value: Get ( FoundCount ) ]
     Send Mail [ Send via SMTP Server ] [ No dialog ]
     If [ $lastError ≠ 401 ]
         // Delete All Records [ No dialog ]
     End If

     The script works fine on a client, but run as a scheduled script on the server it ends up deleting all the records in the table. I have therefore commented out the Delete All Records step so it does not execute, and added a Send Mail step so I can capture the error. Running on a client, I get an email saying the error was 0 and the found count was 908. Run via a schedule on the server, however, I get error 500 and a found count of 21327 (the total number of records in the table). Error 500 is "Date value does not meet validation entry options". I cannot figure out where the problem is. I have tried replacing the "≤" in the find criteria with "..." but this does not help.
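[Editor's note: whatever the root cause on the server, the dangerous symptom is that a failed find leaves every record in the found set just before a Delete All Records step. One defensive pattern, sketched below in Python (the function name and conditions are illustrative, not FileMaker API), is to guard the destructive step on both the error code and a sanity check that the found set is a strict subset of the table.]

```python
def safe_to_delete(last_error, found_count, total_count):
    # Only allow the destructive step when the find reported no error
    # AND the found set is a non-empty strict subset of the table, so a
    # failed find that returns every record can never wipe the table.
    return last_error == 0 and 0 < found_count < total_count
```

With the numbers from the post: a healthy client run (error 0, 908 of 21327 found) passes the guard, while the server run (error 500, all 21327 found) fails it.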
  4. I have a field called accInterestEndDate with the following calculation:

     Let ( [
         ~nextEntry = GetNthRecord ( date ; Get ( RecordNumber ) + 1 )
     ] ;
     Case (
         IsEmpty ( ~nextEntry ) ; portfolioDate ;
         Min ( ~nextEntry ; portfolioDate )
     ) )

     In the first record, date equals 01.01.2011. In the second record, date equals 02.02.2011. portfolioDate is a global field that equals 06.11.12. With the above calculation, accInterestEndDate in the first record should be 02.02.2011. However, it calculates to 01.01.2011 (its own date field). I can fix this by changing the variable ~nextEntry to GetAsDate ( GetNthRecord ( date ; Get ( RecordNumber ) + 1 ) ); then the correct result of 02.02.2011 appears. However, I don't understand why this fixes the problem, and I don't like implementing a solution without understanding it. Now, where it gets really strange: I added a field called dateNextRecord with the calculation GetNthRecord ( date ; Get ( RecordNumber ) + 1 ), without wrapping it in GetAsDate(). Then I gave accInterestEndDate the calculation:

     Case ( IsEmpty ( dateNextRecord ) ; portfolioDate ; Min ( dateNextRecord ; portfolioDate ) )

     Strangely enough, this calculates correctly. My understanding of the GetAsDate() function is that it converts text to a date. However, as the date field and the portfolioDate field are already defined as date fields, I don't understand why this corrects the error. Any input would be helpful. Please see the attached file. DateCalc.fp7.zip
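[Editor's note: a plausible explanation, hedged since FileMaker's internal typing rules aren't spelled out here, is that the value captured in the Let variable is treated as text, and Min on text compares lexicographically rather than chronologically; GetAsDate restores date semantics. The Python analog below shows how the two orderings can disagree for dd.mm.yyyy strings (the helper name is illustrative).]

```python
from datetime import datetime

def as_date(s):
    # Parse a dd.mm.yyyy string into a real date value.
    return datetime.strptime(s, "%d.%m.%Y").date()

a, b = "09.01.2011", "10.12.2010"
min_as_text = min(a, b)               # lexicographic: "09..." < "10..."
min_as_date = min(a, b, key=as_date)  # chronological: Dec 2010 < Jan 2011
```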
  5. Hi Don, Thanks for your example and your comments. You may have missed that I wrote I actually have solved this as well, years ago and do it with a process which creates opening balance temp records, and then deletes them. I was just hoping to hear that someone much smarter than me has an obvious simple way to do this that I have been overlooking for years (this happens to me a lot on this forum!). Unfortunately, it seems that in this case I need to stay the course.
  6. Hi Mandy, Vaughan is absolutely right. Given the power of a relational database, you would have MUCH more flexibility, i.e., being able to enter the same violation more than once in a record, and much more power in your reporting. Having said that, based on what I can see from the layout you posted as an image, you could sum up the group totals and then the full total as follows. Assuming 2 violation categories and 3 different violations per category, you would need 2 category total fields and one full total field:

     Category 1 Total = PatternCount ( List ( Violation 1; Violation 2; Violation 3 ) ; "Y" )
     Category 2 Total = PatternCount ( List ( Violation 4; Violation 5; Violation 6 ) ; "Y" )
     Total = Sum ( Category 1; Category 2 )

     (As I don't have your sample file, I don't know how you have Y equaling 1; I am therefore assuming that the Y and N are just text values. If in fact you have a field, call it Violation X Value, that equals 1 when the violation is Y, then you could substitute the Category X Total formulas with Sum ( Violation 1 Value; Violation 2 Value ), etc.) See the attached file. Please note: a downside to this approach is that your report will then list all the violation categories, even when their value is zero. With Vaughan's approach, your reports would be much more concise, showing only actual violations. However, you may actually WANT the report to show the zero values. I know in my business we have situations where our auditors want to see an affirmative "zero", which shows an intentional entry of the negative. In a case like that, this approach may be exactly what is desired. Jorge mandy.fp7.zip
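[Editor's note: the counting logic is simple enough to sketch outside FileMaker. Assuming the violation fields hold exactly "Y" or "N" as text, as the formulas above assume, a Python analog of PatternCount ( List ( ... ) ; "Y" ) is just an exact-match count; the function name is illustrative.]

```python
def category_total(*violations):
    # Count how many violation fields are flagged "Y" (exact match).
    return sum(1 for v in violations if v == "Y")

cat1 = category_total("Y", "N", "Y")   # two violations in category 1
cat2 = category_total("N", "N", "Y")   # one violation in category 2
grand_total = cat1 + cat2
```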
  7. Unfortunately that is not a great solution for us. Much of our data is uploaded, or is entered automatically by other processes. Data may be entered into this table "out of order" so it means that every time a record comes in, it would have to reset that field throughout the entire table... Thanks for the suggestion Fitch!
  8. You see, that's why they call you..... well, comment You saved me a potential huge headache in my existing solution. I will now be changing a whole bunch of scripts!
  9. As I said, my "one function less" comment was just me trying to look smart; I have no idea if it makes any difference. I'm only trying to learn the different applications of the various functions... As for the way you wrapped the List and the Self in ¶ at the beginning and end: why do you do that? It was my understanding that the result of a List would be in the format Value & ¶. Why add the leading ¶?
  10. Comment, as I am always trying to learn... As I said, your method for duplicates is GREAT! It made me rethink a method I use for checking whether something already exists in a value list, and I tried it in your duplicates solution. It also worked. I am wondering if there is an advantage to using one method over the other, and if so, what it is. Your solution used the conditional format ValueCount ( FilterValues ( List ( Child::Category ) ; Self ) ) > 1, which uses three functions. I tried using PatternCount instead of ValueCount and FilterValues, adding the "¶" to the pattern, which is just two functions: PatternCount ( List ( Child::Category ) ; Self & "¶" ) > 1. I read a lot about "overhead" and the like, and I have no idea whether this makes any difference whatsoever. Again, just wondering out loud, as I always learn great things from you.
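[Editor's note: performance aside, the two expressions are not strictly interchangeable. The Python sketch below (using \n where FileMaker uses ¶; function names are illustrative) shows one behavioral difference: PatternCount matches substrings, so a value that happens to be the tail of another value can yield a false positive, and the last List() value carries no trailing ¶ at all, which the sketch papers over by appending one.]

```python
def dup_by_valuecount(values, self_value):
    # Analog of ValueCount ( FilterValues ( List ( ... ) ; Self ) ) > 1:
    # exact, whole-value matching.
    return values.count(self_value) > 1

def dup_by_patterncount(values, self_value):
    # Analog of PatternCount ( List ( ... ) ; Self & "¶" ) > 1:
    # substring matching against the delimited text. A trailing
    # delimiter is appended here; in FileMaker the last List() value
    # has none, one subtle difference between the two methods.
    joined = "\n".join(values) + "\n"
    return joined.count(self_value + "\n") > 1
```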
  11. I love your wry, 'minimalistic' statements. Great solution! I feel all that, and let's add STUPID, every time I am on this forum!
  12. Hello all. I have an issue for which I have been using the same solution for a long time, but I was thinking that perhaps I am missing something and one of you brilliant types may have a better way to solve this problem. We have several tables that keep numerical data that needs a running balance, with a total at the end. However, it is also sometimes necessary to pull a report on a sub-set of the data in the total, say for a particular date range. The problem is that if you restrict the data to a sub-set, the balances will only cover that sub-set. I have been solving this up to now with a process that finds the data up to one day prior to the desired sub-set, gets the balance of that data, creates a new "opening balance" record with that balance, and then runs the report including this opening balance record. This works. However, there are always drawbacks to using "temporary" records; for example, if the system fails in the middle of one of these reports, you now have data that does not belong there. Here is the original data set. Now, if you do a sub-summary report sorted by account and date, you get this: So far, so good. However, when you filter the data by the "filter dates" shown in the original data set, you end up with this result, which is wrong: By running a process that gets the balances of the accounts as of the "pre-filter" date and creates records, one can achieve the desired result: However, I am hoping there is a more elegant solution? I have attached a sample file with the data in the images. Thanks all! Balances.fp7.zip
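[Editor's note: the same idea can be expressed without persisting a temporary record at all: compute the pre-window balance on the fly and emit it as a synthetic opening-balance row in the report output. A minimal Python sketch of the approach follows; the data layout is assumed for illustration and is not taken from the attached file.]

```python
from datetime import date

def windowed_report(rows, start, end):
    # rows: (date, amount) pairs in chronological order.
    # Sum everything before the window into a synthetic opening
    # balance, then run the balance forward through the window only.
    opening = sum(amount for d, amount in rows if d < start)
    report = [("Opening balance", opening)]
    balance = opening
    for d, amount in rows:
        if start <= d <= end:
            balance += amount
            report.append((d, balance))
    return report

rows = [
    (date(2011, 1, 1), 100),
    (date(2011, 2, 1), -30),
    (date(2011, 3, 1), 50),
]
```

Because the opening balance exists only in the report output, a failure mid-report leaves nothing behind in the table.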
  13. Hey LaRetta. I disagree. You worded the problem very well, pointed them to this forum, and gave them a sample file. You were more than clear. The answer they gave makes absolutely no sense. "Expected behavior"? Translation: "We know this problem exists and expect that bad result but have no idea how to fix it".
  14. Here is the script Ralph. Hope it helps you. ServerScript.pdf
  15. Strangely enough, I have been able to solve the problem completely by changing the Perform Find [Restore] to Enter Find Mode, Set Field..., Perform Find. This works perfectly, so I have given up on the other approach, as nobody seems to have an idea why it fails.