
Greg Hains


Greg Hains last won the day on March 30

Greg Hains had the most liked content!

Community Reputation

4 Neutral

About Greg Hains

  • Birthday 12/26/1967

Profile Information

  • Location
    Brisbane, Australia
  • Interests
    Raspberry Pi, API development.

Platform Environment

  • OS Version
    High Sierra

FileMaker Partner

  • Certification
    Not Certified

  1. Comment, Perhaps I should have posted more information at the time - such as screen shots and maybe the file itself. My apologies if I muddied the waters at all. The solution you provided worked exactly how I wanted, but the error was mine in that I was not addressing the portals correctly. You still helped me resolve it. Greg
  2. Hi Comment, The error I made was refreshing the wrong portal name so the results never showed.
  3. Hi Comment. Problem fixed. I had applied the correct method all along, but there was a typo (doesn't matter how big or small though, does it?!) that had broken another part of the relationship. All working again. Thank you for your expertise once again. Ticket closed. Greg
  4. Hi Comment. The field holding the calculation (fragment) for the filter portal is t_Portal_Filter_By_Calc, which could contain text such as: "Product::QtyOnHand = 0" or "Product::QtyOnHand > 0"; that field is then used in the portal filter calculation. Greg
  5. Hi Comment. I have a single Products table. Three of the fields in there are: Product Code (text), Description (text), and QtyOnHand (numeric). I also have a keyword search tool there, as you can see from the PatternCount() call. I can display the various views, filters and keyword search results by duplicating the same portal many times and hiding the non-selected ones - each portal filter calculation is then straightforward - but that is messy and not terribly smart. I figure that if I can pass all the smarts to the portal filter calculation then I can pretty m
  6. Hi Comment, I think I've misunderstood how that function works - I've checked the FM Help pages. If the text field containing the calculation I want to pass to the portal filter reads "Product::QtyOnHand = 0" (without the quotes), when I Evaluate() it, the result is 1 - not the literal text as I would like. How do I get the literal text out of that field and into the portal filter calculation please? Greg
  7. Hi Comment. Thanks for responding. The formula works fine when hard-coded in, so I'll place the formula within Evaluate() as you suggest - thank you. Greg
  8. Good morning. I am trying to set up a portal for viewing products, where there are a number of conditions by which I view this portal, including a keyword search and filtering by other criteria. In the snippet below, the keyword search (second line) and the PatternCount() in the fourth line work fine, but the bit I cannot get to work is passing the third line (in bold). The content of this line needs to be dynamic - a field or variable that I point to. I know that if I put the actual condition text there (or True) it works fine, but not when it comes from a field or variable. I am
  9. Hi Comment, Great suggestion thank you - using Exact() - I will implement this. I will also use the Position() function as well. Many thanks as always. Greg
  10. Hello. I am experiencing difficulty trying to extract specific information out of a larger block of text. The block of text I have is several kilobytes long and is a response from an IMAP server - the complete email, header and all. Throughout this text are the words "date" and one instance of "Date". When I use MiddleWords or MiddleValue for "Date" I get all occurrences of the word, regardless of case. I then started looking for the condition where the first character of that word was Char(68) - which is capital D - yet it still finds all occurrences of the word. (I tried using
  11. Hi BCooney, Tutorial on using the package makes a huge difference - no, I didn't watch it before (hangs head in shame). Thanks, Greg
  12. Hi Comment, Thanks! I will try this shortly and see if it "kicks in" for me. Cheers, Greg
  13. Hi Comment, I appreciate the time you have spent explaining this to me. I have always preferred being taught the method than just being given the answer ("Give a man a fish, and you'll feed him for a day. Teach a man to fish, and you've fed him for a lifetime.") and you have provided me with lots of information here. Working with JSON is starting to make sense now. lol Also, thank you BCooney. I tried this Generator tool recently but found it rather confusing to use as it split all the data up into different sections and you have to run the tool over each section. If you don't kno
  14. Hi Comment, Yes, there was an error in the text I posted here. Apologies. Your solution worked well all the same. At the risk of wearing out the welcome mat... Whilst: JSONGetElement( $$_JSON ; "list" ) JSONGetElement( $$_JSON ; "list[0].main.temp_min" ) retrieves that (minimum temperature) data OK, how do I access the weather ID data please? It appears to be one level "deeper", within curly braces. I tried: JSONGetElement( $$_JSON ; "list[0].weather.id" ) and too many guessed variations of this, but failed to nail it. Thank you again. Greg Sample.JSON
  15. Hi Comment, Thank you once again - that worked perfectly (not that I expected otherwise). My error with the extension of the supplied file - I've been working with both XML and JSON files, so I just used the wrong extension - long, long nights. With the extra comma in the supplied file, that result was what I got back from the API call. Verbatim. I might need to contact the service and check with them. Error (aka lack of knowledge) was still with me, but that would have confused me even more. lol. Thank you. I know your contribution to these forums is immensely appreciated. 😄
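The Evaluate() technique discussed in the portal-filter posts above - keeping the filter condition as text in a field (t_Portal_Filter_By_Calc) and evaluating it inside the portal filter calculation - can be sketched as a rough Python analogue, with eval() standing in for FileMaker's Evaluate() and a hypothetical list of dicts standing in for the Products table:

```python
# Rough analogue of FileMaker's Evaluate()-based portal filtering:
# the filter condition lives as *text* in a field and is evaluated per row.

products = [  # stands in for the Products table (hypothetical data)
    {"ProductCode": "A100", "Description": "Widget", "QtyOnHand": 0},
    {"ProductCode": "B200", "Description": "Gadget", "QtyOnHand": 7},
]

# Equivalent of t_Portal_Filter_By_Calc holding "Product::QtyOnHand = 0"
filter_text = 'row["QtyOnHand"] == 0'

# Equivalent of the portal filter calc: Evaluate ( t_Portal_Filter_By_Calc )
filtered = []
for row in products:
    if eval(filter_text):  # "row" is in scope, like the portal row in FileMaker
        filtered.append(row)

print([row["ProductCode"] for row in filtered])  # → ['A100']
```

As the thread concludes, the approach itself is sound - the text condition evaluates to 1 (true) or 0 (false) per row - so once the portal names and relationships are correct, the dynamic filter works.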
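On the case-sensitive "Date" problem in post 10: FileMaker's Position() and PatternCount() are case-insensitive, which is why every "date" matched; Comment's suggestion of Exact() (and, natively, the fact that Substitute() is case-sensitive) is the key. A hedged Python sketch of the same idea, using a regex that is case-sensitive by default and a hypothetical header snippet:

```python
import re

# A fragment of an IMAP response (hypothetical); note the lowercase "date" decoy.
email_header = (
    "Received: from mail.example.com; sent date unknown\n"
    "Date: Thu, 26 Dec 2019 09:15:00 +1000\n"
    "Subject: an update\n"
)

# re.search is case-sensitive by default, so "Date:" will not match "date".
match = re.search(r"^Date: (.+)$", email_header, flags=re.MULTILINE)
date_value = match.group(1) if match else None
print(date_value)  # → Thu, 26 Dec 2019 09:15:00 +1000
```

In FileMaker itself, one equivalent trick is to Substitute() the exact string "Date:" for a unique marker (Substitute is case-sensitive) and then take the Position() of that marker.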
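On the JSON question in post 14: in an OpenWeatherMap-style forecast response, weather is itself an array of objects, so the path likely needs a second index - in FileMaker terms something like JSONGetElement ( $$_JSON ; "list[0].weather[0].id" ) (an assumption, since the actual Sample.JSON is not shown here). The same navigation in Python, against a minimal hypothetical stand-in for the response:

```python
import json

# Minimal stand-in for the forecast response (hypothetical values);
# note that "weather" is an ARRAY of objects, hence the extra [0].
sample = """
{
  "list": [
    {
      "main": {"temp_min": 3.2},
      "weather": [{"id": 800, "main": "Clear"}]
    }
  ]
}
"""

data = json.loads(sample)
temp_min = data["list"][0]["main"]["temp_min"]    # like "list[0].main.temp_min"
weather_id = data["list"][0]["weather"][0]["id"]  # like "list[0].weather[0].id"
print(temp_min, weather_id)  # → 3.2 800
```

The failed attempt "list[0].weather.id" treats weather as an object; adding the array index is what reaches the id one level "deeper".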
