
Posted

Hi,

I have 5,000 records, each with a URL to check in a web viewer (loop > open the URL in the web viewer > check the source code > go to the next record).

After roughly 400 records it slows down, and in the end it stops responding entirely.

I checked Activity Monitor and FileMaker is using 30 GB of memory.

If I close the file it keeps the 30 GB and keeps slowing down the whole Mac, but if I quit FileMaker and open it again it works well until about 400 more records...

Has anyone had to deal with such an issue?
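
Roughly, the loop looks like this in script steps (a sketch of what is described above, not the actual script; the table, field, and object names are placeholders):

  Go to Record/Request/Page [ First ]
  Loop
    Set Web Viewer [ Object Name: "wv" ; URL: Links::URL ]
    Pause/Resume Script [ Duration (seconds): 1 ]
    # read the loaded page source, e.g. GetLayoutObjectAttribute ( "wv" ; "content" )
    Go to Record/Request/Page [ Next ; Exit after last: On ]
  End Loop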

 

[Attachment: Screen Shot 2020-12-11 at 16.37.57.png]

Posted

That's an old version of FM there :)

In your loop, do a pause of a few seconds every 50 or so iterations.  See if that keeps the memory down.
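
A minimal sketch of that suggestion, placed inside the loop (assuming a counter variable $i that is incremented once per record; the interval and pause duration are just starting points):

  Set Variable [ $i ; Value: $i + 1 ]
  If [ Mod ( $i ; 50 ) = 0 ]
    Pause/Resume Script [ Duration (seconds): 3 ]
  End If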

Posted (edited)

Thank you Wim. I did some research and it does seem to be a known issue.

I found out that the web viewer causes a memory leak, and the best thing to do is to have two windows, one for the web viewer and one for your fields; then after each load, reset the web viewer and wait 1 second:

  1. "Set Web Viewer [Object Name: "wv"; URL: ""]
  2. Set Web Viewer [Object Name: "wv"; Action: Reset]
  3. Pause/Resume Script [Duration (seconds😞 1]
  4. Close Window [Current window]"

Credit to "keeztha". That was 4 years ago, so maybe someone has sorted it out since ^^

Result: I keep one window and just reset the web viewer and wait 1 second after each load. I use three times less memory and made it to 800 records. I'm sure it can go further, but memory is still increasing and it eventually freezes, so I keep looking.

[Attachment: Screen Shot 2020-12-11 at 18.57.24.png]

Edited by ibobo
Posted

Finally, the best thing to do:

Setup: 1 window, 1 web viewer, 1 portal.

Script:

- Loop
- Go to the portal, select the next row, and set the web viewer to the URL from that portal row
- Then reset the web viewer, wait 2 seconds, and go to portal row n + 1
- End Loop

With that I can go up to 5,000 records.
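
Written out as script steps, that approach might look like this (a sketch, not the exact script; the portal object, web viewer object, and field names are assumptions):

  Go to Object [ Object Name: "portal" ]
  Go to Portal Row [ First ]
  Loop
    Set Web Viewer [ Object Name: "wv" ; URL: Links::URL ]
    # read the page source here, then clean up before the next row
    Set Web Viewer [ Object Name: "wv" ; Action: Reset ]
    Pause/Resume Script [ Duration (seconds): 2 ]
    Go to Portal Row [ Next ; Exit after last: On ]
  End Loop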

Posted

I'm checking prices of products on Amazon; if I use cURL, the website detects it, and I don't want to deal with their API.
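
For reference, the cURL route in FileMaker would be the Insert from URL script step with cURL options; a minimal sketch (the URL and options here are illustrative, and this is exactly the kind of request Amazon tends to detect and block):

  Set Variable [ $url ; Value: "https://www.amazon.com/dp/XXXXXXXXXX" ]
  Insert from URL [ With dialog: Off ; Target: $html ; $url ; cURL options: "--location --user-agent \"Mozilla/5.0\"" ]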

Since I reset every time, the issue now is a captcha that I didn't get before XD

 

Posted

Pretty sure that you're violating some sort of Amazon EULA by doing web site scraping.  So be careful that you don't end up on their blacklist.

They have APIs for this and they're not that hard; if you don't want to deal with them, find a local developer that you like and trust and get them to integrate for you.  Your solution will be a lot more stable, performant and scalable.

Posted

Thanks Wim, yes, you're right about it being more stable and performant.

Next step )))

 
