
FileMaker uses 30 GB of memory and slows down after 400 records loaded



Hi 

I have 5,000 records with URLs to check in a web viewer (loop > open the URL in the web viewer ... check the source code, go to the next record).
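
Roughly, the loop looks like this (the object name "wv" and the field names are placeholders, not the real ones from my file):

  Go to Record/Request/Page [ First ]
  Loop
    Set Web Viewer [ Object Name: "wv" ; URL: Products::URL ]
    Pause/Resume Script [ Duration (seconds): 2 ]
    # read the page source from the web viewer and store it
    Set Field [ Products::Source ; GetLayoutObjectAttribute ( "wv" ; "content" ) ]
    Go to Record/Request/Page [ Next ; Exit after last: On ]
  End Loop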

After 400 records, more or less, it slows down and in the end does not respond at all.

I checked Activity Monitor and I see 30 GB.

If I close the file it keeps the 30 GB and keeps slowing down the whole Mac, but if I quit FileMaker and open it again it works well until 400 more records ...

 

Has anyone had to deal with such an issue?

 

Screen Shot 2020-12-11 at 16.37.57.png


Thank you Wim, I did some research and it seems to be a known issue.

I found out that the web viewer creates a memory leak, and the best thing to do is to have two windows, one for the web viewer and one for your fields; then after each load, reset the web viewer and wait 1 second:

  1. "Set Web Viewer [Object Name: "wv"; URL: ""]
  2. Set Web Viewer [Object Name: "wv"; Action: Reset]
  3. Pause/Resume Script [Duration (seconds😞 1]
  4. Close Window [Current window]"

Credit to "keeztha"; that was 4 years ago, maybe someone has sorted it out since ^^

Result: I keep one window and just reset and wait 1 second.
I use 3 times less memory and got to 800 records. I'm sure it will go further, but memory is still increasing and it will eventually freeze, so I keep looking.

Screen Shot 2020-12-11 at 18.57.24.png


Finally, the best thing to do:

 

Setup: 1 window, 1 web viewer, 1 portal.

Script:

- Loop

- Go to the portal, select the next row, set the web viewer to the URL (from the portal row)

- Reset the web viewer, wait 2 seconds, go to portal row n + 1

- End loop

 

With that I can go up to 5,000 records.
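
As FileMaker script steps, the loop is roughly this (the portal and web viewer object names, the field names, and the source-read step are placeholders/assumptions on my side, not the exact steps from my file):

  Go to Object [ Object Name: "portal" ]
  Go to Portal Row [ Select: On ; First ]
  Loop
    Set Web Viewer [ Object Name: "wv" ; URL: Lines::URL ]
    Pause/Resume Script [ Duration (seconds): 2 ]
    # grab the source, then reset the web viewer to release the loaded page
    Set Field [ Lines::Source ; GetLayoutObjectAttribute ( "wv" ; "content" ) ]
    Set Web Viewer [ Object Name: "wv" ; Action: Reset ]
    Pause/Resume Script [ Duration (seconds): 2 ]
    Go to Portal Row [ Select: On ; Next ; Exit after last: On ]
  End Loop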


I'm checking prices of products on Amazon; if I use cURL, the website detects it, and I don't want to deal with their API.

Since I reset every time, the issue now is captchas, which I didn't have before XD

 

Link to post
Share on other sites

Pretty sure that you're violating some sort of Amazon EULA by doing web site scraping.  So be careful that you don't end up on their blacklist.

They have APIs for this and they're not that hard; if you don't want to deal with them, find a local developer that you like and trust and get them to integrate for you.  Your solution will be a lot more stable, performant and scalable.
