
Recommended Posts

Posted (edited)

Desired result - Greatly speed up the import of records in XML format from a URL.

Currently - Importing records 10 at a time from each XML response... slow.

Concept - instead, append each XML response to a text field as it arrives, then import all the records at the end in one pass.

Problem is I'm not coming up with a way to do this...

Any ideas?

PS - headed to bed but will check in first thing.

THANKS !!

Edited by Guest
Posted

You should be able to load each XML result set into a global field (using the web viewer and the GetLayoutObjectAttribute function to get the raw source, or a plug-in such as ScriptMaster, Troi URL, Moo Plug, etc.), and concatenate all the XML responses into a second global field or variable. When the process is complete, the accumulated text can be saved to a file (Export Field Contents) and the file imported. Be sure to clear the variable before the first request, e.g. set $This_Result to the raw result of the XML query, then $All_Results = $All_Results & $This_Result, and finally export $All_Results to a file.
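In rough script-step form, the accumulate-then-import loop might look like this (field, table, and file names here are placeholders, and the step that actually fetches the URL depends on whichever plug-in or method you use):

Set Variable [ $All_Results ; Value: "" ]
Loop
  # fetch the raw XML for the next batch of 10 ASINs (plug-in call, web viewer, etc.)
  Set Variable [ $This_Result ; Value: /* raw XML from this request */ ]
  Set Variable [ $All_Results ; Value: $All_Results & $This_Result ]
  Exit Loop If [ /* no ASINs left to request */ ]
End Loop
Set Field [ table::xml_result ; $All_Results ]
Export Field Contents [ table::xml_result ; "results.xml" ]
Import Records [ "results.xml" ; using your XSLT stylesheet ]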

Is this for an Amazon ECS webservices query? I ask out of curiosity since it limits queries to 10 results.

Posted

It is..... I'm trying to find a faster way to update product info. Currently it takes a bit longer to import 10 at a time than I want. WWWwwwaaayyyy too long! LOL

I'm thinking it would be quicker to get the responses first and then do 1 import at the end.

... (using the web viewer and the GetLayoutObjectAttribute function to get the raw source),

I'm thinking this wouldn't be fast enough, and I've had the web viewer lock up my program a time or two.

or use ScriptMaster,

I hadn't thought of this one... I'm not sure which function, so I'll have to look at it again. I do use it to upload price changes at the moment.

Troi URL, Moo Plug, etc

Any other thoughts or suggestions since you know what I'm dealing with?

Thanks for the ideas if not!

Posted

I've been playing with this a bit ... in between learning a song to sing for a wedding today ... and was wondering.....

Each response has a header/footer. Would you use a calc to strip them from each response, and then add a single header/footer back onto the combined result that gets exported to a file for import?

Posted

I do not use the FileMaker import functionality. I just cycle back to the first of the 10 records and programmatically make the updates upon receipt of the XML results. The first cycle through the list of 10 is to build the list of ASINs. I parse the XML tags programmatically to obtain the pricing data. The XML results for the set are stored in a variable, so the contents are retained across the 10 records. I wrote my own code to handle signatures, so the request is issued directly to the ECS servers.

This is quite fast -- indeed, it requires a PAUSE between successive requests, without which it will exceed the maximum number of permitted requests per minute on Amazon's server. Obviously, the request needs to be formed differently for the last batch, which typically will contain fewer than 10 items. I think it handles 300 or 400 items/min.

I do not use the web viewer for this purpose personally, because rendering the page is slow, and because it uses Internet Explorer and is buggy over thousands of requests, at least on a PC.

I've also implemented an API for Amazon Marketplace Web Services and FileMaker (inventory upload/report download, etc.).

Posted

You may not recall, but I bought the plug-in from you for the signature.... So I'm making direct requests also, but using the import feature and my stylesheet to do an import.

Anyway, interesting concept on updating the records. I never considered trying that instead of a direct import... which is actually a bit slow with the number of records I have. But I'm not quite following it all.

I haven't gotten into parsing data yet so I guess I have some learning to do on this stuff. But let me see if I understand you correctly.

1. Are you using the set field on a global text field to get the data? (I understand that fine if so)

2. Parse the field data...

3. After you parse the data on a group of 10 you set that to a variable ... local or global? ... adding each set of 10 to the variable?

4. Then at the end you update all of them at once?

And why is it obvious that the last request needs to be formed differently? I've not had a problem ... to my knowledge ... with the last few records during updating.

Thanks.....

Posted

#1: Yes, #2: Yes, #3: explained below:

First I build the ASIN list for the request, placing a comma after each ASIN unless I'm on the last record in the database, in which case no comma is suffixed. Then the request is submitted, and the results are stored in a global or script-level variable. Then I jump back 10 records using the Go to Record/Request/Page script step (unless we are fewer than 10 records from the end of the DB) and iterate through each of the 10 records again to set the prices.

The Position function is then used to find the starting and ending positions of the nth item block in the response (the nth occurrence of the opening and closing item tags). This XML block is stored in a second global or script variable, and all subsequent references are made on this variable, which represents the response for a single ASIN. Then I check that this second block contains the ASIN of the current record, i.e. Position ( $XML_Block ; "<ASIN>" & ASIN ; 1 ; 1 ) > 0 (if it doesn't, there's no data or a mismatch, and we don't want to set any fields). If it is a match, Position and Middle are used again to parse the lowest new, used, and collectible prices, these are stored in the fields for the current record, and the script proceeds to the next record. After 10 records, it builds another list of 10 items and makes another submission.

I only made this change about a month ago; it sped up data lookups tremendously vs. one-at-a-time requests. A pause will also be needed on most PCs, or too many requests/hour will be issued, causing the server to reject requests.
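A minimal sketch of that parsing step in script form (the tag, field, and variable names are illustrative, not necessarily the exact ECS element names):

# isolate the nth item block in the full response
Set Variable [ $start ; Value: Position ( $Results ; "<Item>" ; 1 ; $n ) ]
Set Variable [ $end ; Value: Position ( $Results ; "</Item>" ; 1 ; $n ) + Length ( "</Item>" ) ]
Set Variable [ $XML_Block ; Value: Middle ( $Results ; $start ; $end - $start ) ]
# only set fields if this block really belongs to the current record's ASIN
If [ Position ( $XML_Block ; "<ASIN>" & table_a::asin ; 1 ; 1 ) > 0 ]
  # parse each price out of $XML_Block with Position/Middle and store it
  Set Field [ table_a::price_new ; /* lowest new price parsed from $XML_Block */ ]
End If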

You might be right, I just assumed it wouldn't accept a comma after the last ASIN in a request, if no ASIN followed the comma?

Posted

I'm trying to digest all this but while I do let me tell you how I get my URL.

I have set up another table (table_b) that has several fields. Here are some of them to give you an idea....

1. all_asins - calc. field = Substitute ( List (table_a::asin ) ; ¶ ; "," )

2. asins_to_update - calc. field = Substitute ( List (table_a::asins_to_update ) ; ¶ ; "," )

In #2 I used two fields in table_a. If I want to update a record, I put a "Y" in the text field table_a::update. table_a::asins_to_update is a calc field: If ( table_a::update = "Y" ; table_a::asin ; "" )

Once I decide which list I want to run I have these fields:

table_b::all_text

table_b::first10

table_b::remaining

I then scripted to:

1. set table_b::all_text with the chosen list

2. reset it to = Left ( table_b::all_text;( Length ( table_b::all_text ) -1) )

3. reset again to = Substitute ( table_b::all_text ; "," ; "%2C" ) .... replacing the commas.

Now I have my list in correct form... Then...LOL...

1. set field table_b::first10 = Left ( table_b::all_text ; 127 )

2. set field table_b::remaining = Right ( table_b::all_text ; Length ( table_b::all_text ) - 130 )

I use the field table_b::first10 in my calc. field that forms my request.

Finally I use the field table_b::remaining to reset table_b::all_text, and it loops until table_b::first10 = 0 .... but I'm sure you can see all that ... hey, I'm still just above novice!
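In script-step form, steps 1-3 above boil down to something like this (just a sketch; it assumes the chosen list behaves the way my fields do, with one character to trim off the end):

Set Field [ table_b::all_text ; /* the chosen list, #1 or #2 */ ]
Set Field [ table_b::all_text ;
  Substitute (
    Left ( table_b::all_text ; Length ( table_b::all_text ) - 1 ) ;   // drop the last character
    "," ; "%2C" )   // URL-encode the remaining commas
]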

Anyway, that's the short version.........

Posted

OY VEY !!! Do I have a headache.........

I'm playing with just a single record and some calc fields using PatternCount, and came across a problem.

I'm counting the price tags to pick out New and Used, but if there's a Collectible price in the response it throws the count off! It makes New the 3rd occurrence instead of where I expect it, and the formatted, currency-converted version of each price gets in the way too. I'm trying to use just the plain integer amounts ... example: 1742 or 699 ... and reformat them with a decimal, but I'm missing something in my calc..........

Posted

I get the three prices separately (the lowest New, Used, and Collectible price elements) and divide the results by 100 to convert the integer price to USD (US dollars). I can't recall the reason and don't have the XML in front of me, but I elected not to use the formatted price element.

I use a subscript to extract the XML data, but a custom function could be used equally well. The subscript returns the value between a pair of tags. If you use the elements above, and operate on a single ASIN at a time (e.g. on the text block between the nth opening and closing item tags for item #n), they will be unique. I think there are some XML functions on Brian Dunning's site, such as ExtractData, that may prove useful.
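A custom-function version of that extraction might look roughly like this (the function name is made up, and the tag handling is deliberately simple):

// ExtractBetweenTags ( xml ; tag ) -- returns the text between <tag> and </tag>, or "" if not found
Let ( [
  open = "<" & tag & ">" ;
  close = "</" & tag & ">" ;
  start = Position ( xml ; open ; 1 ; 1 ) ;
  end = Position ( xml ; close ; 1 ; 1 )
] ;
  If ( start > 0 and end > start ;
    Middle ( xml ; start + Length ( open ) ; end - start - Length ( open ) ) ;
    "" )
)

// example use (tag name illustrative): GetAsNumber ( ExtractBetweenTags ( $XML_Block ; "Amount" ) ) / 100 gives dollars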

Posted

"Now just where DID I put that forest?"

Sad...Sad...Sad...LOL

And here I wrote a 4 page calc. to get it!

I've done some testing on a runtime and think I have the gist of it. For some odd reason, though, a calc field I set up in my other database is constantly refreshing when I add a ScriptMaster GetURL field like I have in the runtime..... but I'll get it.

Posted

So my thought ... simpler for me ... was to pull just the item data section out of each response and keep appending it to a global field, and I've got a calc field pulling that info.

I then tried to export that with:

table::xml_header_field & table::xml_itemdata_field & table::xml_footer_field ... saving as xml.
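(In script-step form the export would be something like this; xml_export here is just a scratch text field I'd add for the concatenation, and the file name is an example:)

Set Field [ table::xml_export ; table::xml_header_field & table::xml_itemdata_field & table::xml_footer_field ]
Export Field Contents [ table::xml_export ; "results.xml" ]
Import Records [ "results.xml" ; using the same XSLT stylesheet as before ]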

However when I try to import with my same old stylesheet all I get is the field names ... no data.

I'm sure the problem is in the header, but any ideas? :)

Posted (edited)

Can you post the header, footer, and body of what is being exported? From what you're describing, the global field already contains XML (joined from successive requests of 10 items each), and thus should be exported to a text file, NOT as XML. Also, the method previously described (programmatically parsing each item) should be so fast that you have to insert delays so as not to exceed the permitted number of Amazon queries per minute. What sort of speeds are you achieving?

Edited by Guest
Posted

Sorry for taking so long to reply. I had to step away from everything for a few days to go see my son and to get a fresh look.......... anyway.

Can you post the header, footer, and body of what is being exported?

Upon returning I saw the problem... sadly, I've made it before... I had the wrong version in the header. It wasn't 2009-10-01.

... Also, the method previously described (programmatically parsing each item) should be so fast that you have to insert delays so as not to exceed the permitted number of Amazon queries per minute.

Thanks for the reminder, but I didn't forget.

.... What sort of speeds are you achieving?

So anyway, I've been testing with just the standard 10 records, but I'll let you know before long.......

EDIT :)

My script is bogging down and I'm not sure why. I'm using a loop and have cleaned it up a bit so that it's all in one script.

table::list

table::asins

Both are global...... wondering if that's the problem ??


.......first part to get my list....

Set Field [ table::list ; ..... ]

Freeze Window  (Doesn't seem to work for me...)

Loop

  Set Field [ table::asins ; If ( Length ( table::list ) > 127 ; Left ( table::list ; 127 ) ; table::list ) ]

  Set Field [ table::list ; .... ]

  Set Field [ ... ]  (sets up the query)

  Set Field [ ... ]  (sets up the query)

  Set Field [ ... ]  (sets the URL)

  Set Field [ table::items text ; table::items text & table::items calc ]  (puts the calculation field holding the item data section into a text field...)

  Pause/Resume Script [ Duration (seconds): .5 ]

  Exit Loop If [ Length ( table::list ) = 0 ]

End Loop

.........



Posted

Update....... I finally switched methods due to ... well pretty much everything.

End Result:

1. Created a new table to hold the fields I want to update on a regular basis ... i.e. pricenew, priceused & salesrank (plus an asin field for #2)

2. Created a relationship on asin between the primary table and the new table, with new calc fields set to the matching fields in the new table.

3. The running script is now an XML import creating new records each time in the new table, which is much faster. (First attempt was 628 records/min... a bit too fast)

4. Set the pause to .5, .25 and then .05, trying to get just under 600, but it always seems to end up around 400/min. I need something to slow it just a fuzz, but not that much, so I'll keep playing.

Anyway, overall a much easier option than what I was trying. However, I did learn a lot more about working with FMP during the process ... so worth it.

Thanks all!!

Posted (edited)

I'm glad you have it working at full throttle.

One other change you might consider is having the script dynamically adjust the pause. If it initially uses a long pause for the first minute of requests, and then counts how many requests per minute are being made, it can dynamically calculate and adjust the pause for maximum speed. That may also help it stay within the request limit even on a faster PC, and adjust for changes in Amazon's server response times.
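As a rough sketch of what I mean (the variable names and the one-request-per-second budget are just placeholders; at 10 items per request, ~600 items/min works out to about one request per second):

Set Variable [ $batch_start ; Value: Get ( CurrentTimeStamp ) ]
# ... issue the request and process the 10 items ...
Set Variable [ $elapsed ; Value: Get ( CurrentTimeStamp ) - $batch_start ]
# pause for whatever is left of the 1-second budget (timestamps only resolve to whole seconds, so this is coarse)
Pause/Resume Script [ Duration (seconds): Max ( 1 - $elapsed ; 0 ) ]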

Which script step(s) were bogging it down before? From your message it sounded like it was the matching to the old table. Out of curiosity, how many records do you have, and what is the DB file size?

I run my repricing on an old P4, but I only have 40,000 records.

Edited by Guest
Posted

I'm glad you have it working at full throttle.

One other change you might consider is having the script dynamically adjust the pause....

LOL... yeah... a bit beyond me. Anyway, any pause I add, fraction or otherwise, slows me to 400/min on my laptop, which is an older 1.6 Celeron. I'm curious to try it on my main PC.

Which script step(s) were bogging it down before? From your message it sounded like it was the matching to the old table.

Yes, the matching/import was it. I started this database back on FMP v5, I think... maybe 7. It needs a LOT of cleaning up!!!

Out of curiosity, how many records do you have, and what is the DB file size?

I run my repricing on an old P4, but I only have 40,000 records.

I have almost 40,000 at the moment, and I'm not sure of the file size since it's on my main PC and I've been working on the laptop, in a program I created similar to ASellerTool, while I work out the bugs. It has a table of my listing quantities so when I'm shopping I don't buy 50 copies of something I'm not moving....

If it's OK send me a pm and I'll update you as I go along instead of posting useless stuff here for everyone.... or whatever.
