
XML importing many or multiples of the same data field


Cabinetman

This topic is 6066 days old. Please don't post here. Open a new topic instead.

Recommended Posts

I thought maybe I better start fresh but this does relate back to the Post:

Importing from Amazon Web service with the ISBN (Topic#185162)

That sounds right, a table for each of the "many" entities; especially things like author. You do not really need "allow creation" on, because you're going to be importing all the data, not entering it. You would need to import the main table first, capture the FileMaker serial ID, then Replace it into each child after importing.

Or, alternatively, use the ID of the item to tie them together, importing it with each child. The way to do that is to reach "up" the hierarchy in the xsl, using the syntax "../" to denote "up a level" in the path.
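In sketch form, something like this (the element paths are my guesses at the Amazon schema; FileMaker's FMPXMLRESULT grammar supplies the ROW/COL/DATA wrapping):

```xml
<!-- loop over each Actor; reach back up two levels for the parent Item's ASIN -->
<xsl:for-each select="amz:Items/amz:Item/amz:ItemAttributes/amz:Actor">
  <ROW MODID="0" RECORDID="0">
    <COL><DATA><xsl:value-of select="../../amz:ASIN"/></DATA></COL>  <!-- parent's ID -->
    <COL><DATA><xsl:value-of select="."/></DATA></COL>               <!-- actor name -->
  </ROW>
</xsl:for-each>
```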

The xml you originally posted is slightly different from what you're talking about now; is that just because that was a book and this is a movie? What is the xml now, or what command are you using to get it (and we'll get our own)?

I can't thank you guys enough for all the posts I've read on XSL and Amazon and also Calculations.

Using this info I've been able to set up my own Listing/Catalog database for inventory (mostly books and some CDs/DVDs which I sell on several sites) using Amazon's AWS for product info.

I've been able to set up my image download to put a face on a title... so to speak. But I had to use a plugin ($) since I'm Windows XP. I wish I could create my own but maybe in time.

My only other desire is just as this person has... Multiple returns of certain data, i.e. -Actor, Format, Languages, etc. and how to import those multiples.

I know that tables are the best way to go but it's more work than I want to do. I really just want a quick, down & dirty way to get it in for reference.

I don't have much need for this info except to verify the product. No need to export or actually do anything with it.

Isn't there a simpler way to get....

-

Alyssa Milano

Holly Marie Combs

Rose McGowan

....into 3 fields like actor1, actor2 & actor3 or even a repeating field?


Personally I would never do "actor1, actor2, ..." for several reasons. But I can see that you might sometimes want to put them all in one field (I wouldn't, being a relational fanatic, but you might).

In that case you can just process them within the  element. I think the Unix return would work on Windows also.
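Since the forum parser eats the code, here is roughly what I mean: one for-each inside the field's DATA element (the amz:Actor path is a guess at your xml):

```xml
<COL><DATA>
  <xsl:for-each select="amz:ItemAttributes/amz:Actor">
    <xsl:value-of select="."/>
    <!-- a return between values, but not after the last one -->
    <xsl:if test="position() != last()">
      <xsl:text>&#10;</xsl:text>
    </xsl:if>
  </xsl:for-each>
</DATA></COL>
```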


No, not repeats, it imports as multiple lines. Which, as far as I'm concerned, is better than repetitions. But if you really want to try for repetitions, the separator is ASCII 29. It may work. It may crash and burn :(-|
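To make the two layouts concrete, here's a Python sketch (sample data invented) contrasting multi-line values in one field with the ASCII 29 repeating-field separator:

```python
# FileMaker stores repeating-field values separated by the group
# separator character, ASCII 29. This contrasts that layout with
# plain multi-line values in a single field.
actors = ["Alyssa Milano", "Holly Marie Combs", "Rose McGowan"]

# one value per line -- what a return separator in the xsl produces
multiline = "\n".join(actors)

# one repetition per value -- the ASCII 29 layout
repeating = chr(29).join(actors)

print(multiline)
print(repeating.split(chr(29)))
```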

Huuumm not quite working right.

If I go step by step and look at the data in the 'Format' field during the import process, it only shows the first row of data... If I look at the info in my browser I see this:



   Ellen Pompeo

   Sandra Oh

   1.78:1

   NR (Not Rated)

   DVD

   0786936300451

   Closed-captioned

   Color

   Dolby

   DVD-Video

   Widescreen

   NTSC

   0788861948

   Buena Vista

... But only the 'Closed-captioned' looks to be coming in at the FileMaker Import Mapping section.


Did you use , like the Actors? Because there are multiple format elements.

Maybe you should post your xsl file for us to look at. And the xml file also, if you've got it. Please zip it, then upload it. Otherwise the parser here will mess it up a bit (as you can see by looking at the posted code; it adds spaces, translates less-than symbols (randomly?), translates line endings in , etc.)


Don't laugh too much at me... Just remember I have no programming, XML or even FileMaker training. I started at FMP Ver. 3 building a database for medical billing and am now into Ver. 7 and online selling.

Here's what I've built thus far... LOL

Edit: Maybe it's the fields on the import side. After thinking about it, I don't know how the 'FileMaker Import Mapping' section could show more than the first.

amazon_info_search.zip

Edit: Oops, forgot the xml... If you already downloaded, I've updated it.

Yes, if you mean what data does it show when you're flipping thru looking at what's going to Import, it's only going to show the 1st line. I think that's always true. But this is what it returns:

Closed-captioned

Color

Dolby

DVD-Video

Widescreen

NTSC


Well, you weren't laughing too hard to type. That's good to see.

So I guess it's me when importing. I'm not sure how or what to do with the 'many' to split it up.

...or if anyone who knows and can tell me.

Thanks


I don't get what the problem is. It works; the formats come in as multiple lines. BTW, I get all the data you're importing using &ResponseGroup=Medium instead of "Heavy". It's a heck of a lot smaller file. Heavy has lots of reviews, as well as similar products, browse nodes, list mania, etc.

This is the FileMaker file I get.

ams_ASIN.fp7.zip


Mostly my ignorance............ seriously

I don't have my text field set correctly, I guess, and didn't see the import. I know it sounds dumb, but I have my text aligned top/left and it looks like this (take out the periods I used as spacers; based on field width it actually dropped farther down in mine):

Closed-captioned

......Color

......Dolby

......DVD-Video

......Widescreen

......NTSC

Instead of like yours:

Closed-captioned

Color

Dolby

DVD-Video

Widescreen

NTSC

What the heck am I doing? It imports with 7 tabs between each line............


Well, I don't really know. I'm on a Mac, and it could be a slight difference in the xml parser (though I thought it used the same one on both platforms). But sometimes you do get whitespace in text returned. Usually it's because it's there in the source data.

Probably the easiest thing to do is to define that field to use an auto-enter, by calculation, with [ ] Allow replace of existing data unchecked; and just use Substitute() to remove the extra stuff. If it's really tabs (you sure it isn't returns?), you're going to have to type one somewhere, then copy/paste it into the calculation dialog. In other words, post-process the data to remove extra whitespace.
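As a sketch of that post-processing idea (sample data invented to match the seven-tabs symptom; in FileMaker the equivalent would be Substitute() with a pasted-in tab character):

```python
# What arrived: seven tabs glued in front of each format value.
# Stripping every tab, as Substitute(field; tab; "") would,
# leaves clean one-value-per-line text.
raw = "Closed-captioned\n\t\t\t\t\t\t\tColor\n\t\t\t\t\t\t\tDolby"

cleaned = raw.replace("\t", "")  # drop every tab character
print(cleaned)
```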

You could also try putting full Windows returns in the xsl, instead of the Unix ones.





So I played with the XSL a bit, because as I look at your post I see the code has &#13;&#10; (the carriage return, I guess?) between the xsl:text and /xsl:text. Yet when I reply it looks just like it should. Anyway, I found this: if I hit the carriage return after it to drop to the next line so that it looks like yours, there are 7 tabs between them and it comes out like I showed. If instead I backspace 6 times, all is right in the universe... go figure. Oh, and it looks like this.

I don't know how you find time to help us but again .. THANKS!

PS:

1. As for the ='Large', I'm just keeping options open for down the road. May need the similar products.

2. Now if I can just get my own plugins that don't cost anything for downloading images and SMTP, this may be able to help someone else.



Yes, that code within the tags is for returns. The 2 of them are for Windows-specific returns. It's hard to post the darn things to the forums. The PHP engine insists on translating them to returns, even if you surround your code with code brackets. If you look at the code I posted it has it properly, as it would be in the xsl. To make it do this I have to post it once, and the PHP translates them. Then I have to Edit the post, put the real code in again, and resubmit. The 2nd time it does not translate them. I have no idea why it works this way but it does.


I don't know about SMTP on Windows, but you can get Wget for free (it's available for several platforms). It is like curl (which is built-in to Mac OS X, so is commonly used; curl is also available for Windows, but I found Wget easier to figure out). They both download web page source.

You can run Wget on Windows XP with the free Abstrakt Shell plug-in. There is also a newer plug-in, Spyder, which was distributed in the Sept. newsletter by fmwebschool. But I don't know how you'd download it; hopefully Stephen will put it on a web site somewhere.

It is fairly easy to use these tools, for this anyway (there are many things they can do, but this is simple).


I hate to ask because everyone's (you've) been so helpful.... but I'm throwing this one out anyway.

I'm stumped on this type of xsl: importing multiples of  that are in  into one or multiple fields. A similar problem, but different...

The response is:

True

title

ItemIds

7

1

0143034758

B000OJ6BBO

B000OPHBX0

B000OJA97G

It's simple, I'm sure, but after 3 hrs of trying different things I just can't get the XSL right.

This is my latest attempt to find a solution...

<?xml version="1.0" encoding="utf-8"?>



xmlns:amz="http://webservices.amazon.com/AWSECommerceService/2005-10-05" exclude-result-prefixes="amz">

[the rest of the xsl was stripped by the forum parser]


It's because your is above the level of the multiples. You do not want multiple data, you want multiple data.

So it's not:

It's more like:






[You might also write with the Item on a higher level, if there were several elements like inside an group. But in every case you'd want an for the ones you wanted to get multiples of, into one field.]

Edit: Added Request.

I'm still getting an XSL Parsing error...........

This doesn't look right somehow.

This is the same code as if all the ASINs are in the same response (like with the ), isn't it? Yet they are in different responses...

DVD

Closed Caption

but it's

0143034758

B000OJ6BBO

B000OPHBX0

...You do not want multiple data, you want multiple data.

Actually I think I do. There are multiple (s) under each with their own


Well me too............ but I'm still missing something and don't think it needs to be in there.

Isn't the 'Request' group closed, and isn't 'Item' off of 'Items'?

I see the full path of each ASIN as being:

amz:ItemSearchResponse/amz:Items/amz:Item/amz:ASIN

 Spaced as best I could....



004HEP4HQDZJH7Y5BQ1M

0.0570619106292725

True

title-begins:Alexander Hamilton and Author:Ron Chernow

ItemIds

Books

7

1

B000S8W9NA

B000OJAQAG

0147501555





I have............ but I still get an error





<?xml version="1.0" encoding="utf-8"?>



xmlns:amz="http://webservices.amazon.com/AWSECommerceService/2005-10-05" exclude-result-prefixes="amz">

[the rest of the xsl was stripped by the forum parser]


It is difficult to get this right unless you post exactly the xml you get, all of it. It would be best if you did this in zipped text files, so we don't have to remove the crap that Internet Explorer adds (dashes, non-breaking spaces, etc.). If you don't have an xml-capable text editor, you should get one. It makes life simpler.

The problem is that all the xml files we were dealing with before had an Amazon namespace included in the root element. Example:

The xml code you are posting now does not.

The xsl works if the namespace is included:

If it is not actually included (but I think it is), then you can remove the "amz" stuff, which I added, and the xsl will work also.
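To illustrate what I mean (the root element name here is from the ItemLookup response; the prefix binding is the part that matters):

```xml
<!-- the Amazon response declares a default namespace on its root: -->
<ItemLookupResponse
  xmlns="http://webservices.amazon.com/AWSECommerceService/2005-10-05">

<!-- the xsl binds that same URI to the "amz" prefix, so paths
     like amz:Items/amz:Item can match those elements: -->
<xsl:stylesheet version="1.0"
  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
  xmlns:amz="http://webservices.amazon.com/AWSECommerceService/2005-10-05"
  exclude-result-prefixes="amz">
```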


It is difficult to get this right unless you post exactly the xml you get...

The problem is that all the xml files we were dealing with before had an Amazon namespace included in the root element. Example:

The xml code you are posting now does not.

They do .....if I don't copy from I.E.

The xsl works if the namespace is included:

If it is not actually included (but I think it is), then you can remove the "amz" stuff, which I added, and the xsl will work also.

Here are the 2 XSL/XML.

amazon_info_search.xsl with amazon_info_search_small.xml (I actually use large as you know) is fine.

the other is my problem.

I noticed a problem and replaced the file amazon_info_search_poss_asins.xml


I don't know why you got that error. It could be an "encoding" error. But I don't think so. It's usually worded differently.

It could be that you have the xsl file locally and are trying to access the Amazon response via web services. That would often give a similar error. Though on my machine it would be a "socket" error. Once again, a slight platform difference in the wording of the error message; I'm mostly on a Mac. And I don't see that error much anymore, but I'm running FileMaker 8.5.

But basically if it was ever working before, with the setup you've got, then it should work again. You should restart your machine and try again.

Back to xsl. It is a logical error to say:

1.

and then later say:

2.

Because after the 1st one you are ALREADY within the Item node. Asking for it again once you're inside it is not going to match anything. It's like trying to go through a door into a room you're already in.

And there's no really "right" way to do it in all cases. In other words, you could remove either the 1st Item, or the 2nd, and it would work. Which you'd do would depend what you were trying to get. If you remove Item from #2 above, you get a separate FileMaker record for each ASIN, which is relationally correct, but not what you want. So you should remove the Item from #1 above, only have it in #2. Then you get what you want, multiple ASIN's in the same field, in the same record.

1.

2.
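In sketch form (the paths are guessed at the ItemSearch response):

```xml
<!-- #1: stop the outer loop at Items... -->
<xsl:for-each select="amz:ItemSearchResponse/amz:Items">
  <ROW MODID="0" RECORDID="0">
    <COL><DATA>
      <!-- #2: ...and walk each Item here, so every ASIN lands
           in the same field of the same record -->
      <xsl:for-each select="amz:Item">
        <xsl:value-of select="amz:ASIN"/>
        <xsl:if test="position() != last()">
          <xsl:text>&#13;&#10;</xsl:text>
        </xsl:if>
      </xsl:for-each>
    </DATA></COL>
  </ROW>
</xsl:for-each>
```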


But basically if it was ever working before, with the setup you've got, then it should work again. You should restart your machine and try again.

The one XSL - info_search - has worked and does work fine.

The second file - info_search_poss_asins - has never worked yet.

This now works fine....

<?xml version="1.0" encoding="UTF-8"?>



[the now-working xsl was stripped by the forum parser]


The utf-8 is not essential. If you're having trouble remove it. It's just what I keep as a standard for xml declarations; not required. In fact, you can copy/paste the code without it into a new document on your machine. Until you get a decent text editor no one can help with encoding problems, because they are invisible. What you can't see can hurt you, unfortunately. I have similar problems sometimes.


First a special shout out to Fenton! Thanks for all the help!! I'll go through and edit all these posts and put up the XSLs for others when I get a chance...

UPDATE

I finally have it ...almost!

So after trying some things I came to the conclusion...... XSL is ok after adapting.

Problem? Seems to be the link used to call the XML.

In a search with the title it had a space in between the words. So the http request failed.

If I copy it from the browser address window and paste it in during the script, it adds the %20 in the spaces and BAM!

So what calculation can I use to get a field with 'John Lennon' to come in as 'John%20Lennon'?
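For the record, a FileMaker calculation like Substitute(title; " "; "%20") covers the space-only case; here's a Python sketch of the same idea (title value invented), with urllib showing the general percent-encoding:

```python
# Percent-encode a title for the AWS request URL.
from urllib.parse import quote

title = "John Lennon"

# space-only case, like Substitute(title; " "; "%20")
print(title.replace(" ", "%20"))

# general case: encode anything unsafe in a URL path segment
print(quote(title))
```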

The saga continues yet more.....

Anyway here's 2 XSL files for searching Amazon.

URL for info search - by ASIN

http://ecs.amazonaws.com/onca/xml?Service=AWSECommerceService&AWSAccessKeyId=[yours]

&Operation=ItemLookup&ItemId=[itemid]&ItemType=ASIN&ResponseGroup=Request,ItemAttributes,Large

URL for title & author search - returns possible ASINs to list under

http://ecs.amazonaws.com/onca/xml?Service=AWSECommerceService&AWSAccessKeyId= [yours]

&Operation=ItemSearch&SearchIndex=Books&Power=title-begins:[title]%20and%20Author:[author]

&ResponseGroup=ItemIds


Tuesday AM:

Still working on updating my XSL and trying to add data.


EDITED FOR CONTENT....

Ellen Pompeo

Closed-captioned

0788861948

John Smith

Hardcover

1188861948





I want to get everything into just 1 record and each into 1 field: all of  into field1,  into field2,  into field3.



So it looks like:

Ellen Pompeo
John Smith

Closed-captioned
Hardcover

0788861948
1188861948

Is this called grouping or what?
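What I imagine the xsl needs is one for-each per COL, each walking all the Items for just one element (the paths are my guesses):

```xml
<ROW MODID="0" RECORDID="0">
  <!-- field1: every Actor, one per line -->
  <COL><DATA>
    <xsl:for-each select="amz:Items/amz:Item">
      <xsl:value-of select="amz:ItemAttributes/amz:Actor"/>
      <xsl:if test="position() != last()">
        <xsl:text>&#13;&#10;</xsl:text>
      </xsl:if>
    </xsl:for-each>
  </DATA></COL>
  <!-- field2 and field3: repeat the same pattern for Format and ASIN -->
</ROW>
```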



I'm working with different fields but trying the for-each  with each one:



[first for-each block stripped by the forum parser]

'need /data tag'

[second for-each block stripped by the forum parser]
But I get an error saying I need an tag in the place I show above....


You prematurely closed the 2nd . That line should not end with "/>"; it should end with ">".

But really, it should not be a "for-each" at all. Because there's only 1 SalesRank isn't there? I can't see it, so I don't know.

If so, it should be just "", which would end with "/>". And it doesn't need the text carriage return if there's only 1 thing.


Probably either would work, but if there's only 1 to get, then "for-each" is overkill, like doing a Loop in FileMaker to get values from 1 record.
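That is, for a single value, a plain value-of with no loop and no separator (assuming the SalesRank path):

```xml
<!-- one value: a self-closing value-of is enough -->
<COL><DATA><xsl:value-of select="amz:SalesRank"/></DATA></COL>
```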


Actually there will be a sales rank for each ASIN.

If the search returns 5 possible ASINs for a book called Alexander Hamilton

I want 1 record to import with the multiples in each field... like:

ASIN Rank Format

12345 1000 hardcover

12346.... ....softcover

12349 1255 audio

That way I can compare before listing... odd, I know, but some ISBNs will be listed with multiple ASINs yet be the exact same item.

One may be ranked 1 million and the other ranked 1 thousand because when searched by title it pops up first..... which is something I've come across.

That fixed the error but only the ASIN's are coming in.... I'll have to play some more to figure out why.

Edited: maybe because sales rank isn't under ItemAttributes! UGH!!

Edited:

Now to find out how to insert a carriage return if that instance is empty... as sometimes happens.
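Maybe something like this? (Just guessing at the path; the point is that value-of on a missing node emits nothing, while the return still goes out, so empty ranks hold their place in the column.)

```xml
<xsl:for-each select="amz:Items/amz:Item">
  <!-- emits the rank if present; an absent SalesRank yields an
       empty string, but the return below is written either way -->
  <xsl:value-of select="amz:SalesRank"/>
  <xsl:if test="position() != last()">
    <xsl:text>&#13;&#10;</xsl:text>
  </xsl:if>
</xsl:for-each>
```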

