Cabinetman Posted June 25, 2009
Example: 1234567891234,12345,1234567891235,1234567891236,23456,1234567891237,345678,1234567891238,45678
I need to delete any values that are less than 13 digits, so that I end up with 1234567891234,1234567891235,1234567891236, etc. I'm not that good with calcs, or much else for that matter.
comment Posted June 25, 2009
It would be easier if they were separate records. The way you have it, it's either a custom function (requires the Advanced version) or a script. How many of these do you expect there to be?
Cabinetman (Author) Posted June 26, 2009
Internet went down last night... sorry.
1. Anywhere from 20 - 50 or so.
2. I currently use some custom functions.
In short, it's a scanner that is picking up 2 bar codes. After scanning a bunch, the hookup puts the memory into a text field and runs a script for me. The second set isn't always there, but if it is, it should be a constant 5 digits... I think. The most important one is ALWAYS 13 characters.
comment Posted June 26, 2009
"I currently use some custom functions."
You can use custom functions in any version, but you need Advanced to install one. Since you'll be running a script anyway, it doesn't really matter. Have your script do something like:
Set Variable [ $codes ; Substitute ( YourTable::Inputfield ; "," ; ¶ ) ]
Set Variable [ $i ; 1 ]
Loop
  Exit Loop If [ $i > ValueCount ( $codes ) ]
  Set Variable [ $code ; GetValue ( $codes ; $i ) ]
  If [ Length ( $code ) = 13 ]
    # DO SOMETHING WITH THIS CODE
  End If
  Set Variable [ $i ; $i + 1 ]
End Loop
Cabinetman (Author) Posted June 26, 2009
I didn't notice that my profile didn't show I have Advanced... I was searching through Brian's custom functions for something I thought I could make work that might be easier or quicker. Gonna give this a try. Thanks!
BTW - I'm guessing the # DO SOMETHING part would have to be something like inserting the code into another field and adding back the comma, so I end up with a list that only has the 13-digit numbers in it?
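Something like this is roughly what I'm picturing in place of the # DO SOMETHING step (the result field name is just a placeholder, not from my file):
Set Variable [ $result ; $result & $code & "," ]
...and then after End Loop:
Set Field [ YourTable::FilteredField ; $result ]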
comment Posted June 26, 2009
Well, I thought you wanted to do something with the codes that pass the test, like create records for them or something - so why not do it all at once? If they're meant to just lie there in a field, why bother filtering them?
Cabinetman (Author) Posted June 26, 2009
After I get the string correct, they are then pasted 10 at a time into another field, and new records ARE created from the list via an XML download... The first 10 are then deleted from the list and the next 10 inserted, until none remain. I'm just needing to get this first input corrected so I can complete my script... so maybe the script is best (?)
I think I like the CF idea a little better, even if the script may be easier for me... I've never been about 'easier', though. I still like to learn something as I go, and I haven't fooled with CFs in a year or so... But the final decision is: which is BETTER?
comment Posted June 26, 2009
I am a little confused: I don't see the significance of the number 10 here, and I don't know what "an XML download" is or how it's used to create records. I believe a script would be more appropriate here, because this way the entire logic of the process can be in one place.
---
BTW, if you program the scanner to insert carriage returns in-between the values instead of commas, you could skip the Substitute() part.
Cabinetman (Author) Posted June 26, 2009
Example using a 10-digit ISBN instead of a 13-digit EAN:
http://webservices.amazon.com/onca/xml?Service=AWSECommerceService&AWSAccessKeyId=**********&Operation=ItemLookup&ItemId=1569754519,1577315545,0786718579,0881791210,0884863123,0881792063,0441015905&ItemType=ISBN&ResponseGroup=Request,ItemAttributes,Large
I can only request up to 10 at a time... this example has fewer, but anyway... It returns XML that I can import to create new records or update existing ones - which I currently do. I scan about 50 books at a time, and when the scanner is plugged back in, it pastes them into the first field. From there my script puts 10 at a time into a global field, gets the XML, and creates the new records...
I have used the CF http://www.briandunning.com/cf/445 to get what I'm looking for... but stupid me can't remember how to get the last comma out!
remove_eans_from_string.zip (crude example)
comment Posted June 26, 2009
I see. Why don't you try something like:
Set Variable [ $codes ; Substitute ( YourTable::Inputfield ; "," ; ¶ ) ]
Set Variable [ $i ; 1 ]
Loop
  Exit Loop If [ $i > ValueCount ( $codes ) ]
  Set Variable [ $code ; GetValue ( $codes ; $i ) ]
  If [ Length ( $code ) = 13 ]
    Set Variable [ $requests ; $requests & $code & ¶ ]
  End If
  If [ not IsEmpty ( $requests ) and ( ValueCount ( $requests ) = 10 or $i = ValueCount ( $codes ) ) ]
    Set Variable [ $URL ; "http: ..." & Substitute ( $requests & ¶ ; [ "¶¶" ; "" ] ; [ ¶ ; "," ] ) & "&ItemType=..." ]
    Import Records [ -- use $URL here -- ]
    Set Variable [ $requests ; "" ]
  End If
  Set Variable [ $i ; $i + 1 ]
End Loop
I would still use a carriage return as the initial separator, because it would make things easier (FileMaker cannot count values separated by a comma).
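As for getting the last comma out of a string, a calculation along these lines should handle it (the field name here is just a placeholder):
Let ( list = YourTable::FilteredField ;
  // drop the final character only if it is a comma
  If ( Right ( list ; 1 ) = "," ; Left ( list ; Length ( list ) - 1 ) ; list )
)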
Cabinetman (Author) Posted June 26, 2009
Opinion on which is better... script or CF, as I did??? I haven't incorporated either yet...
comment Posted June 26, 2009
"Opinion on which is better... script or CF, as I did???"
I believe a script would be more appropriate here, because this way the entire logic of the process can be in one place.
Cabinetman (Author) Posted June 27, 2009
I went with the CF... It seems faster, which is more important to me at the moment. I plug in the scanner, it downloads all the scans, the calc field shows only the 13-digit EANs, and that field is instantly put into the main text field to run the script that imports new records 10 at a time. Thanks for your help and guidance. I'm keeping the script for reference later.
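For anyone who finds this thread later: a recursive custom function along these lines can do that kind of filtering. This is only a rough sketch of the general idea - the name FilterEAN13 and its exact logic are made up here, and it is not necessarily how the Brian Dunning function linked above works:
// Custom function: FilterEAN13 ( list )  -  keeps only the 13-character items of a comma-separated list
Let ( [
  // split off the first comma-separated item
  pos = Position ( list ; "," ; 1 ; 1 ) ;
  first = If ( pos ; Left ( list ; pos - 1 ) ; list ) ;
  rest = If ( pos ; Middle ( list ; pos + 1 ; Length ( list ) ) ; "" ) ;
  // keep the item only if it is exactly 13 characters long
  keep = If ( Length ( first ) = 13 ; first ) ;
  // recurse on the remainder of the list
  tail = If ( not IsEmpty ( rest ) ; FilterEAN13 ( rest ) )
] ;
  keep & If ( not IsEmpty ( keep ) and not IsEmpty ( tail ) ; "," ) & tail
)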