
fmp7_user_fmp7fmp7

Members
  • Posts: 29
  • Joined

  • Last visited


fmp7_user_fmp7fmp7's Achievements

Contributor (5/14)

  • First Post
  • Collaborator
  • Conversation Starter
  • Week One Done
  • One Month Later

Recent Badges

0 Reputation

  1. This DB is for my own convenience and does not need to hold records for attendance or statistics. I spend hours sorting these textbooks, notices, tests, etc. What happened is that a test for a student named Allice Smith went into the right box, but after a long day at school grading papers I mistakenly put Alice Smithe's test into Allice Smith's box too. The two Alices got different grades, so this caused a lot of heartache for me and for Alice. If I matched on a serial number instead of a name, then, for example, if Allice Smith is serial number 524, all of her items would go into box 524 and there would be a lot less human error. Issue two is the serial numbers themselves: this is a physical constraint, since I only have 100 boxes, so I can't have unlimited serials.
  2. I need this to arrange their books, tests, quizzes, painting supplies, etc.
  3. I have a database of my students. Each student is in the database for a few classes. Every two weeks 90% of my students move on and I get new ones; the students who move on are deleted from the DB. What I'm trying to do is set up a unique serial number for each student, so that every class record for the same student carries the same serial number. For example: record 1, Jim Hitson, Math, serial 1; record 2, Jim Hitson, History, serial 1; record 3, Jim Hitson, Spelling, serial 1; record 4, Tiala Johnson, Math, serial 2; record 5, Tiala Johnson, History, serial 2; etc. Now comes the hard part: how do I get this serial number to reset every two weeks without messing up the serials of the students who do not move on? In the example above, Tiala Johnson would still have a serial of 2, but since Jim Hitson has moved on, the next new student would get the serial 1. I have to have the serials reset so that no serial ever goes over 1000; I cannot change this, or I would just use a regular auto-enter serial (a sketch of one way to recycle freed serials is after this list). Thank you for reading, and any help would be great!
  4. My database has anywhere between 45,000 and 350,000 records (it changes every day). I need to be able to export 2,000 records at a time until all the records have been exported, so the way I see it, 58 exports of 2,000 records would cover a record set of 116,000. That is going to take me a long time to type out in ScriptMaker. Does anyone know a faster way of doing this? I took a screenshot of the script from ScriptMaker so you can see my logic; by the way, count1 is just the record count divided by 2,000 (a loop-based sketch of the batching is after this list). Please help me! Picture_6.pdf
  5. The fields in the next database are as follows: record1, isbn_1, isbn_2, isbn_3, where isbn is the actual ISBN text. The purpose of moving the data there is to set up ordering for each individual ISBN. This part of the process I cannot change, so I am trying to figure out a quick way, in the attached database, to sort the data by price and ISBN and then number each accordingly. This numbering (1-3) will be the match field (isbn_1, isbn_2, isbn_3) with the next database so that an import can occur. If you look at the script sku1 you will see what I have come up with, but this script takes hours and hours to run. I am looking for a simpler way of doing this task.
  6. Yes, that will sort the records the correct way, but my problem is how to import them into the next database. What I did was create a script, Sku1, that adds a 1 if the location has the lowest number of items, adds a 2 for the next, and so on. This sets me up with isbn_1, isbn_2, isbn_3. I need to be able to use those three fields to match their partners for import into the next database. The only problem is that it takes a really long time for the script to run through all the data. I thought about using a portal, but I need to be able to create a field that is the match field for the import, and I can't get it to make a field in a portal. Does that make sense?
  7. I have a database with the following fields: isbn, location, qty. For each isbn there are 3 locations, each with a different quantity. This geographic info needs to be imported into another database by quantity, so that the lowest-quantity location is used first. I have a scripting solution that works, but it is very, very, very slow. Can anyone take a look and suggest another way of doing it (a ranking sketch is after this list)? location.zip
  8. Perfect! I was so stuck on portal/conditional value lists that I could not see how it could be done any other way. Thanks!
  9. How do I get only 3 records to be displayed in the field? I get every value of every record for the field, and all I need is the first 3, then the next 3, and so on until the records end.
  10. I need to take a certain field (upc) from every three records and export each block of three UPCs as one record, so the export would look like: line 1: 9999902244951, 987741000523, 982693000010; line 2: 724354004124, 724354004025, 724354003929; line 3: 606949007324, 606949007126, 606949006525. I know this may be done via a portal, but how do you export portal data (a grouping sketch is after this list)? Any help would be great!
  11. I have historical data on our school's races. For each day of competition in a race there are: student name, date, lowest time, highest time, start run time, end run time, number of runs. I don't know how to relate this data across multiple days for each student. I have one record per student and all the calculations set up, but I am lost on how to join day 1 times to day 2 times for a student so that I can compute the percent increase (a small data-shape sketch is after this list). I have attached the file so you can see what I have done. Any help would be great! school_st.zip
  12. Take a look at: "RUN APPLESCRIPTS ANY TIME OF THE DAY OR NIGHT. OK, so you've created some AppleScripts that do cool things like run your backup, download your email and update your stock quotes. But you've got better things to do than sit in front of your computer, running a script whenever you want those cool things to happen. iDo Script Scheduler lets you take advantage of your Mac's power and put those scripts to work for you, when you're at the movies, on vacation, or even sleeping like a baby." http://www.sophisticated.com/products/ido/ido_ss.html
  13. This is a small job for a skilled FMP person but a big problem for us. We are going to start selling our products on Amazon, but we need some help. We need FMP6 to log into our Amazon account and upload our products for sale. The two problems are cookies and the username + password. We have been using the Troi URL plugin, which has the capacity to set usernames, passwords, cookies, etc., but we have run out of time before our launch to figure out how to get the plugin to work. What we need is for you to work out all the kinks so that we can use FMP to upload our inventory onto Amazon. Whether you use the Troi URL plugin or figure out a different way to accomplish this task is totally up to you. We also have FMP7, so whether your solution involves FMP 7 or 6 does not matter to us. The FileMaker version is 6v4, and we work on a G5 Mac running 10.3.8. We are a small startup company, so we do not have a large budget, but we would love to pay someone for their time in taking care of this for us. We are located in Sacramento, Calif.
  14. Thanks, everyone, for your input!! I got BBEdit and upgraded the OS to 10.3.8. My new problem is how to actually do the find and replace: there are occurrences of " ;", "  ;", "   ;", and so on (a semicolon preceded by a varying number of spaces). So what I am doing is finding each unique variation of the space-and-semicolon pattern and replacing it with ",". I am then renaming the extension to make it into a CSV file. Once I did this and went to look at the file, it had some problems. Am I missing something, or is this the way to do it? Lee, you stated you might be interested in helping me with the pattern; I would love some help. The attached sample file contains an example of most of the patterns (a sketch of the replacement is after this list). Thanks again, everyone.
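
For item 3 above, a minimal sketch of one way to keep serials under a fixed cap: when a new student is added, give them the lowest serial number not currently held by an active student, so serials freed by departing students are recycled. The sketch is in Python rather than a FileMaker script, and the names in it are hypothetical; in FileMaker the same check would live in a script that scans the existing serials before assigning one.

    # Sketch: recycle serial numbers within a fixed cap (1..1000).
    # `active_serials` stands in for the serials still present in the student DB.
    MAX_SERIAL = 1000

    def next_serial(active_serials):
        """Return the lowest serial in 1..MAX_SERIAL that is not in use."""
        used = set(active_serials)
        for n in range(1, MAX_SERIAL + 1):
            if n not in used:
                return n
        raise RuntimeError("all 1000 serials are in use")

    # From the example in the post: Jim Hitson (serial 1) has moved on,
    # Tiala Johnson (serial 2) stays, so the next new student gets serial 1.
    print(next_serial([2]))   # -> 1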
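
For item 4, instead of typing 58 near-identical export steps, the usual shape is a single loop that exports 2,000 records, moves to the next block, and repeats until nothing is left (in ScriptMaker that would be a Loop with an Exit Loop If test). A sketch of the logic in Python, with hypothetical names:

    # Sketch: export a large found set in fixed-size batches of 2,000.
    # `records` stands in for the found set; `export_batch` stands in for
    # whatever step actually writes one file.
    BATCH_SIZE = 2000

    def export_in_batches(records, export_batch):
        for start in range(0, len(records), BATCH_SIZE):
            batch = records[start:start + BATCH_SIZE]
            export_batch(batch, batch_number=start // BATCH_SIZE + 1)

    # 116,000 records -> 58 calls to export_batch, matching the arithmetic in the post.
    export_in_batches(list(range(116000)),
                      lambda batch, batch_number: print(batch_number, len(batch)))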
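
For items 5-7, the slow per-record script can usually be replaced by a single sort-and-number pass: group the rows by isbn, sort each group (by quantity here, or by price and ISBN as in item 5), and assign 1, 2, 3 in that order. A sketch in Python with made-up sample rows:

    # Sketch: for each isbn, rank its three locations by quantity (lowest = 1).
    from collections import defaultdict

    rows = [                      # hypothetical (isbn, location, qty) data
        ("1111111111", "A", 12),
        ("1111111111", "B", 3),
        ("1111111111", "C", 7),
    ]

    by_isbn = defaultdict(list)
    for isbn, location, qty in rows:
        by_isbn[isbn].append((location, qty))

    ranks = {}                    # (isbn, location) -> 1, 2, or 3
    for isbn, locations in by_isbn.items():
        for rank, (location, qty) in enumerate(sorted(locations, key=lambda x: x[1]), start=1):
            ranks[(isbn, location)] = rank

    print(ranks)   # {('1111111111', 'B'): 1, ('1111111111', 'C'): 2, ('1111111111', 'A'): 3}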
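
For item 10 (and the related item 9), exporting every three UPCs as one line is just a matter of walking the values in blocks of three; no portal is needed if the data can be dumped and regrouped. A sketch using the UPCs quoted in the post:

    # Sketch: group every three UPC values into one comma-separated line.
    upcs = [
        "9999902244951", "987741000523", "982693000010",
        "724354004124", "724354004025", "724354003929",
        "606949007324", "606949007126", "606949006525",
    ]

    for i in range(0, len(upcs), 3):
        print(", ".join(upcs[i:i + 3]))
    # 9999902244951, 987741000523, 982693000010
    # 724354004124, 724354004025, 724354003929
    # 606949007324, 606949007126, 606949006525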
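
For item 11, one common shape is one record per student per day; once the data is in that form, consecutive days for the same student can be paired and the percent change computed. A sketch with hypothetical field names and placeholder values:

    # Sketch: pair consecutive race days per student and compute percent change
    # in lowest time. Names and numbers are placeholders, not the real data.
    from collections import defaultdict

    runs = [
        {"student": "A. Student", "date": "2005-03-01", "lowest_time": 64.2},
        {"student": "A. Student", "date": "2005-03-02", "lowest_time": 61.0},
    ]

    by_student = defaultdict(list)
    for r in runs:
        by_student[r["student"]].append(r)

    for student, days in by_student.items():
        days.sort(key=lambda r: r["date"])
        for prev, cur in zip(days, days[1:]):
            pct = (cur["lowest_time"] - prev["lowest_time"]) / prev["lowest_time"] * 100
            print(f"{student}: {prev['date']} -> {cur['date']}  {pct:+.1f}%")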
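
For item 14, assuming the separators really are runs of spaces followed by a semicolon, one grep-style pattern handles every variation at once instead of one find-and-replace per spacing: search for one or more spaces plus ";" and replace with ",". BBEdit's grep-enabled Find can do this directly; the same idea scripted in Python, with placeholder file names:

    # Sketch: turn " ;", "  ;", "   ;", ... separators into commas and save as CSV.
    import re

    with open("export.txt", "r") as src:
        text = src.read()

    cleaned = re.sub(r" +;", ",", text)    # any run of spaces followed by ";" -> ","

    with open("export.csv", "w") as dst:
        dst.write(cleaned)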