ghengis

Members
  • Content Count

    22
Community Reputation

0 Neutral

About ghengis

  • Rank
    novice
  1. I just did a find on the pilcrow ("¶") and replaced it with a semicolon. I then exported that field into Excel and used the semicolon as a delimiter. I was then able to split the fields, and I imported the two fields back into FileMaker. Problem solved. I would still like to find a solution using FileMaker, though; I know there must be a better way of doing this. Thanks,
  2. Hi, I would like to be able to split a field into two fields. The field contains text like the example below: Bridge House¶63-65 North Wharf Road I would like to split this at the pilcrow. An important note: the pilcrow is a literal character. I have tried using RightWords in a calculation field, but as the number of words varies from record to record I can't get a clean result. I can't see a text function that fits my needs. Any ideas? Thanks
  3. Excellent, it worked! I spent ages trying to figure this out. Maybe I can help you out some day. Thanks again. I can sleep now.
  4. Thanks guys. My fault, I should have explained myself better. What's a pilcrow?
  5. I thought that was the case, but this doesn't replace the ¶ in my field. At the moment the field looks like this: Title: Sent email¶Description: From: Peter Woodvine¶To: joshua.gledhill@mindshareworld.com I would like it to look like this: Title: Sent email Description: From: Peter Woodvine To: joshua.gledhill@mindshareworld.com I know this is really simple, but I'm stuck big time!
  6. Sorry to be a pain, but should I use Replace Field Contents and then use Substitute(my field; ¶; ¶ & ¶ & ¶)?
  7. Thanks, I will try it and let you know how I get on. This database is just a test DB that I'm using to experiment with; I'm trying to improve my scripting knowledge.
  8. Hi, I need some help. I have a notes field that has ¶ characters in it. I would like to be able to replace ¶ with carriage returns so that my notes are not bunched up. See below for example: Title: correct target audience, req. copy of BXL¶Description: ¶Title: DS: has not read the info. ring tuesday¶Description: ¶Title: DS-ring early next week¶Description: ¶Title: DS-Ringing bck, has not read BXL yet¶Description: ¶Title: DS-Patrick, int. but not for4-6 weeks¶Description: How can I do this?
  9. Thanks for the advice. I'll let you know how I get on.
  10. Good points to ponder. I would still like to identify the records that need attention. It's easy to spot records manually if you're only dealing with a small number of them, but what about thousands? Out of each pair, I would like to identify any records that don't have values in AddressLine1 and PostCode, and then mark them for attention. This is why I needed to group the duplicates and then find those that are missing AddressLine1 and PostCode. I understand your points about defining a complete record, and I take them on board. At this stage I'm still a novice.
  11. Sounds good, but say I wanted to automate the deletion process. I don't want to delete the records manually; I would like a script to find the most complete record and delete the incomplete record. What do you suggest?
  12. Yes, you got it. I want to group the pairs of dups. The reason I want to do this is that I want to deal with each pair, find the most complete record, and mark the incomplete record for deletion. What I define as a complete record is one that has a FirstName, LastName, AddressLine1 and PostCode. :(
  13. Hi, I have a file with 100 records and I have found 10 duplicate records. I would like to be able to group the duplicates: 10 duplicate records, grouped with a unique ID, producing 5 groups with group IDs 1-5. I know I have to find my dups first and then sort them, but I'm stuck on setting the Group ID field. How do I do this? Thanks in advance. Ghengis
  14. That's great, it works! Easy when you know how. I still don't understand everything you did; I have a lot to learn. What about if I wanted to mark every 7th record? Once again, thank you
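The split-at-the-pilcrow question in posts 1-2 boils down to breaking a text value in two at the first literal "¶". As a sketch of that logic outside FileMaker, this Python fragment mirrors it (the sample value comes from the post; the helper name is hypothetical):

```python
def split_on_pilcrow(value):
    """Split a text value into two parts at the first literal pilcrow (U+00B6)."""
    left, _, right = value.partition("\u00b6")
    return left, right

# Sample value from the post: "Bridge House¶63-65 North Wharf Road"
print(split_on_pilcrow("Bridge House\u00b663-65 North Wharf Road"))
# → ('Bridge House', '63-65 North Wharf Road')
```

Inside FileMaker itself, one common approach is a calculation built from Position with Left and Middle, which avoids the export-to-Excel round trip; the string logic is the same.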
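Posts 5-8 ask how to turn literal pilcrows back into real line breaks so the notes aren't bunched up. A minimal Python sketch of that substitution (the sample text is shortened from post 5; `unbunch` is a hypothetical name):

```python
def unbunch(notes):
    """Replace each literal pilcrow (U+00B6) with a real line break."""
    return notes.replace("\u00b6", "\n")

bunched = "Title: Sent email\u00b6Description: From: Peter Woodvine"
print(unbunch(bunched))
# Title: Sent email
# Description: From: Peter Woodvine
```

In FileMaker the usual tool for this is Replace Field Contents driven by a Substitute calculation, as the thread discusses.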
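Posts 10-13 describe grouping duplicate pairs under sequential group IDs and flagging whichever record of each pair is incomplete (missing AddressLine1 or PostCode). This Python sketch shows one way that logic could work on plain dictionaries; the field names follow the posts, while the duplicate key and helper names are assumptions:

```python
from itertools import groupby

def is_complete(rec):
    """A 'complete' record has FirstName, LastName, AddressLine1 and PostCode."""
    return all(rec.get(f) for f in ("FirstName", "LastName", "AddressLine1", "PostCode"))

def assign_group_ids(records):
    """Sort duplicates together, number each group 1, 2, 3, ...
    and flag incomplete records for deletion."""
    # Assumption: duplicates are detected on (FirstName, LastName).
    key = lambda r: (r["FirstName"], r["LastName"])
    out = []
    for gid, (_, group) in enumerate(groupby(sorted(records, key=key), key=key), start=1):
        for rec in group:
            rec = dict(rec, GroupID=gid)
            rec["MarkForDeletion"] = not is_complete(rec)
            out.append(rec)
    return out
```

For post 14's follow-up, marking every 7th record, FileMaker offers a calculation along the lines of Mod(Get(RecordNumber); 7) = 0.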
