
Oyseka
Members

Posts: 268
Joined
Last visited

About Oyseka
Birthday: September 28

Profile Information
Industry: Retired
Gender: Male
Location: South Africa
Interests: Wildlife

FileMaker Experience
Skill Level: Novice
Application: 18

Platform Environment
OS Platform: Mac
OS Version: 12.2, 10.14

Claris Partner
Certification: Not Certified
Oyseka's Achievements
-
Hi, I have seen your handle on very many posts. No. It is an identifier for a complex product against which parts are used during a service, and it forms part of the barcode that is scanned by the operator when servicing. This allows the operator to pull a BoM in the service and supply only parts that fit the product. The reason I specify complex is that a product may be sold on a quote that accepts no additional parts and is therefore not allocated a QItemSerial, as no service can be carried out on it.
-
My apologies, I thought you meant to validate the existing QItemSerial field as unique.
-
Really! I thought that if a field is specified as unique then, no matter where in the table a new record is created, validation would fail if that field held the same data as another record.
-
Yes, it is a one-time legacy data correction. Thank you for the lesson, this will do it. The issue with validating the field as unique is that every month a new record is created with the same QItemSerial; it is only unique within that quote.
-
Hi comment, yes I do want to identify them and then either delete or replace the duplicates, depending on the circumstances that led to their creation. There are not supposed to be any duplicates within an individual quote, but as the quotes are duplicated for each monthly service the QItemSerial is also duplicated. The QItemSerial is used to produce the barcode that is read into the handheld device during the service. An error on my part last year allowed them to be duplicated, so I now have to identify and rectify them. With about half a million quote lines I am trying to script the identification of which quotes have duplicates and the isolation of those duplicates; a sketch of one possible approach follows.
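A minimal sketch of one way to do that, by sorting and comparing adjacent records. The table name QuoteLines and the flag field DupFlag are assumed placeholders for illustration, not the actual schema.

Show All Records
Sort Records [ Restore; No dialog ]
# assumed sort order: QuoteId, then QItemSerial
Go to Record/Request/Page [ First ]
Set Variable [ $PrevKey; Value: "" ]
Loop
    Set Variable [ $Key; Value: QuoteLines::QuoteId & "|" & QuoteLines::QItemSerial ]
    If [ $Key = $PrevKey ]
        # mark the second and later occurrences within the same quote
        Set Field [ QuoteLines::DupFlag; 1 ]
    End If
    Set Variable [ $PrevKey; Value: $Key ]
    Go to Record/Request/Page [ Next; Exit after last ]
End Loop

A follow-up find on DupFlag = 1 then isolates the duplicates for review, and summarising by QuoteId shows which quotes are affected.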
-
Oyseka started following Clear documents from a temporary path, Find duplicates within found set, Disappearing field contents on printing and 1 other
-
Hi All, I am trying to find whether there are any duplicate serials within a quote, but I just find every line item and not the expected error 401. Any help appreciated. These are the find criteria: QuoteId = $QuoteId AND QItemSerial = ! The second screenshot is the find result.
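For what it is worth, my understanding is that the ! operator matches any record whose value in that field is duplicated anywhere in the table, and the QuoteId criterion only filters which of those table-wide duplicates are returned; it does not scope the duplicate check to the quote. One common workaround is a combined key; the calculation field name DupKey below is an assumption for illustration.

// DupKey: stored calculation field, text result
QuoteId & "|" & QItemSerial

A find with just ! in DupKey then returns only records whose serial repeats within the same quote, and error 401 comes back when there are none.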
-
Thank you comment, I was working on the file when you posted and have just discovered the problem. There is another invisible field on the layout which was obscuring the OrderNo field. I have now rebuilt the layout and the field contents are visible. Thank you for your attention.
-
Hi all, I am constuffulated trying to find out why a particular text field will not print its contents. The contents are visible in both Layout and Browse mode, and Object Visibility is NOT set to hide when printing. There is no Conditional Formatting applied and the "Hide object when" condition is not activated. I have placed a duplicate of the field in the header and the data shows there for the selected record, but it does not show in the body. The second screenshot is Browse mode and the third is Preview. Any help appreciated.
-
It is exactly the same, and when any development takes place (at least once a week) a new file must be downloaded to each device and an initial sync has to take place.
-
The working files are offline and the main file is hosted. In South Africa we spend a lot of time without even electricity, never mind an internet connection, so we cannot work with an online file. We do in fact use MirrorSync. It is not the data being synced that is starting to create problems; it is the volume that has to be found and organised while on site. There is also the consideration of expensive bandwidth. If 20 people have to download a new file in the morning and perform a first sync, that creates problems on our meagre resources, so the idea is to minimise the data that the mobile devices have to both download and process while somehow also keeping a total database with the historical data that we need.
-
Hi All. I have a database that is currently sitting at 760 MB with 425K records, and from the trend it will now grow at about 200K records a year. As it is primarily used on mobile devices, I think this growth will lead to local processing problems. I am looking for a way to "hive off" any records older than, say, 18 months from the distributed version of the database while keeping all records in an "archive" version, and then be able to add the new and modified records to the "archive" monthly and remove the oldest month from the distributed version. There are 27 tables involved. Any pointers for achieving this would be appreciated.
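Not a complete answer, but as a starting point here is a minimal sketch of the monthly purge on the distributed copy for one table, to be run only after the archive copy has been brought up to date (for example via MirrorSync or an import run in the archive file). The table QuoteLines, the date field ServiceDate and the 18-month cutoff are assumptions for illustration.

# delete everything older than 18 months from the distributed copy of one table
Set Error Capture [ On ]
Set Variable [ $Cutoff; Value: Date ( Month ( Get ( CurrentDate ) ) - 18 ; Day ( Get ( CurrentDate ) ) ; Year ( Get ( CurrentDate ) ) ) ]
Go to Layout [ "QuoteLines" ]
Enter Find Mode [ Pause: Off ]
Set Field [ QuoteLines::ServiceDate; "<" & GetAsText ( $Cutoff ) ]
Perform Find []
If [ Get ( LastError ) = 0 and Get ( FoundCount ) > 0 ]
    Delete All Records [ No dialog ]
End If

The same pattern would be repeated for each of the 27 tables, and running the archive import strictly before the delete is what protects the historical data.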
-
Good day all, I have a script that collects several reports from different tables and assembles them for email via the following code. How do I clear all the reports after they have been sent (or before the next set of reports is compiled)?

Set Variable [ $Path; Value: Get ( TemporaryPath ) & "Products Active" & " " & Year ( JobCardItems::DateNextServ ) & "-" & Month ( JobCardItems::DateNextServ ) & ".pdf" ]
Save Records as PDF [ File Name: “$Path”; Create folders: No; Records being browsed ] [ Document - ] [ Pages - Number Pages From: 1; Include: All pages ] [ Security - Printing: High Resolution; Editing: Any except extracting pages; Enable copying; Enable Screen Reader ] [ Initial View - Show: Pages Panel and Page; Page Layout: Single Page; Magnification: 100% ] [ Restore; No dialog ]
Set Variable [ $AttachmentList; Value: If ( IsEmpty ( $AttachmentList ) ; $Path ; $AttachmentList & "¶" & $Path ) ]

I have tried just setting the variable $Path; Value: "" but that did not work.
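Clearing $Path only resets the variable; the PDF itself stays on disk until the temporary folder is emptied when the client quits. One possible sketch, assuming FileMaker 18 or later where the Delete File script step is available, is to walk the same $AttachmentList after Send Mail and delete each file; the variable name $File is just for illustration.

# run after Send Mail has completed
Set Variable [ $i; Value: 0 ]
Loop
    Set Variable [ $i; Value: $i + 1 ]
    Exit Loop If [ $i > ValueCount ( $AttachmentList ) ]
    Set Variable [ $File; Value: GetValue ( $AttachmentList ; $i ) ]
    # remove one generated PDF from the temporary folder
    Delete File [ “$File” ]
End Loop
Set Variable [ $AttachmentList; Value: "" ]

Resetting $AttachmentList at the end also stops old paths from being re-attached on the next run.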
-
Thank you comment, that solves it
-
Hi All, I am trying to find duplicate records based on two fields, so my find criteria are: QItemSerial = ! and Type = 1. It is finding duplicates but also finding records that are not. What am I doing wrong? The screenshot below has the records sorted by QItemSerial.
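My understanding is that each ! criterion looks for duplicate values of that one field across the whole table, so a record whose QItemSerial repeats only under another Type still matches once the Type 1 filter is applied. As a quick illustrative check, an ExecuteSQL expression in the Data Viewer can list the serials that genuinely occur more than once with Type 1; the table name QuoteLines is an assumption.

ExecuteSQL (
    "SELECT \"QItemSerial\", COUNT(*)
     FROM \"QuoteLines\"
     WHERE \"Type\" = 1
     GROUP BY \"QItemSerial\"
     HAVING COUNT(*) > 1" ;
    ", " ; "¶"
)

To turn that into a found set, a calculation field concatenating the two fields and searched with ! is the usual approach.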