Angus McKinnon

Profile Information

  • Location
    Glasgow, Scotland

FileMaker Experience

  • FM Application
    15 Advanced

Platform Environment

  • OS Version
    10.13 High Sierra


  1. That's worked perfectly, although the full layout has certainly got my computer working hard! There are a few variations I can do now that the basic structure is sound; it looks like it's going to be very useful for the analysis we're planning. The "heatmap" effect does work. It's not particularly accurate, because I've only got a few states for the conditional formatting, but zoomed out on a 4K monitor it gives a pretty good impression of the general trend.
  2. You're right - it's not quite a one-off, but it's not the sort of thing that's going to be rolled out on a weekly or monthly basis. The number of orders does vary quite wildly between customers. We have a few outliers in the 100-500 range; most are under 50. However, for the purposes of this report I think everyone would be quite content with the first ten orders each customer makes; by that point the pattern has been established. (We can also use the average order value as a comparison; that will take into account any dramatic changes outwith the range of ten.) Speed isn't too much of a worry; I can always leave it to crunch and export a PDF for viewing.
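The "first ten orders plus average order value" idea above could be sketched as follows. This is a minimal illustration, not the FileMaker implementation: the data shape (a chronologically sorted list of totals per customer) is a hypothetical stand-in for what a sorted relationship or ExecuteSQL query would return.

```python
# Sketch: truncate each customer's history to their first ten orders and
# compare the average order value against the first order. Data shape and
# field names are hypothetical.

def first_ten_summary(order_totals):
    """order_totals: chronologically sorted order totals for one customer."""
    first_ten = order_totals[:10]
    first = first_ten[0]
    average = sum(first_ten) / len(first_ten)
    return {"first_order": first, "average": average, "trend_up": average > first}

print(first_ten_summary([10, 25, 40, 120, 300]))
```

Because the average is taken over at most ten orders, a customer whose later orders grow dramatically still registers as "trend up" without a single outlier dominating the whole history.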
  3. I've chewed over this a bit more over the past few days (and done a bit of reading around on KPIs and other business metrics). The simplest way I think the query can be expressed is a table/spreadsheet-style layout. Each row is a customer, and their order totals appear in the columns. This can obviously be read and interrogated line by line, but I'm thinking that if I apply a little conditional formatting to each cell, it'll look like a "heat map" of sorts if I zoom out far enough. I'm a little stuck on how to do the calculations behind the layout. I can't run portal rows horizontally. I could use the List function to return a list of related order records, but that'll give me a single text result, rather than the individual totals that I could apply conditional formatting to. Possibly populate a repeating field with the related data?
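The pivot described above (one row per customer, one column per order, each cell given a coarse conditional-formatting state) can be sketched like this. The bucket thresholds are invented purely for illustration; the real conditional-formatting states would be whatever suits the data.

```python
# Sketch: pivot a customer's chronological order totals into a fixed
# number of columns, assigning each cell a coarse "heat" bucket in the
# way the conditional formatting would. Thresholds are hypothetical.

def heat_bucket(total):
    if total is None:
        return ""          # empty cell: customer has fewer orders than columns
    if total < 50:
        return "cold"
    if total < 200:
        return "warm"
    return "hot"

def pivot_row(order_totals, columns=10):
    padded = (order_totals + [None] * columns)[:columns]
    return [heat_bucket(t) for t in padded]

print(pivot_row([10, 60, 250]))
```

Zoomed out, a grid of such rows reads as the heat map: runs of "cold" cells that never turn "warm" are exactly the customers in question.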
  4. Thanks Steve - you make a good point. We've got about 2,000 customers to consider if we take only orders from the past 2-3 years. I could arrange some kind of report listing by customer with the totals of all their orders arranged, perhaps with some conditional formatting to indicate whether each subsequent order is bigger or smaller than the baseline. I suspect it'll be a bit difficult to comprehend at a glance, though. The sort of finding that would be useful is if I can say something along the lines of "40% of customers who place a <$50 order go on to place a >$100 order within the next year". From a business perspective, the problem is that quite a lot of decisions about premises and staffing are being dictated by the need to service these very small orders. They'd be tolerable as "loss leaders", but not if that's all those customers ever spend. Retooling to handle more volume of small orders would be possible, but it's hard to justify the spend and effort when there's other business to chase that's much more profitable. The situation you had with the homeowner is a good example of how the process should work. I'm guessing you made a good impression and did the job well. Crucially, the homeowner had other work needing done in future. What I'm wondering is if the customers placing <$50 orders are never going to order anything larger. It'd be like doing lots of bitty maintenance jobs in the hope of being asked back for a major remodel, only to find that the customers are tenants rather than homeowners, and only ever likely to pay for the bare minimum. (Amazon, BTW, barely break even on their retail operations; it's web services that bring in the profits. I suspect shipping even the smallest orders is part of the process of training customers to go straight to the Amazon site without even considering anywhere else.)
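A metric of the kind quoted above ("X% of customers who place a <$50 order go on to place a >$100 order within the next year") could be computed along these lines. The data shape, a list of (days since first order, order total) pairs per customer, is a hypothetical stand-in for whatever the FileMaker query would produce; the thresholds are the ones from the example.

```python
# Sketch: share of customers whose first order was under `small` and who
# placed an order over `large` within `window_days` of it.
# Each customer is a list of (days_since_first_order, total) pairs,
# first order first. Data shape is hypothetical.

def small_start_conversion(customers, small=50, large=100, window_days=365):
    qualifying = [c for c in customers if c[0][1] < small]
    converted = [
        c for c in qualifying
        if any(days <= window_days and total > large for days, total in c[1:])
    ]
    return len(converted) / len(qualifying) if qualifying else 0.0

history = [
    [(0, 20), (90, 150)],   # small start, converts within the year
    [(0, 30), (400, 500)],  # converts, but outside the window
    [(0, 45), (30, 40)],    # never converts
    [(0, 200)],             # large first order: not in the cohort
]
print(small_start_conversion(history))
```

Working in absolute brackets like this sidesteps the percentage-skew worry: a $10 order followed by a $30 order simply never crosses the $100 line, however large the percentage increase.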
  5. It's the first time in a while that I've had something to really get my teeth into with FileMaker. I'm needing to run some analysis on an invoicing database. The basic structure of the database is pretty much as you'd expect, with a table each for customers, orders, and line items on those orders. This is essentially the conversation that gave rise to the task: "Small orders don't bring in much money and are a lot of hassle. Let's impose a minimum order limit or find ways of discouraging these kinds of orders." "Ah, but customers start off with small orders, and come back to us with larger ones later." So what I'm essentially trying to find is, I guess, a way of identifying a trend. I was wondering about comparing a customer's average order size with their first order. That seems like it might be straightforward to implement, but I'm not sure if there are better ways to represent the trend. Also, I'm thinking that percentages won't be much use here; someone starting with a $10 order and coming back with a $30 order would be a 200% increase, but both would be in the "too small to be worth the hassle" bracket. The hope is that there are people starting with $10 orders who have gone on to spend hundreds or thousands. I am a complete novice when it comes to statistics, so I want to make sure I'm not inadvertently skewing the picture. We have roughly 2,000 customers to get through, so I'd far prefer something that isn't the manual approach. Any recommendations for how best to tackle it?
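One way round the percentage problem described above is to classify orders into absolute value brackets and look at each customer's trajectory from first order to largest order. A minimal sketch, with entirely invented bracket boundaries:

```python
# Sketch: classify customers by absolute order-value brackets rather than
# percentage growth, so a $10 -> $30 jump (a 200% increase) still reads
# as "too small". Bracket boundaries are hypothetical.

BRACKETS = [(0, "too small"), (50, "small"), (100, "medium"), (500, "large")]

def bracket(total):
    label = BRACKETS[0][1]
    for floor, name in BRACKETS:
        if total >= floor:
            label = name
    return label

def customer_trajectory(order_totals):
    """Return (bracket of first order, bracket of largest order)."""
    return (bracket(order_totals[0]), bracket(max(order_totals)))

print(customer_trajectory([10, 30]))    # both orders stay "too small"
print(customer_trajectory([10, 800]))   # small start, grows to "large"
```

Counting customers per (start, peak) pair then gives a simple cross-tab: the hoped-for "$10 starters who went on to spend thousands" show up as the ("too small", "large") cell.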
  6. I've thought a bit more about this (rather than just jumping straight in, for once). I reckon I'll need a join table between Items and Products, since Items can appear in multiple Product offerings (i.e. as part of a package, or as a single line item themselves). Does that make sense?
  7. At the moment, I'm trying to tidy up some of the inventory databases that my company has spawned, and ideally have one solution covering everything. (At the moment, data is spread over a few different standalone files that have been made up ad hoc for specific purposes.) I've spent a bit of time working out what the users are going to need, and how we can re-use the existing data within a new structure. What's evolved is essentially a three-tier table structure, as follows:

1 - Products: This table holds the products and packages that are offered to customers and invoiced for. The users are typically picking products (via a lookup portal) and adding them to invoices.

2 - Items: This table has one record for each type/model of item that we own (irrespective of the quantity). It's based on actual physical items rather than sales packages.

3 - Pieces: This table holds one record for each individual physical item. It'll contain test results and manufacturers' serial numbers for equipment that requires it.

The theory is that users would continue picking products as they do at present. Each product would link to one or more items, and the count of related pieces would be used to calculate availability and highlight shortages, etc. Does this sound like a reasonable enough plan? I'm not sure I can think of any better way of organising it. It's tempting to do away with the Products table entirely, but the problem is that what appears on a customer invoice is not what is actually handed out; they get a single line item saying "Neat package" with a price, and that package can comprise two or more Items (the names of which would be gobbledegook to many customers). Further complicating matters is the fact that Items can be used as part of a package, or invoiced individually if a customer chooses to take just that item alone. Of course, one item going out individually would prevent a package from being built with it. Are there any neater ways of doing what I'm trying to achieve? Or can anyone spot any pitfalls that I'm about to blunder into?
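The three-tier structure above, with a join table between Products and Items, can be sketched on toy data. All the table and field names here are illustrative, not the actual FileMaker schema; the point is that a package's availability is limited by its scarcest constituent item, counted from its individual Pieces.

```python
# Sketch of the three-tier model: Products -> (join) -> Items -> Pieces.
# Availability of a product is the minimum piece count across its items.
# Names and data are hypothetical.

pieces = {            # item_id -> number of physical pieces on the shelf
    "ITEM-A": 4,
    "ITEM-B": 1,
    "ITEM-C": 7,
}

product_items = {     # join table: product -> constituent item ids
    "Neat package": ["ITEM-A", "ITEM-B"],
    "ITEM-C solo":  ["ITEM-C"],
}

def availability(product):
    """How many complete instances of this product could go out right now."""
    return min(pieces[item] for item in product_items[product])

print(availability("Neat package"))  # -> 1 (limited by ITEM-B)
```

This also captures the "one item going out individually prevents a package being built" point: decrementing ITEM-B's piece count to zero drops "Neat package" availability to zero as well.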
  8. One of my Covid-lockdown projects is adding some inventory control to our FileMaker solution. We started on this some time ago, and most of our rental items have got barcoded ID tags. The management of that has lagged somewhat behind. We have wireless barcode scanners, and one of the tasks I have in mind is verifying that items listed in the database exist in real life. (I suspect that some labels have been put on without records being created.) In theory, it shouldn't be difficult for someone to work their way along a bay of shelves blipping barcodes whilst a looping script runs in FM. The barcode reader itself beeps to let the user know that a barcode has been scanned successfully. What I need is some kind of audible indication from FileMaker that the item scanned has a matching record, and an alternative indication when it doesn't. The Beep script step is the obvious one to use, but it only gives one sound. Even a high- and low-pitched beep would be enough to do what I want. The other option seems to be playing sounds in a container field, but this seems a bit cumbersome for the application. The best alternative I can come up with is using multiple Beep steps with pauses inserted in between, so we get a quick double-beep for a positive scan and multiple slower beeps indicating a problem. Are there any other approaches that I've missed?
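The double-beep / slow-triple-beep idea boils down to a lookup followed by a pattern choice. A minimal sketch (in FileMaker this would be Beep and pause script steps inside the scan loop; here the pattern is just returned as data, and the barcode set is invented):

```python
# Sketch: the scan loop looks up the barcode and picks a beep pattern.
# Pattern is (beep_count, gap_seconds); values are hypothetical.

known_barcodes = {"RENT-0001", "RENT-0002"}   # stand-in for the FM find

def scan_feedback(barcode):
    if barcode in known_barcodes:
        return (2, 0.1)   # quick double-beep: matching record found
    return (3, 0.5)       # slower triple-beep: no matching record

print(scan_feedback("RENT-0001"))
print(scan_feedback("RENT-9999"))
```

Keeping the "found" pattern short and the "problem" pattern long and slow matters in practice: the operator only has to stop scanning when the sound is obviously different.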
  9. Just returning to this thread with an update. It all seems to have gone swimmingly. There were actually two relationships that needed "fixed" in this way, and everything seems to be working as it should on the new serialised match field. Thanks again to comment for his input. Using "Replace Field Contents" saved me some scripting which would have had no long-term benefits.
  10. That's great - thanks. Good to know I'm not barking up the wrong tree, and I'll definitely have a backup before kicking off the process. Apologies for the font - I'd written up the post in Word whilst offline, then copied and pasted over. Slightly ironic since one of my other housekeeping tasks at the moment is adding TextFormatRemove(self) to some of my fields to stop my users from causing similar problems...
  11. I'm using some of the current downtime to tidy up some of the murkier corners of my FileMaker solution. One of the problems in my crosshairs is a badly chosen match field that I want to replace. The relationship in question is between a catalogue table (a list of all the stock items that we have available) and an Order_items table, which holds the line items on customer invoices. Foolishly, I chose to use a website URL as the match field for the relationship between the two. At the time, every item in the catalogue had a matching page on a website, and the URL was by definition unique. It seemed like a good shortcut at the time, but of course we've since needed to add some items to the catalogue which don't have a website page. Worse, some website URLs have been updated, and we can't change the URL field in the catalogue table without breaking all of the existing relationships. I'm planning on changing over to a simple serial number for each catalogue item, independent of what's going on in the URL field. I'm just wanting to double-check that I'm not missing anything with the procedure to make this happen. Here's my plan:

1) Create the new Serial field in the catalogue table, and populate it in the existing records with serial numbers (and make sure the auto-entry is set correctly for future additions).

2) In the Order_items table, create a matching Serial field, and insert a calculated result into each record, pulling the number through from the catalogue record using the existing relationship.

3) Change the relationship to be based on the Serial fields rather than the URL.

I'm reasonably certain that this will work properly, but I'm keen to avoid any potential pitfalls. Or is there a better way of doing this?
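The three-step plan can be simulated on toy data to check the logic holds. This is only a sketch with hypothetical records; in FileMaker, step 2 is the Replace Field Contents with a calculated result, pulling through the still-intact URL relationship.

```python
# Sketch of the re-keying plan: assign serials in the catalogue, pull
# them into Order_items via the existing URL match, then treat Serial as
# the new key. All records here are invented.

catalogue = [
    {"url": "https://example.com/a", "serial": None},
    {"url": "https://example.com/b", "serial": None},
]
order_items = [
    {"url": "https://example.com/a", "serial": None},
    {"url": "https://example.com/b", "serial": None},
    {"url": "https://example.com/a", "serial": None},
]

# Step 1: populate serials in the catalogue (auto-enter covers new records).
for n, rec in enumerate(catalogue, start=1):
    rec["serial"] = n

# Step 2: pull each serial through the existing URL relationship.
url_to_serial = {rec["url"]: rec["serial"] for rec in catalogue}
for item in order_items:
    item["serial"] = url_to_serial[item["url"]]

# Step 3: the relationship now matches on Serial; URLs can change freely.
print([item["serial"] for item in order_items])  # -> [1, 2, 1]
```

The one ordering constraint the simulation makes obvious: step 2 must run while the URL relationship still resolves, i.e. before any URLs are edited or the relationship is re-pointed.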
  12. That's incredibly helpful - glad to know that I'm not the only one hitting this kind of challenge. (Although the other poster described it far more accurately than me) I've downloaded the demo file, and to be honest that will be enough to give to the administrator to experiment with. It'll certainly save a chunk of time compared to the manual method. I could probably find a way of integrating the capability into our main solution (and being able to automatically copy in the totals from the unreconciled orders) but it's overkill unless they are going to use the core functionality.
  13. I've had an interesting challenge thrown in my direction by one of our administrators. Part of their job is to reconcile credit card payments that we have received against orders that are in FileMaker. 98% of the transactions are simple to reconcile, but the remainder can be problematic. This is often caused by other staff not marking FM orders as having been paid, but nonetheless it still needs picked through. The problem is compounded by the way we receive card payments into our bank account; there are usually a number of payments lumped together for the day(s) in question. This makes it hard to spot unreconciled payments. If there is one individual payment of, say, £45 landing in the account, and an unmarked order worth £45, that's easy to marry up. Where it gets more complicated is when there are three or four unmarked orders, and all (or none) of them could be attributed to the one composite payment that has landed. What I'd like to provide for them is a calculation that shows how different combinations of totals would match with the payment. Here's a grossly simplified example:

Unmarked order 1: £5
Unmarked order 2: £10
Unmarked order 3: £15
Total received: £15

Anyone glancing at this can tell that the total received could have been made up of orders 1+2, or order 3 alone. It's much harder (and more time-consuming) to work it out manually when there are multiple larger, irregular numbers to consider. I'd like to automate this process for them if possible. I can think of a script that plods through each potential combination, but this is going to get exponentially more cumbersome and very time-consuming to write. I'm guessing that there must be a more elegant way of doing it, but it's well outside the areas I'm familiar with. I suppose it's essentially a maths problem, which is not one of my strengths! Can anyone suggest a good way to attack this?
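This is the classic subset-sum problem. For the handful of unmarked orders that typically accompany one composite payment, brute-force enumeration of the combinations is perfectly practical; a sketch using the worked example above:

```python
# Sketch: enumerate subsets of the unmarked order totals and report those
# summing to the payment received. Brute force is fine for a few orders;
# for large sets a dynamic-programming subset-sum approach would be needed.

from itertools import combinations

def matching_combinations(order_totals, payment):
    matches = []
    for r in range(1, len(order_totals) + 1):
        for combo in combinations(order_totals, r):
            if sum(combo) == payment:
                matches.append(combo)
    return matches

print(matching_combinations([5, 10, 15], 15))  # -> [(15,), (5, 10)]
```

One practical caveat: with n orders there are 2^n - 1 non-empty subsets, so this stays instant up to roughly twenty orders per payment but not far beyond. If the totals are in pounds and pence, comparing against values held in whole pence avoids floating-point equality problems.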
  14. The database that I maintain at the moment has a simple text field for "Order Notes", which is used for various miscellaneous notes that don't fit into any of the other fields. It's a bit of a "catch-all" receptacle. A couple of problems have emerged recently. One is that staff have been relying on notes that were written quite some time ago, treating them as "gospel" when in some cases they have been superseded. What I'd like to do is have an indication on the field of when the information was input. It wouldn't be hard to add a timestamp that is changed when the field is updated. The drawback I see with that approach is that every alteration, no matter how minor, updates the timestamp, so we could have a recent timestamp but old information, just because someone tapped the spacebar in the field. What I'm thinking about is storing each iteration of the Notes text as a separate record in a related table. There would still be only one notes field visible to the user, but the various versions could be checked through if required. It doesn't seem too complicated to have a script trigger when the information in the Notes field is committed, that copies it into a new related record. Then some form of portal to allow users to quickly flick through the versions. But I have a nagging feeling that this is a bit of a bodged, inelegant way to go about it - is there a more straightforward way of achieving this that I'm missing?
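The related-table versioning idea can be sketched as an append-only log: each commit of the Notes field adds a timestamped record instead of overwriting, and the visible note is simply the latest version. The structure and names are illustrative; the unchanged-text check is one way to address the "someone tapped the spacebar" worry by skipping no-op commits.

```python
# Sketch: append-only version log for the Notes field. Each real change
# creates a timestamped record; identical commits are ignored so trivial
# edits don't refresh the timestamp. Names are hypothetical.

from datetime import datetime

note_versions = []   # stands in for the related "NoteHistory" table

def commit_note(text):
    if note_versions and note_versions[-1]["text"] == text:
        return       # no-op commit: don't create a version
    note_versions.append({"text": text, "at": datetime.now()})

def current_note():
    return note_versions[-1]["text"] if note_versions else ""

commit_note("Deliver to side entrance")
commit_note("Deliver to side entrance")        # ignored: unchanged
commit_note("Use main entrance from March")
print(current_note(), len(note_versions))
```

The portal in FileMaker would then just show the history sorted by timestamp, newest first, which also answers the original question of when the current information was actually entered.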
  15. Thanks for that link - it seems like a very sensible way to tackle it. I'm imagining that the best way to proceed is to have a script set variables from the summary values in the filtered portals (using GetLayoutObjectAttribute) and use those to populate the summary table. Let's see how I get on...