
Leaderboard

Popular Content

Showing content with the highest reputation since 12/06/2024 in all areas

  1. LaRetta was one of the most fiercely loving and loyal friends I've ever had, despite never having had the pleasure of meeting her in person. I'm so blessed to have worked with her until the end. She was unashamedly opinionated and caring - about people, about justice, and about our craft. She sent me this passage from Barbara Kingsolver late last year: And she followed it with: I love you LaRetta, and miss you dearly. Guess I'm a little hokey too. ❤️ Your friend, Josh
    2 points
  2. Here is another way you could do this. It uses conditional formatting to identify the exact items that cause the conflict with other orders. Again, more work is required if you want to see exactly which overlapping order has the offending item, but it might not be worth the trouble. BookingItemsConflict.fmp12
    2 points
  3. Is it safe to assume the context is an Order with a portal to LineItems? If so, I would do something like the attached. Re your calculation, I don't understand the logic it attempts to implement. And I don't think it can be done by calculation alone. Depending on the format in which you need this, there may be a much simpler alternative: sort your line items by ProductID and use a summary field that counts them with restart. This will work if you print your orders from the LineItems (as you should), as well as in a sorted portal. SimilarChildrenNumerator.fmp12
    2 points
  4. Databases work best with regular structures. That's not to say that an inconsistency such as this cannot be accommodated, but it won't be ideal. A lot depends on what you actually intend to produce out of the data entered. I would probably opt for a "star" join table of Roles joining Staff (or StaffAssignments) to both Districts and Schools. And if a role can apply to multiple schools in a single district, I would consider using a checkbox field to select the applicable schools - provided that it wouldn't conflict with some reporting ability you may want to provide. Please note that we are discussing an ERD, not the relationships graph. That will be a whole other issue.
    1 point
  5. Just for fun, you could also do:
     While ( [
       values = Substitute ( Yourfield ; ", " ; ¶ ) ;
       n = ValueCount ( values ) ;
       result = ""
     ] ;
       n ;
     [
       result = GetValue ( values ; n ) & Choose ( Mod ( n ; 2 ) ; ¶ ; ", " ) & result ;
       n = n - 1
     ] ;
       result
     )
     But this one will have a trailing carriage return, unless you change the output line to something like: Left ( result ; Length ( result ) - 1 ) or: Substitute ( result & ¶ ; "¶¶" ; "" )
    1 point
  6. To find only the exact value of "Ice" you can make your script do:
     ...
     Enter Find Mode [ ]
     Set Field [ YourTable::YourField; "\"¶" & $searchValue & "¶\"" ]
     Perform Find [ ]
     where the $searchValue variable contains the text "Ice". But again, this is actually looking for "¶Ice¶" (the search value surrounded by returns) and will not find records where "Ice" is the first or the last value (without a trailing return). If your values are sorted by the SortValues() function, then you already have a trailing return and you only need to add a leading one. There is no need for double returns. Yes. The list field will function as a multikey, which means any single value will be matched: https://help.claris.com/en/pro-help/content/creating-relationships.html?Highlight=multikey Yes, if you want to have the global field in the same table (it can be in any table).
    1 point
  7. Gee, I wonder why no one has thought of that in 4 years. Let alone explained exactly how to migrate the data to a normalized structure...
    1 point
  8. I don't understand your question, especially this part: It seems you have a parent-child relationship, with DeliveryNotes being the child table. In such an arrangement, there should be a foreign key field in the child table holding the parent record's unique ID value. There should be no fields in the parent table that refer to the parent's children. To display the parent's children on the parent layout you would use a portal. This is the standard practice for a parent-child relationship. If you have some sort of special requirement that isn't covered by this, then please explain in more detail. For example, if you wish to isolate a specific child record, you could click on it in the portal (thereby setting a global field or variable to its unique ID) and then use a 2nd relationship or a filtered one-row portal to display it. P.S. Please use the standard font when posting. P.P.S. Please update your profile to reflect your version and OS so that we know what you can use.
    1 point
  9. Yes. Every script has its own parameter that must be passed to it explicitly when the script is called. To pass its own parameter to a subscript, the calling script would need to do: Perform Script [ “abc”; Parameter: Get ( ScriptParameter ) ] Keep in mind that a subscript does not "know" it's a subscript. It runs independently and does not inherit anything from the calling script.
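     A minimal sketch of the handoff (the script name “abc” comes from the example above; $param is a hypothetical variable inside the subscript):
     # In the calling script:
     Perform Script [ “abc”; Parameter: Get ( ScriptParameter ) ]
     # Inside “abc”, read the parameter as usual:
     Set Variable [ $param ; Value: Get ( ScriptParameter ) ]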
    1 point
  10. This is a little too much to take in all at once. Consider simplifying the issue and/or breaking it up into individual points. Speaking in general, the script parameter remains constant throughout the life of the script, while a script variable can be defined and redefined at any point. If you want to export a file using a variable as the file's name, and the file's name is supposed to contain the count of exported records, then of course you will want to define that variable as close to the export as possible and not before any event that can modify the current found set. HTH.
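     As a hedged sketch of that last point (the path, file name and export format are assumptions, not from the original thread):
     # Define the name only after the found set is final, right before exporting:
     Set Variable [ $path ; Value: Get ( DocumentsPath ) & "Export_" & Get ( FoundCount ) & "_records.csv" ]
     Export Records [ With dialog: Off ; "$path" ]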
    1 point
  11. In MBS FileMaker Plugin 15.3 we have a Format button on macOS for the data viewer's detail view. If the data is XML or JSON, we can use the format and colorize functions: JSON.Format & JSON.Colorize, XML.Format & XML.Colorize. Let's say you have some variables in the data viewer with XML or JSON content. When you double click the text, you get a new window showing the detail. Here we find the Format button added by MBS Plugin. Press the button and it will format the content: If we get a parse error or the content is not XML/JSON, we beep. If you click OK, the formatted text is stored in the variable. If you press cancel, the unformatted text stays in the variable. Please try the feature and let us know what you think. Available for macOS in MBS FileMaker Plugin 15.3.
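     As a side note, the underlying plugin functions can also be called from a script rather than via the data viewer button; a minimal hedged sketch ($json is a hypothetical variable; check the MBS documentation for the exact parameters):
     Set Variable [ $pretty ; Value: MBS ( "JSON.Format" ; $json ) ]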
    1 point
  12. Just to comment on this line of thinking: “I have a script that I would like to use in multiple "similar" situations”. Be aware that what seems economical can lead to highly conditional scripts that are difficult to maintain or change without risk. Separate scripts without much indirection or abstraction are easier to maintain and troubleshoot. Consider the trade-off you’re making.
    1 point
  13. You cannot use a variable to specify the field in the Edit Find Request window. But a find request can also be constructed by setting fields in Find mode - and then you can use the script parameter to select which field to use, similar to what we discussed only recently here: https://fmforums.com/topic/110860-script-problem-remove-container-file-from-a-record-trying-to-transition-to-json-to-pass-multiple-parameters/#findComment-494081
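     A minimal sketch of that approach, assuming the script parameter carries the fully qualified field name and $searchValue is a hypothetical variable holding the find criteria:
     Enter Find Mode [ Pause: Off ]
     Set Field By Name [ Get ( ScriptParameter ) ; $searchValue ]
     Perform Find [ ]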
    1 point
  14. As you can see from the release notes, folders for custom functions have been implemented: https://help.claris.com/en/pro-release-notes/content/index.html
    1 point
  15. https://www.fmcomparison.com
    1 point
  16. There are various tools that can compare DDRs, such as:
     BaseElements https://baseelements.com/
     CrossCheck http://www.fm-crosscheck.com
     FMDiff http://fmdiff.com/
     FMperception https://www.fmperception.com/
     InspectorPro https://www.beezwax.net/products/inspectorpro-8
     I have no recommendation to make since I don't use any of them. You can also use general tools for comparing XML documents (either the DDR or the output of Save a Copy as XML). The result may be more difficult to read, but it could be all you need for a one-time task such as you describe. Oh, and for fields only, you could simply compare the results of ExecuteSQL() querying the FileMaker_BaseTableFields table, as sketched below.
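     For the ExecuteSQL() option mentioned above, a minimal sketch (the column names are my recollection of the system table and worth verifying against your FileMaker version):
     ExecuteSQL ( "SELECT BaseTableName, FieldName, FieldType FROM FileMaker_BaseTableFields ORDER BY BaseTableName, FieldName" ; Char ( 9 ) ; ¶ )
     Running this in each file and comparing the two result lists would show the field-level differences.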
    1 point
  17. I don't know if that's a good analogy to your situation. Anyway the answer here is yes, at least WRT interference. They would simply login as different users and their privilege set would deny them access to any records other than those tagged by their account name. Then it's up to the developer to prevent situations where a bunch of records labeled <<no access>> would crop up - such as replacing the Show All Records command with a bogus find (any find will automatically omit records for which the user has no access). And there may be other details to consider e.g. serial numbering of records. Maybe not: https://support.claris.com/s/article/New-FileMaker-data-migration-tool?language=en_US
    1 point
  18. Actually, it's quite the opposite: you would have a List layout to show a bunch of records (typically non-editable). Then you use a popover or a card window to drill into a specific record for more details and/or editing. It's functionally similar to a list-detail layout, but you have more space to work with, since the detail temporarily conceals the list.
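     As a hedged illustration, the drill-in button on a list row can be as simple as this (the layout name is hypothetical; the card inherits the list's found set and current record):
     New Window [ Style: Card ; Using layout: "Record Detail" ]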
    1 point
  19. Yes, that's exactly what I meant. You are not alone in that wish; it's been suggested often (usually as a type of layout part). As a side note: while I often switch between form and table view in solutions for my own use, I would almost always create separate layouts in solutions designed for others. So there would be a button for "detailed view" on the list layout and a "back to list" button on the form layout. Also keep in mind the possibilities offered by list-detail layouts, popovers and card windows.
    1 point
  20. When I read this, I was taken aback: I thought surely a button cannot override the layout setup and allow access to a view which the developer has disabled?? But you are right, it does do exactly that. I consider this a bug. I think you have no choice other than to customize the bar for its specific layout. It's not like we have the option to share objects across layouts anyway.
    1 point
  21. That's not going to work. Exactly. A button activates only by tabbing into it. Again, your hunch is correct. You don't need to add the selected button's object name to the script parameter. In fact, the buttons do not need to have object names at all (at least not for this). You only need to add a recognizable value to the script parameter of each button. It could be as simple as 1, 2 and 3 or perhaps something more explicit - say "current", "found" and "all". Then extract this value from the script parameter and use it to branch your script.
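     A minimal sketch of that branching, assuming the three parameters are "current", "found" and "all":
     If [ Get ( ScriptParameter ) = "current" ]
        # act on the current record
     Else If [ Get ( ScriptParameter ) = "found" ]
        # act on the found set
     Else If [ Get ( ScriptParameter ) = "all" ]
        # act on all records
     End If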
    1 point
  22. The simple method is to open a new window, isolate the current record and do the export. Then close that window to return to the original found set. Alternatively, you could switch to a layout that has the fields you want in the order you want them and do Save Records as Excel from there. But if that's all such a layout would be used for, it's hardly worth the effort.
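     A hedged sketch of the new-window method (the export step, path variable and format are assumptions):
     New Window [ Style: Document ]
     # Isolate the current record in the new window:
     Show All Records
     Omit Record
     Show Omitted Only
     Export Records [ With dialog: Off ; "$path" ]
     Close Window [ Current Window ]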
    1 point
  23. From what I can see, the problem is that you are passing the container field's value instead of its name. Try defining the script parameter along the lines of:
     JSONSetElement ( "" ;
       [ "container_field_name" ; GetFieldName ( document::document_file ) ; JSONString ] ;
       [ "container_file_name" ; document::document_filename ; JSONString ]
     )
     You could also get by with:
     JSONSetElement ( "" ;
       [ "container_field_name" ; "document::document_file" ; JSONString ] ;
       [ "container_file_name" ; document::document_filename ; JSONString ]
     )
     but this would break if you renamed the container field.
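     On the receiving side, the script would read the two values back out of its parameter, e.g. (the variable names here are hypothetical):
     Set Variable [ $fieldName ; Value: JSONGetElement ( Get ( ScriptParameter ) ; "container_field_name" ) ]
     Set Variable [ $fileName ; Value: JSONGetElement ( Get ( ScriptParameter ) ; "container_file_name" ) ]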
    1 point
  24. Just a random thought: Whether a field is optimized for static or interactive content is a matter of formatting the specific instance of the field on a specific layout. You could have two separate layouts showing the same field optimized differently. Or two different instances of the same field on different panels of a tab/slide control. Or even just hiding one of them conditionally.
    1 point
  25. Not really. Your formula for constructing the JSON is correct and if the referenced field contains the text "Active" you should be getting the result you expect. Is it possible that the two tests were performed from different records?
    1 point
  26. It won't happen in the 1st file until you populate the Data__lxn field in the newly created child records. It will happen in the 2nd file, but only after you commit the record (that's a good thing: you don't want portal records to fly up and down while you're still working on them).
    1 point
  27. For the next version of the MBS FileMaker Plugin, 15.3, we add the Window.SetRoundCorners function to provide round corners. At the recent Vienna Calling conference a developer asked if we can get the edges of a card in FileMaker to be round. And yes, that is indeed possible. Once the card is shown, the MBS Plugin can find the card window and apply round corners to it. This even works on Windows. This seems to work fine in FileMaker Pro on macOS and Windows. It does not, of course, work for WebDirect or FileMaker Go. To add the round corners, you simply call our plugin function Window.SetRoundCorners just after showing the card. The plugin finds the front window and applies them. Here is an example - show a card with round corners:
     New Window [ Style: Card ; Name: "Card" ; Using layout: “Tabelle” ; Height: 400 ; Width: 600 ]
     Set Variable [ $r ; Value: MBS("Window.SetRoundCorners"; 0; 12) ]
     Please try with the 15.3 plugin and let us know how well it works for you.
    1 point
  28. This is a very simple arrangement. The left-most portal, where you select the category, is a portal that shows records from the current table (Category) - a.k.a. a list-detail layout: https://help.claris.com/en/pro-help/content/creating-portals-list-detail.html Selecting a category in this portal causes the corresponding record to become the current record. And the portal to the Product table shows only records that are related to the current record.
    1 point
  29. 🕯️ I was informed today of the passing of @LaRetta this past February. Thank you, LaRetta, for the many years of sage wisdom and insights you gave our community. You will be missed!
    1 point
  30. Sad news to hear. She was kind, sharp as a tack and very funny. I was fortunate to have worked with her. She will be missed.
    1 point
  31. This is indeed a great loss to the FM community. No one can equal her sharp eye for mistakes and her ability to pull a great idea out of a bucket of mediocre ones. Above all, her good spirits and great sense of humor made it a pleasure to collaborate with her. It was a privilege to know her.
    1 point
  32. This is the second time you are posting a comparison between XSLT 1.0 and XSLT 2.0 and 3.0, and just like the first time it is full of inaccurate and false statements. This is absolutely and unequivocally wrong. XSLT 1.0 recognizes the following data types, defined in the XPath 1.0 specification: node-set, boolean, number and string. The XSLT 1.0 specification adds result tree fragment as another data type (although it's no more than a special case of node-set). True, in XSLT 2.0 there are more data types - most notably date, time and dateTime. But that doesn't mean you cannot "perform real (as opposed to what?) arithmetic, date calculations, and type validation" in XSLT 1.0. There is only one Muenchian method - and whether it's a "convoluted workaround" is a matter of opinion. True, XSLT 2.0 introduced built-in grouping, which is often more convenient. Often, but not always. Technically, it's true. But if you are running FileMaker Pro 2024 or later, you already can produce multiple outputs in a single transformation, because the libxslt processor supports both the EXSLT exsl:document extension element and the multiple output documents method added in the XSLT 1.1 specification. This is true. But it is also true that a named template is not much different from a user-defined function. And again, the libxslt processor introduced in FMP 2024 does support defining custom functions using the EXSLT extensions. Not really. A "sequence" is just an expansion of the "node-set" concept to allow items other than a node and duplicate items. Hardly a "paradigm shift" - and certainly XSLT 1.0 is also a functional programming language. Here is my conclusion: As I said in the previous round, there are very few things you cannot accomplish using XSLT 1.0 - esp. with the rich support of extensions that the built-in processor provides (see my post here). The most important point remains the question of performing the transformation as an integral part of importing and exporting records. Currently that's only possible with the built-in XSLT 1.0 processor (please correct me if I am wrong on this).
    1 point
  33. It wouldn't have worked even with an indexed field, because a value list based on a field will never include a blank value. You could define a calculation field in the related table that combines the "real" value with a placeholder, for example: List ( Valuefield ; " " ) Then define the value list to use values from this calculation field and (if necessary) make the target field auto-enter a calculated value substituting 3 consecutive spaces with nothing.
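     The auto-enter calculation for the target field could then be simply (assuming the placeholder is three spaces, as described):
     Substitute ( Self ; "   " ; "" )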
    1 point
  34. Well, yes - if you are joining 2 tables, and both tables have an id field, then you must specify from which table the values of id should be fetched, e.g. SELECT \"ESQL_Cities|Sel\".id instead of just: SELECT id As an aside, I doubt ExecuteSQL() is a good tool to use here. Do a search for "dwindling value list" to learn some FM native methods.
    1 point
  35. I suggest you take a look at the attached file (which is your original file reduced to the minimum, plus my suggestion). Do note that there are two fields stacked inside the portal. ActiveStaff.fmp12 That's a problem. If the field is formatted to use a value list showing only active staff, it cannot be used to select inactive staff. You will have to find a way to use another instance of the field in Find mode. Or switch to a completely different selection method (e.g. a card window). Here is a rather simple one: ActiveStaff+Find.fmp12
    1 point
  36. The space character is the character immediately after the last hyphen in the text. Suppose the text were just: "a - b". The length of this text is 5, and the position of the hyphen is 3. 5 - 3 = 2, so your expression: Right ( db ; len - lastSpace ) will return the last two characters of the text, i.e. " b". It depends. If the hyphen separator is always followed by a space, you might simply subtract it: Right ( db ; len - lastSpace - 1 ) A better solution would look for the position of the entire separator pattern " - " (a hyphen surrounded by spaces) and do: Let ( [ len = Length ( db ) ; lastSeparator = Position ( db ; " - " ; len ; -1 ) ] ; Right ( db ; len - lastSeparator - 2 ) ) This would allow you to correctly extract "Carter-Brown" from "Smith - Jones - Carter-Brown". If you cannot be sure the space/s will always be there, you may use Trim() on the result (this is assuming the extracted portion will not contain any significant whitespace characters).
    1 point
  37. That's normal if that was the case when you did your original sync with that database --Jesse Barnum
    1 point
  38. This list is full of inaccurate, even downright false statements. For example, both xsl:key and xsl:message are available in XSLT 1.0. Not to belittle the advantages of XSLT 2.0 and 3.0, it needs to be stated that XSLT 1.0 is Turing-complete - which means it can produce any output that depends solely on the input. True, some operations - such as grouping - are easier to perform in XSLT 2.0+, but that's just a matter of convenience. If I had to point out the main advantages of the later versions, I would focus on:
     Dates: XSLT 2.0+ has dedicated functions to handle dates, times and dateTimes (what we call timestamps in FM), including the ability to generate the current date and time.
     Random: XSLT 2.0+ can generate random numbers. The XSLT 3.0 random generator is especially powerful.
     RegEx: XSLT 2.0+ supports processing text using Regular Expressions.
     JSON: XSLT 3.0 can both parse JSON input data and produce JSON output.
     Still, even with this in mind, it needs to be pointed out that many XSLT 1.0 processors support extensions that enhance their capabilities beyond pure XSLT 1.0. The processor embedded in FMP has always allowed producing the current date and time or generating a random number, as well as other goodies. And now, if you are using the latest versions of FMP, you also get access to a wide array of functions that manipulate dates. So really it's back to a question of convenience. The crucial point here, IMHO, is this: as a database developer, your interest in XSLT is purely for input and output. So I'll be watching the next installment to see if it provides a way to replace the embedded processor during import and export. If not, then there is very little attraction to having this available in a plugin. You can always do as I have done for a long time now and use the standalone Saxon processor from the command line.
    1 point
  39. I see two problems with your request: First, if you have a user named Smith and another user named Smithson there is no way to know when the user enters "the last letter of their login name" unless you already know who is trying to login (in which case why bother with a login procedure?). The other problem is much more serious: it seems you are designing your own "login" procedure that works after the user is already logged in somehow (possibly as a guest?). This is an extremely bad and dangerous practice - see here why:
    1 point
  40. Hello all, We're currently testing with FM Starting Point 24.0x5 USS and having trouble locating the best place to modify the default window size setting(s). Which script(s) should we be modifying to increase the default window size?
    1 point
  41. I am afraid we are not talking about the same thing. My suggestion is to divide the problem into two parts. In the first part you define a self-join relationship of the Orders table that identifies orders that overlap the current order's time span. This is easy to do using the existing, stored fields of the Orders table:
     Orders::DateIn ≤ Orders 2::DateOut
     and Orders::DateOut ≥ Orders 2::DateIn
     and Orders::OrderID ≠ Orders 2::OrderID
     The second part is to identify which of the overlapping orders are conflicting - i.e. share a product. This could be done in a number of ways, for example filtering the portal to Orders 2 by a calculation of:
     not IsEmpty ( FilterValues ( Orders::ProductIDs ; Orders 2::ProductIDs ) )
     where ProductIDs is an unstored calculation field = List ( LineItems::ProductID ). Any records displayed in such a filtered portal would be conflicting. You will have to make an additional effort to see exactly why they're conflicting, but perhaps it does not matter? Anyway, the idea is that the number of overlapping orders should be fairly small, so using an unstored calculation to find the conflicting ones among them should be sufficiently quick. Otherwise I see no choice but to move to a denormalized solution where the dates need to be replicated in the LineItems table, and you must take great care that this happens on every relevant layout, in every relevant scenario. --- Caveat: untested code.
    1 point
  42. What do you see when you select Manage > External Data Sources… ?
    1 point
  43. I need to correct something I wrote earlier: This implies that in normal circumstances the two tests should return the same result and that the only difference is the unnecessary complexity added by using PatternCount() instead of a direct comparison. That is not the case. Let's assume that both of the compared fields are calculation fields returning a result of type Date. And that the date format used by the file is m/d/y, with no leading zero for the month. Now, let's have an example where DateA is Feb 2, 2025 and DateB is Dec 2, 2025. These two are different dates and if the comparison is performed in the date domain: DateB = DateA the result will be False. But the suggested comparison: PatternCount ( DateB ; DateA ) will start by converting the dates to Text, and then: PatternCount ( "12/2/2025" ; "2/2/2025" ) will return 1 (True). In addition to a false positive, it is also possible to get a false negative if one or both of the fields contains user-entered data which may or may not have leading zeros.
    1 point
  44. That's actually wrong. You may not notice it's wrong if your date is never a Sunday, but when the date is a Sunday your formula will return the date of the following Monday - i.e. the starting day of the next week. The correct formula to use would be: date - DayOfWeek ( date - 1 ) + 1 I don't see that I made any suggestion regarding portal filtering - other than to warn you that it will get slow as your number of records increases. I think that could be simplified to: IsEmpty ( Employee::gFilter ) or Time::Week_Start = Employee::gFilter If that doesn't work the same way for you, then there is something wrong with the data in one (or both) of the fields.
    1 point
  45. I don't know (I am currently stuck at v.18). But I wouldn't be surprised if it's still the same.
    1 point
  46. I see the same thing (in version 18). This is apparently a bug. But the solution is simple: do not go back to the script. And if you do, do not click OK. Or switch it back to 'File' before clicking OK. Or do not save the script changes.
    1 point
  47. I am not sure what exactly you are asking or what to look at in the attached file. From what I can see, the JSON in the GRANT::JSON field in the 4th record of your file is properly formatted - at least by the rules that FileMaker uses for formatting JSON (there is no official standard for this, and you may see various online formatters return different results). Well, Grant in your JSON is also an array. The keys of any array are the numerical indexes of the array's child elements. The District array is a grandchild of Grant, and you will see it listed if you look at JSONListKeys ( GRANT::JSON ; "Grant[0]" ) or JSONListKeys ( GRANT::JSON ; "Grant[1]" ) and so on.
    1 point
  48. You can construct the calculation along the lines of: List ( "image:Photos/" & ArtistName & ".jpg" ; "image:Photos/" & ArtistName & ".png" ) This will create both paths as a return-separated list. The container field will then go over this list and display the first image it can find.
    1 point
  49. That makes it more difficult, since you cannot use the UniqueValues() function. See if the attached demo makes sense. In the real implementation, the value list would be defined to use values from the Customers table, of course. And the body part of the report would be removed (I kept it in just to check that the results are correct). Likewise the sub-sorting by CustomerID. You could probably do something similar with ExecuteSQL(), but then you would have to think how to present the result. CountUniqueInGroup.fmp12
    1 point
  50. Part 1: Embracing the Development Mind-Set
     Software development isn’t magic. There isn’t a black box where you can throw in a bunch of ideas and requirements and out pops a smoothly working app that perfectly meets all of your business needs. Once you see that in print, it seems perfectly logical, but because the process is often hard to understand it feels like there is at least a little magic involved. In this seven-part series, we’ll pull back the covers and eXpose what you should know when you start the hunt for a professional developer for that custom, fix, or upgrade software project. We’ll offer tips to help you select the right developer, discuss pricing models, cover things to consider when signing an engagement contract, and walk you through the development process from idea to deployment. In this introduction, we offer a brief fundamental overview of what you should understand before you dive into any software development project engagement.
     Collaboration
     Development is a partnership between the client and the developer. You bring the knowledge of your business, your workflow and your needs. The developer brings the technical knowledge and software eXpertise. Both are equally necessary for development success. The developer should have a breadth of eXperience with various processes and technologies that will give you options to make your solution function smoothly within your workflow. But until you eXplain your business, your developer won’t know the intricacies of what you need. Even if you have an eXisting system, your developer still needs to know how you currently use it and how you wish you could use it.
     Knowledge
     From the developer’s perspective, it takes more than just a pile of papers, eXcel spreadsheets or a database to look at to understand your business flow. You are intimately familiar with how you do your job, often to the point where you could do it in your sleep. How things should work seems obvious to you. Rarely, if ever, will your developer be able to intuit the things you do by nature. You will have to eXplain it in great detail to make it clearly understood. This means that some things will need to be eXplained multiple times before the picture becomes clear. One strategy is to treat your developer like a new employee and teach your workflow step-by-step. You don’t have to teach all of the details of your entire business (unless the new application will manage the whole thing), but view the app like a job description and teach that job to your developer. That will make the functionality of the solution clear enough to represent the way you actually do business. It will also give your developer a foundation for making suggestions for improvement.
     Perspective
     When building a full database solution you will end up looking at your business processes with a fresh eye as they go under the microscope while you try to properly eXplain them to someone new. Using development as a springboard, it is common to find things you want to change as you go through your business details. Software solutions reflect the business processes they represent. If those processes are inefficient, simply moving them from a paper representation to a digital representation will not make the underlying processes more streamlined or efficient. A custom app can make a process easier to manage, but will not fundamentally change it. Knowing this can put into perspective the effect the new software will have. Taking time to analyze eXisting processes with a focus on ways to improve them is a very important part of the development process. As part of that process your developer can make suggestions for improvements in the efficiency of managing data based on their previous eXperience with data systems. It’s up to you to decide whether the suggestions that come from your collaboration make sense to incorporate.
     eXpectations
     Once the development process starts, the eXpectations on the developer can be a bit high. There is somewhat of an art to software development. Commonly, when a feature is described to a developer it seems straightforward and sounds conceptually easy. Then, when the developer begins to create that feature within the framework of the application, there are often nuances to the feature or its integration into the eXisting database structure that weren’t anticipated. In this case the development time can be longer than eXpected because implementation of the feature ends up being different and often more complex than planned. This can create frustration for the client because the feature seems so simple to eXplain or straightforward when done manually. You might say, “We always … ” but the truth is, there’s probably at least one eXception to your rule. The eXceptions are easy to handle on paper or verbally, but every eXception has to be coded into the final working product. eXceptions are generally complex because they branch away from the established flow. Translating a manual process into an automated electronic process is most often like a duck on water. There is a lot of work and complexity under the surface to make the feature effortless to use. That takes time to figure out and then create. The duck on water is the magic.
     Read Part 2 of 7: What Should You Consider When Selecting a Development Partner? What Questions Might You Ask a Potential Developer?
     The post Survival Guide: Find, Hire and Work with a Software Developer, Successfully! (Part 1 of 7) appeared first on eXcelisys. View the full article
    1 point
This leaderboard is set to Los Angeles/GMT-08:00