Bill of Materials

29 Mar 2019: Demo file has been updated to v2. See comments for details.

Editor’s note: Geoff Gerhard of Creative Solutions is one of those “quiet” developers who has been helpfully sharing his knowledge in online FileMaker forums for more than 20 years. I personally have benefited enormously from his expertise, and am pleased to feature his first appearance as a guest author here on FileMaker Hacks.

Demo file:

Bill Of Materials

Back when FileMaker 3 was new, I added inventory management to a FileMaker template that allowed an Assembly Item to be a collection of other Items, including other Assembly Items. The inclusion of other Assembly Items (which might also contain Assembly Items) means that a full Bill Of Materials (BoM) for a “parent” Assembly Item is best represented as a hierarchical list that clarifies the relationships between each component and its “ancestor” and “descendant” Items. This arrangement is often called an “exploded” or “multi-level” BoM that might look like this:

  • Parent
    • child
      • grandchild
      • grandchild
      • grandchild
    • child
      • grandchild
        • great grandchild
        • great grandchild
      • grandchild
      • grandchild
    • child
    • child…

Adding a parent Assembly Item to Inventory requires sufficient quantities of all descendant materials and Assembly Items, which means walking the entire multi-level BoM to ensure that sufficient quantities of each material are available to produce the required quantity of that material’s parent Assembly Item.
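The walk just described can be illustrated outside FileMaker with a short Python analog. The `bom` table, item names, and quantities below are invented for illustration and are not from the demo file:

```python
from collections import defaultdict

# Child rows per parent: (child_id, qty needed per one parent unit).
# "wheel" is itself an Assembly Item, so the walk must recurse into it.
bom = {
    "bike":  [("frame", 1), ("wheel", 2)],
    "wheel": [("rim", 1), ("spoke", 32)],
}

def explode(item, qty, totals):
    """Accumulate the total quantity of every descendant of `item`."""
    for child, per_unit in bom.get(item, []):
        needed = qty * per_unit
        totals[child] += needed
        explode(child, needed, totals)   # walk into sub-assemblies
    return totals

totals = explode("bike", 5, defaultdict(int))
# 5 bikes need 10 wheels, and each wheel takes 32 spokes, so 320 spokes
```

The key point the article makes is visible here: each row’s quantity is multiplied by its parent’s required quantity before recursing, so deep descendants accumulate the correct totals.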

The overhead in FM 3 was not trivial: multiple global fields, multiple relationships, and lots of similar-but-not-identical script steps were required to facilitate the process. The depth of the hierarchy was limited by the need for separate utility relationships, fields, and script steps at each descending level while walking the lists of materials and subassemblies required by a parent Assembly Item.

I recently had a chance to revisit the process for a client that makes products with deep multi-level BoMs. Because theirs is a just-in-time operation, they wanted to identify material requirements with two parameters: how many units of each material are needed, and when should they order each raw material (and begin work on each Assembly Item) to meet the scheduled completion date of the parent Item.

Some Assembly Items can be purchased ready-made, so we identify the “family tree” of Assembly Items that should be purchased when there are not enough days between the “Start On” and the “Targeted Completion” dates to produce the assembly in-house.

The example file uses $variables in place of global fields, SQL instead of multiple relationships, and a recursive script framework that can handle an unlimited number of generations in the multi-level BoM. As a relative newcomer to SQL queries in FMP, I was pleased to discover how much power they offer developers in situations like this. Three examples:

  1. They have no additional impact on the Relationship Graph.
  2. You can add prefix and suffix text strings to each value in each record/row returned by the query.
  3. You can (via JOIN) include data in the response rows that comes from an on-the-fly relationship that otherwise doesn’t exist on the RG.

I use a name/value pair method for passing and returning multiple parameters to and from scripts. I leveraged #2 in the SELECT segment of the ExecuteSQL function to produce a response that is a list of ready-to-evaluate name/value pair functions that will supply the relevant values and attributes of each matching record as the script progresses. I use that list as the Value parameter in the name/value pair function with the Name parameter “children”, and use the result (concatenated with other name/value pairs) as the script parameter passed to the “SubMakeBOMList” subscript. That subscript instantiates the list as $children, walks through it, instantiating one row of variables with each loop iteration, and recursively calls itself every time the targeted row is an Assembly Item.

The example includes a Days field that identifies the lead time to receive a purchase order for that item (or to produce an Assembly Item) once all descendant material and Assembly Items are on hand. It also includes a quantity (Qty) field that identifies the number of child units required to make one parent unit.

Processing each row updates values that track the cumulative number of Days and the total Qty needed (by multiplying the row’s Qty by the parent Qty). The subscript tracks the total iterations through the loop, and adds the current iteration number as a suffix to the Name parameter of each name/value pair. This is key to how Virtual List identifies the value for each record.

The subscript also tracks its current recursion level and includes that value in one of the name/value pairs, which is also relevant to the display of the Virtual List. The loop appends the updated and row-specific name/value pairs to the $array (as a ready-to-evaluate text string) until all rows in the $children variable are processed. It returns that list as the value of the “retArray” name/value pair, which is captured via script result and appended to the current $array variable.

When the last row of the parent Assembly Item’s $children variable is processed, $array contains every descendant row of the full multi-level BoM and is returned to the parent script for processing into $variables used by or displayed on the Virtual List records.
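As a rough illustration of what SubMakeBOMList does, here is a Python analog (with invented data and field names): each loop pass emits one row of values keyed by a running row number, the recursion level is recorded for display, and the function calls itself whenever the child is itself an assembly:

```python
# Child rows per parent: (child_id, qty per parent unit, lead days).
# "A2" is a sub-assembly; "M1" and "M2" are raw materials.
bom = {
    "A1": [("A2", 2, 3), ("M1", 4, 1)],
    "A2": [("M2", 5, 2)],
}

def make_bom_list(parent, parent_qty, level, array, counter):
    """Flatten the multi-level BoM into row-numbered name/value entries."""
    for child, qty, days in bom.get(parent, []):
        counter[0] += 1              # running row number across all recursion
        row = counter[0]
        array[f"id{row}"] = child
        array[f"qty{row}"] = qty * parent_qty   # total units needed
        array[f"days{row}"] = days
        array[f"level{row}"] = level            # drives display indentation
        if child in bom:             # child is itself an Assembly Item
            make_bom_list(child, qty * parent_qty, level + 1, array, counter)
    return array

array = make_bom_list("A1", 1, 1, {}, [0])
```

The row-number suffix on each name is what lets a Virtual List record N pick out its own values later, which mirrors the iteration-suffix scheme the article describes.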

Things that I think are interesting or noteworthy:

  • The script includes error trapping for circularity—a descendant Item’s BoM cannot include any of its ancestor Items—and shows a message that lists all ancestors and identifies the one that created the problem.
  • The script also identifies any Assembly Item (and its descendants) that, if produced in-house, would exceed the number of work days between the Start On and Targeted Completion dates, and displays them in a message.
  • The script result could be leveraged to aggregate the materials and assemblies by Start Date by adding another VL layout, or extended to include a Due Date.
  • The production of the ready-to-evaluate array of name/value pairs could, I think, be reduced to a recursive custom function.
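The circularity trap in the first bullet can be sketched by carrying the chain of ancestors down the recursion. This is hypothetical Python with invented data, not the demo file’s script logic:

```python
# An illegal cycle: C's BoM contains its own ancestor A.
bom = {"A": ["B"], "B": ["C"], "C": ["A"]}

def walk(item, ancestors=()):
    """Descend the BoM, failing loudly if an Item contains an ancestor."""
    if item in ancestors:
        chain = " > ".join(ancestors + (item,))
        raise ValueError(f"Circular BoM: {item} appears among its own ancestors ({chain})")
    for child in bom.get(item, []):
        walk(child, ancestors + (item,))

try:
    walk("A")
except ValueError as err:
    print(err)   # Circular BoM: A appears among its own ancestors (A > B > C > A)
```

Passing the ancestor tuple down each call costs nothing extra and gives the error message the full chain to display, much like the message described above.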

The example file stands on the shoulders of giants: it uses Bruce Robertson’s Virtual List technique to display the multi-level BoM and there are calculations derived from the shared work and ideas of others. (Work Days subtracts weekend days within the date range thanks to a calc by Ray Cologon.)
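For readers curious about the work-days piece, here is a minimal Python sketch of the same idea. It simply counts weekdays directly rather than subtracting weekend days from the range as the demo’s calc does; the dates are arbitrary examples:

```python
from datetime import date, timedelta

def work_days(start, end):
    """Count Mon-Fri days from start through end, inclusive."""
    return sum(
        1
        for n in range((end - start).days + 1)
        if (start + timedelta(days=n)).weekday() < 5   # 0-4 = Mon-Fri
    )

# Mon 25 Mar 2019 through Sun 31 Mar 2019 contains 5 work days
work_days(date(2019, 3, 25), date(2019, 3, 31))
```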

For an alternate take on the mechanics of building a multi-level BoM, Daniel Shanahan has published a much more comprehensive example file addressing the attributes of inventory management when confronted by a multi-level BoM in a production environment.

Addendum: Name/Value pairs + ExecuteSQL

I’ve only sparingly used ExecuteSQL in projects until recently. A new project was well suited to its use, and I “discovered” what I think is an interesting use that I haven’t seen or heard much about: the use of SQL queries to produce ready-to-evaluate name/value pairs using concatenation.

I’ve seen several examples of developers inserting a unique string of characters as a field separator, and using native functions or custom functions (CFs) to tease out the individual fields’ values from a row.

As a longtime believer in passing name/value pairs via script parameter and script result, I put SQL concatenation to work. I have a CF (named “#” for easy typing) that takes two parameters, # ( “Name” ; “Value” ), and prepares them for instantiation as $Name ( = Value ) when passed as the parameter to another CF (named setLocalVars). Multiple prepared name/value pairs can be concatenated together as the single parameter to setLocalVars, which instantiates them all at once when it evaluates, and creates a $localVarsList variable to show what was instantiated.

This makes it easy, when debugging, to see each of the returned fields’ values without having to scan for and count the field separator parameters. It also means there’s no need for a “dictionary” type CF to make use of named values, and there’s no dependence on the order (sequential positioning) of the passed name/value pairs.
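A loose Python analog of the # / setLocalVars pattern may help readers outside FileMaker. The function names, encoding, and parsing rules below are invented for illustration, and Python’s eval stands in for FileMaker’s Evaluate():

```python
def pair(name, value):
    """Rough stand-in for # ( "Name" ; "Value" ): one encoded pair."""
    return f"{name}={value!r};"

def set_local_vars(encoded):
    """Rough stand-in for setLocalVars: instantiate every pair at once.
    Naive parsing: assumes names and values contain no ';' or '='."""
    ns = {}
    for chunk in encoded.split(";"):
        if chunk:
            name, _, value = chunk.partition("=")
            ns[name] = eval(value)   # plays the role of FileMaker's Evaluate()
    return ns

params = pair("myID", "10010") + pair("myQty", 5)   # "myID='10010';myQty=5;"
local_vars = set_local_vars(params)                 # {'myID': '10010', 'myQty': 5}
```

As in the article, the payoff is that every value arrives by name: nothing depends on the order of the pairs, and a debugger can show each name beside its value.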

In my recent project, I needed to process related child records that might have children of their own, and maybe grandchildren, great-grandchildren, etc. Creating rows of name/value pairs to pass in a script parameter seemed ideal, since the script could then instantiate a row in each iteration via a loop. To create the ready-to-evaluate strings, I needed to concatenate each value returned by the SQL query with text to create a text version of my name/value pair function.

I also needed to insert FMP operators between the function strings so that each row was a text string that would concatenate all the function results it contained. The Evaluate() function, with one row as its parameter, produces a concatenated string of formatted name/value pairs. That result is used as the parameter in the setLocalVars CF to instantiate all the row’s name/value pairs as $Name variables.

For those who haven’t tried concatenation, it happens in the SELECT portion of the sqlQuery parameter of the ExecuteSQL function. FileMaker uses “&” to concatenate, while SQL can use “+” or “||”; I used the latter. SQL uses single quotes to delimit the text strings that will be concatenated with the values returned by the SELECT. The ready-to-evaluate Name and Value parameters need to be inside quotes to work within the Evaluate function, like this…

# ( "fieldIdentifyingName" ; "responseValueFromSQL" )

…so I needed to escape them within the concatenation text strings. A relatively simple version of SELECT that includes concatenation strings for the fields childID, qty, and leadDays looks like this…

ExecuteSQL ( "
SELECT '# ( \"myID\" ; \"' || childID ||
'\" ) & # ( \"myQty\" ; \"' || qty ||
'\" ) & # ( \"leadTime\" ; \"' || leadDays || '\" )'
FROM BoM
WHERE parentID = ? "
; "" ; "" ; Item::pk )

…and produces this result…

# ( "myID" ; "10010" ) & # ( "myQty" ; "5" ) & # ( "leadTime" ; "0" )
# ( "myID" ; "10015" ) & # ( "myQty" ; "3" ) & # ( "leadTime" ; "0" )
# ( "myID" ; "10013" ) & # ( "myQty" ; "2" ) & # ( "leadTime" ; "0" )
# ( "myID" ; "10024" ) & # ( "myQty" ; ".001" ) & # ( "leadTime" ; "0" )
# ( "myID" ; "10025" ) & # ( "myQty" ; ".001" ) & # ( "leadTime" ; "0" )
# ( "myID" ; "10007" ) & # ( "myQty" ; ".1" ) & # ( "leadTime" ; "0" )
# ( "myID" ; "10023" ) & # ( "myQty" ; "3" ) & # ( "leadTime" ; "0" )

…which is paired as the Value (with the Name “children”) and passed as one of several formatted name/value pairs to a subscript. That subscript instantiates all name/value pairs, and the loop instantiates one row of $children’s name/value pairs with each pass. The example file uses a slightly more complex sqlQuery parameter, using JOIN to grab additional field values that only exist in the Item table.
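Since SQLite also uses || for concatenation and single-quoted string literals, the SELECT-side trick can be tried outside FileMaker. The table and data below are invented to mirror the article’s example:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE BoM (parentID, childID, qty, leadDays)")
con.execute("INSERT INTO BoM VALUES ('P1', '10010', 5, 0)")

# The SELECT list is one concatenation expression: quoted literal text
# interleaved (via ||) with field values, yielding one ready-to-evaluate
# string per matching row.
row = con.execute(
    "SELECT '# ( \"myID\" ; \"' || childID || '\" ) & "
    "# ( \"myQty\" ; \"' || qty || '\" ) & "
    "# ( \"leadTime\" ; \"' || leadDays || '\" )' "
    "FROM BoM WHERE parentID = ?", ("P1",)
).fetchone()[0]
# row is now: # ( "myID" ; "10010" ) & # ( "myQty" ; "5" ) & # ( "leadTime" ; "0" )
```

Note that the numeric qty and leadDays values are coerced to text by ||, just as ExecuteSQL returns text, so the output matches the result rows shown above.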

Geoff Gerhard has used FileMaker Pro since 1992 to build systems that handle complex inventory, manufacturing, and accounting issues. He’s been integrating FileMaker solutions with QuickBooks for his (and other developers’) clients since 2003, and frequently answers questions on the subject in various online developer forums and email lists.

4 thoughts on “Bill of Materials”

  1. Alex Zueiv

    Great work! The only problem is that your method of building name/value pairs with ExecuteSQL() fails if any field contains quotes. I used this technique for a while and have a solution.

    To escape quotes in a field I double-escape them in the query:

    SELECT 'myID\\\";\\\"' || BoM.ChildID, 'myQty\\\";\\\"' || BoM.Qty, …

    Then I quote the result and “evaluate” the double-escaped quotes only:

    Substitute ( Quote ( RSLT ) ; [ "\\\\\\\"" ; "\"" ] ; …

    The whole solution is a bit longer, so it’s best to check it in the fixed file.

    With this fix your client will be able to make sandwiches with Salami “Del Pueblo”… =)

  2. Geoff Gerhard

    That’s embarrassing, as I’m well aware of the consequences of “unescaped”
    characters inside an Evaluate function.

    I’ve addressed the “escaping problem” in the updated demo file by applying
    a different method, having struggled with the “stacked escaping” ( \\\" )
    method years ago. The new version handles the unlikely-but-still-possible
    case that a returned field value includes a backslash and/or a
    backslash-quote ( \" ) sequence.

    I liked the readability of your indentation, white space, and “one target
    field per line” in the SELECT argument, so I followed that model in the
    revised Demo file’s calc. (More shoulders to stand on!)

    Your amendment also highlighted the possibility of abstracting some of the
    concatenation by using the fieldSeparator and rowSeparator parameters to
    insert some of the text strings with less typing or cut/paste. It works
    well when all field values will be wrapped in identical strings.

    That said, I retained the SQL concatenation of all fields instead of
    applying a custom fieldSeparator, imagining there may be cases where
    additional functions and/or concatenations might be applied to some values
    in a way that makes the extra verbosity useful.

    Using a teletype character as the rowSeparator parameter let me retain the
    requisite “one row per matching record” result without vulnerability to
    other unlikely-but-still-possible strings in the field values. It’s
    replaced in the final searchString/replaceString pair in the Substitute
    function.
    Thanks, Alex, for sharing your insights.

    1. Alex Zueiv

      Glad to be helpful.

      Your solution works even better, as mine does not support escaped quotes ( \" ). I’ll probably use something similar in the future.

