9 thoughts on “JSON Array + Virtual List”

  1. Thank you so much for your explorations of virtual lists and JSON. The use of just three repeating fields to cover different data types is brilliant and reaffirms that repeating fields still have a useful role to play. Being able to sort by the actual virtual list fields is also a great feature.

    I have been experimenting myself and, as mentioned in other comments on the subject, some JSON functions get really slow with large found counts. I imported the data from one of the previous demos to perform some benchmarks, and found that the time to create the report drops considerably if I collect the rows in a text list and convert the list to JSON outside the loop with a single Substitute.

    1300 records reduced from 11.465s to 1.744s
    2500 records reduced from 43.138s to 4.259s
    5000 records reduced from 155.086s to 11.096s

    Hopefully, the performance of the FileMaker JSON functions will improve in future versions, but for now it seems much faster to use text functions to create JSON, especially when dealing with high record counts.

    Below are the changes I made to gain the speed improvements…

    Set Variable [ $arrayRow ; Value: Quote ( $index ) & ":" & "[" & $date & "," & $type & "," & $customer & "," & $cost & "," & $price & "]" ]
    Set Variable [ $$arrayList ; Value: List ( $$arrayList ; $arrayRow ) ]
    Go to Record/Request/Page [ Next ; Exit after last: On ]
    Set Variable [ $index ; Value: $index + 1 ]
    End Loop
    Set Variable [ $$arrayJSON ; Value: "{" & Substitute ( $$arrayList ; [ "]¶" ; "]," ] ) & "}" ]
    Set Variable [ $$arrayJSON ; Value: JSONFormatElements ( $$arrayJSON ) ]
    Go to Record/Request/Page [ First ]
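
    With two of the sample rows from the demo data, the resulting string (before JSONFormatElements, and assuming the row variables already contain JSON-encoded values) looks something like:

    { "0" : [ "2017+07+14", "Walk-In", "Coughlin & Gehart", 136.79, 224.24 ], "1" : [ "2017+07+16", "Internet", "Modesto Philharmonic", 319.3, 939.12 ] }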

    Thanks again for a great series of articles

      1. Kevin,
        My guess is that, because the closing } carries nothing to identify which JSON entity it closes, any JSON validation and processing basically has to count the { and } characters to work things out.
        There are, I think, other FileMaker text processing functions whose performance degrades in a similar way (I experienced this big time when working with sync payloads using FMEasySync).
        As always with FileMaker, we just have to decide which is the most efficient tool in the box to use in any given situation!
        It might be worth experimenting with a field calc that extracts the individual row from the Virtual List JSON, and then basing the display field calcs on that. It might be a red herring, as all the calcs would be unstored, but it is probably worth a look.
        I was also wondering, when comparing your different approaches to the JSON virtual lists, if the array approach is faster than named JSON entities when decoding the data.
        Some more testing required methinks…
        Regards, PJ

  2. Kevin,

    I’m not sure if there was a reason for creating an object with numbered keys vs. a nested array, which seems a lot cleaner.

    This:
    [
    [ "2017+07+14", "Walk-In", "Coughlin & Gehart", 136.79, 224.24 ],
    [ "2017+07+16", "Internet", "Modesto Philharmonic", 319.3, 939.12 ],

    ]

    Instead of this:
    {
    "0" : [ "2017+07+14", "Walk-In", "Coughlin & Gehart", 136.79, 224.24 ],
    "1" : [ "2017+07+16", "Internet", "Modesto Philharmonic", 319.3, 939.12 ],

    }

    Also, I always use the JSONSetElement function to create the JSON object/array to avoid the types of issues you are experiencing with numbers.
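
    For example, each row can be built and appended natively (a minimal sketch; the loop variables are borrowed from the earlier comment, and JSONNumber keeps cost and price un-quoted):

    Set Variable [ $arrayRow ; Value: JSONSetElement ( "[]" ;
      [ "[0]" ; $date ; JSONString ] ;
      [ "[1]" ; $type ; JSONString ] ;
      [ "[2]" ; $customer ; JSONString ] ;
      [ "[3]" ; $cost ; JSONNumber ] ;
      [ "[4]" ; $price ; JSONNumber ]
    ) ]
    # insert the row into the top-level array as an array, not as quoted text:
    Set Variable [ $$arrayJSON ; Value: JSONSetElement ( $$arrayJSON ; "[" & $index & "]" ; $arrayRow ; JSONArray ) ]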

    I uploaded a modified version of your demo with the JSON tweaks. You’ll see that you no longer need the protected version.

    I also added what I think could be an optimization, but who knows until it’s tested. I created an unstored calc field called “row” in the virtual_list table, which pulls this record’s row out of the array. The other fields (e.g., text, number, date) reference this field, on the assumption that it will be faster to reference it once than to extract a value from the array three times.

    https://www.dropbox.com/s/sn7s2gmijs9t8r1/JSON%20Array%20%2B%20Virtual%20List-NestedArray.fmp12?dl=0
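
    For reference, the “row” calc and a dependent display field might look something like this (a sketch; the field definitions are assumptions rather than the exact calcs in the file):

    // virtual_list::row — unstored calc; pulls this record’s sub-array once
    JSONGetElement ( $$arrayJSON ; "[" & ( ID - 1 ) & "]" )

    // virtual_list::text — unstored calc based on the cached row
    JSONGetElement ( row ; "[1]" )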

    1. Hi Dave,

      Thanks for taking the time to explain that. Your improvements make perfect sense.

      Many of us are taking baby steps with JSON; clearly you’re further along the learning curve.

      Regards,
      Kevin

    2. Dave,
      Thank you for the modified version. It is great to share our knowledge and experiences, and this article and its comments have certainly helped me.

      It does make sense to use the native functions to create the individual ‘rows’ and to use the native JSON index in the field calcs. The ‘row’ field is also more or less what I was thinking of in my previous comment. I added my modification that uses the List function to combine the individual row arrays (sketched below) and ran the tests again on higher record counts.

      1300 records: array JSON 11.518s – List JSON 4.872s
      2500 records: array JSON 35.789s – List JSON 5.668s
      5000 records: array JSON 133.450s – List JSON 8.080s

      Using the List method appears to be considerably faster. This is a shame, as it would be much tidier to use the native JSON functions throughout.
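
      In other words, each $arrayRow is still built with JSONSetElement, but the assembly step becomes (a sketch, assuming the nested-array structure from the modified demo):

      # inside the record loop:
      Set Variable [ $$arrayList ; Value: List ( $$arrayList ; $arrayRow ) ]
      # after the loop:
      Set Variable [ $$arrayJSON ; Value: "[" & Substitute ( $$arrayList ; [ "]¶" ; "]," ] ) & "]" ]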

      Regards, PJ

    3. Dave,

      Your “array_row” optimization is good, but it can be even better. Playing with a small JSON array, I’ve found that transforming it into a value list increases processing speed fourfold (!). To try it out:

      Change the script step 21 to:
      Set Variable [ $$arrayJSON ; Value: JSONListValues ( $$arrayJSON ; "" ) ]

      Change the field “array_row” to:
      GetValue ( $$arrayJSON ; ID + 1 )

      1. Alex,
        Thanks for pointing this out. This is consistent with the fact that it was much faster to build the JSON for larger found sets using the List() function. In my tests with 5,000 records the rendering time of the final report is much faster. It is hard to put an exact figure on it, as the rendering of the layout seems to happen after the script has completed and so is not included in the recorded processing time.

        1. Paul,

          I used the “list” optimization for importing a huge number of small JSON arrays, so the “rendering” time is more important to me, and I thought my experience would be useful here, too.

          In the test I’m getting the same element’s value in a loop. Before the second loop I set $LIST to JSONListValues ( $JSON ; "data" ):

          JSONGetElement ( $JSON ; "data[4].id" )
          12,995 turns / 5,000 ms
          2.6 turns/ms
          .38 ms/turn

          JSONGetElement ( GetValue ( $LIST ; 5 ) ; "id" )
          68,501 turns / 5,000 ms
          13.69 turns/ms
          .07 ms/turn
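
          The measuring loop, roughly (a sketch; $start and $turns are illustrative names, and the budget is the 5,000 ms shown above):

          Set Variable [ $start ; Value: Get ( CurrentTimeUTCMilliseconds ) ]
          Loop
            Set Variable [ $result ; Value: JSONGetElement ( GetValue ( $LIST ; 5 ) ; "id" ) ]
            Set Variable [ $turns ; Value: $turns + 1 ]
            Exit Loop If [ Get ( CurrentTimeUTCMilliseconds ) - $start ≥ 5000 ]
          End Loop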
