It is becoming increasingly necessary to move data easily and reliably from one file to another, and I was recently faced with just such a problem.
I developed a solution to send electronic invoices to the Italian State Administration (for those curious enough, and fluent in Italian, details here: http://www.filemakerguru.it/emergency-room/fatturazione-elettronica-verso-pubblica-amministrazione/).
Briefly: you need to create an XML file containing the invoice data, and many fields use codes provided by the administration. You need to check that all mandatory fields have been filled, that field format and length fall within given limits, and that certain fields have a value when certain conditions are met.
So far things look easy to accomplish using value lists and field validations. The tricky part, however, is right around the corner: codes and conditions will change in future releases.
Ouch. I mean Ouch!!
What will happen to my customers when new releases come out? Will I send them a new version along with a large old-data-import routine? Doable, and I have been there, but I didn't like it at all. And what about the customers into whose solutions I integrated the file?
Hmmm, there must be a better way to have an updateable solution.
Demo files: wrap-and-unwrap-v1.zip
I ended up moving all the logic into a Controls table, and a script using plenty of GetField() and Evaluate() performs the checks. By doing this I'm able to update codes and logic simply by updating the relevant tables. The interesting part, however, is how to update the Codes and Controls tables.
Remember, I needed something rather simple to run, with the possibility of choosing single tables, and “atomic” enough to avoid any problem about data integrity.
The Wrap & Unwrap technique
I therefore developed a technique that I dubbed wrap & unwrap: it basically creates a single block of text containing the update data.
The data are wrapped into an XML shell, encapsulating each data element (table, record, field) into appropriate XML tags. The block of text can be sent to the remote location, where it will be unwrapped, thus recreating the original records.
Warning for those worried about XML: don’t worry, it’s elementary XML.
Let’s see the technique in detail, using the tip files accompanying the article. In the tip files I simulated a situation in which a central location (wrap.fmp12) sends updates to a remote location (unwrap.fmp12); two tables will be updated, Products…
The script starts from layout number 1 and proceeds layout by layout. When the layout name is among those selected in a global field, the table’s data gets processed.
Note: in the tip file I’ve omitted the layout choice for the sake of simplicity. Should you want to implement it, you can point the value list to a calc field LayoutNames ( “” ) that returns the names of all the file’s layouts. To keep the layout names up to date, you have to set the calc to unstored first, and then back to stored. Alternatively, use a hard-coded list of suitable layouts.
Create Update record, write update text in a field and also export it into an .xml file
The table processing performed by the script “gather data and wrap them in an XML shell” does the actual data collection: an outer loop traverses all records, and an inner loop goes through the fields found on the layout; each field’s content gets wrapped in an XML tag built from the field name itself, using a custom function. See other comments inline.
As you can see the actual wrapping is made using a custom function named PassXML that was originally created by Andy Knasinski:
The final result is a block like…
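As an aside, the wrapping logic is easy to sketch outside FileMaker too. The following Python sketch is my own approximation, not Knasinski’s original PassXML code; the tag layout (a leading `<table>` tag, one `<record>` wrapper per record) and the sample field names are assumptions for illustration only.

```python
# Hypothetical Python sketch of the wrap step; this is NOT the original
# PassXML custom function, just an approximation of the same idea.

def pass_xml(tag, value):
    """Wrap a value in an XML element named after the field, escaping markup."""
    escaped = (str(value).replace("&", "&amp;")
                         .replace("<", "&lt;")
                         .replace(">", "&gt;"))
    return "<{0}>{1}</{0}>".format(tag, escaped)

def wrap_table(table_name, records):
    """Outer loop over records, inner loop over fields, as in the script.
    The exact tag layout is an assumption."""
    parts = [pass_xml("table", table_name)]
    for record in records:
        fields = "".join(pass_xml(name, value) for name, value in record.items())
        parts.append("<record>" + fields + "</record>")
    return "\n".join(parts)
```

For example, `wrap_table("Products", [{"Name": "Widget", "Code": "A&B"}])` yields a `<table>Products</table>` tag followed by one `<record>` block per record.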
The text is thus ready to be sent to the remote site, where it will be used to update the file’s tables.
NB: for the sake of simplicity in these tip files the data transmission has to be made by copying/pasting the XML text block.
Once the data has been received at the remote site we unwrap it, in order to update the target tables.
The script we run basically reverses the wrapping procedure.
As soon as a <table> tag is found the script processes the records/fields of the corresponding layout.
The “Process update data” script loops through the text pertaining to the table: an outer loop processes each record, and an inner loop processes each field, using the XML tag to identify the field and a Set Field By Name step to write the data.
The previous steps put the table data into a $data variable.
The custom function doing the unwrapping is named ExtractXML, and it too was originally created by Andy Knasinski:
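Again, the original function isn’t reproduced here; the sketch below is my own Python approximation of the idea behind ExtractXML, plus the two unwrap loops, under the same assumed tag layout (one `<record>` wrapper per record, one tag per field).

```python
import re

# Hypothetical Python sketch of the unwrap step; this is NOT the original
# ExtractXML custom function, just an approximation of the same idea.

def extract_xml(text, tag):
    """Return the text between the first <tag> and </tag>, or "" if absent."""
    open_tag, close_tag = "<{0}>".format(tag), "</{0}>".format(tag)
    start = text.find(open_tag)
    if start == -1:
        return ""
    start += len(open_tag)
    end = text.find(close_tag, start)
    return text[start:end] if end != -1 else ""

def unwrap_records(data):
    """Outer loop over <record> blocks, inner loop over field tags,
    mirroring the 'Process update data' script."""
    records = []
    for block in re.findall(r"<record>(.*?)</record>", data, re.S):
        record = {}
        for tag, value in re.findall(r"<(\w+)>(.*?)</\1>", block, re.S):
            # In FileMaker this is where Set Field By Name writes the value
            record[tag] = (value.replace("&lt;", "<")
                                .replace("&gt;", ">")
                                .replace("&amp;", "&"))
        records.append(record)
    return records
```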
Advanced methods of updating
Copy/paste of the text block works mainly for the purposes of the tip file.
In real-world situations it would be better to implement more reliable methods, less dependent on user interaction.
In this case we might…
- send the XML file to the remote location, and run a script that reads its content (it requires a file plugin like the free Base Elements plugin)
- put the XML file on a web server, pass to the user the link to the file and use an Insert from URL step to retrieve it
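For the second option, the retrieval side is simple to sketch outside FileMaker as well; in Python it might look like the following (the URL is a placeholder, and in FileMaker itself a single Insert from URL step does the same job).

```python
# Hypothetical sketch of fetching the update block from a web server,
# the equivalent of FileMaker's Insert from URL step. The URL below is a
# placeholder, not a real endpoint.
from urllib.request import urlopen

def fetch_update(url):
    """Download the XML text block published at the given URL."""
    with urlopen(url) as response:
        return response.read().decode("utf-8")

# xml_block = fetch_update("https://example.com/updates/update.xml")
```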
Benefits of the Technique
Dynamism and scalability
The wrapping procedure is modular and abstracted, so it can be applied to any table, since it gathers table and field names from the context itself.
It is therefore sufficient to include a layout among those to be processed to have its data recorded for later transfer, with no extra line of code or parameter needed; similarly, it is sufficient to add new fields to a layout to have them processed.
The unwrapping part of the technique is dynamic too: no table or field names are hard coded; all script steps use values derived from the XML text.
It’s very simple to apply this technique to your own file. In fact you just need to copy/import 2 custom functions and 4 scripts.
Using the wrap/unwrap technique, all data ends up in a single text block that can be transferred atomically, thus allowing transactional integrity, something not possible with import scripts.
This complies with the FileMaker Sync Guide – v12.0 when it states “Because of the possibility of losing network connection, it is important that the sync scripts complete all changes to a given record or group of records before commit. […] This will ensure that the complete set of changes is transmitted in a single action, providing transactional integrity.”