Level: Intermediate, Version: FM 13 or later

PSOS – Run Script in File not open Locally

Update: See Jason Wood’s suggestion in the comments section for making this technique more secure.


Today I want to take a look at a certain Perform Script On Server (PSOS) behavior and, for those who find this behavior inconvenient, propose a workaround.

Here’s the behavior, or misconception, actually: a server-side script initiated via PSOS (apparently) cannot access files on the server unless the user already has those files open locally. But of course there may be circumstances where you’d like to access files server side that the user does not have open client side, either because the user’s credentials do not allow access to those files, or because you’d rather not open the files client side merely to facilitate a PSOS call.

If we consult the online help entry for running scripts on server, the official word is that server-side scripts can access other FileMaker files only when…

…in other words, if you want PSOS to be able to access files, you need to either a) already have the files open client side, or b) throw caution to the wind, and configure the hosted files to auto-open with pre-entered credentials.

Hmmm… I think we can agree that from a security standpoint “b” is a non-starter, and the whole point of this article is to avoid “a”. Fortunately there is a “c” option not mentioned above which I’ll get to shortly, but first let’s demonstrate the issue.


With only File #1 open client side, we want to run this script on the server in File #2. The script consists of a single step, Exit Script, which, ideally, will pass a list of open databases back to the calling script in File #1.
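Assuming the built-in DatabaseNames function is what supplies that list, the File #2 script might look like this (the script name is my invention, not necessarily the one in the demo file):

```
# File #2: "Report Open Databases" (hypothetical script name)
Exit Script [ Text Result: DatabaseNames ]
```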

This script will be invoked from File #1 via PSOS like so…
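A rough sketch of the calling step (script and file names here are placeholders standing in for the screenshot):

```
# File #1: calling script (hypothetical step listing)
Perform Script on Server [ Wait for completion: On ;
    "Report Open Databases" from file: "File 2" ]
Show Custom Dialog [ "Open on server" ; Get ( ScriptResult ) ]
```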

…but, in this example, if File #2 isn’t open client side, then it isn’t available server side either.


This time, rather than targeting File #2 directly, the script in File #1 is going to invoke a second, server-side script in File #1 via PSOS. Once again, File #2 is not open on the client computer.

The server side script…

  1. checks to make sure it’s running on FMS
  2. does a re-login with credentials that are valid in both File #1 and File #2 (the privileges associated with these credentials should be limited in scope for common-sense security reasons)
  3. calls the script in File #2 (via Perform Script, not PSOS)
  4. passes the result back to the original script in File #1

And this time, here’s what we see:


I recommend granting full access privileges to both these scripts as a security precaution. Here’s why: scripts thus configured cannot be viewed (or edited or deleted) by a user with less-than-full-access privileges, regardless of that user’s script editing privileges.

(Thank you Charlie Bailey at Codence for pointing this out.)


9 thoughts on “PSOS – Run Script in File not open Locally”

  1. Caution! These credentials will be stored in plain text and visible in the new XML and the DDR output. You could slightly improve this by storing the credentials in fields, where you can control access with privilege sets, and which will never appear in XML/DDR output.
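Jason’s field-based variant might be sketched like this (the table and field names are hypothetical; field references, unlike literal values, do not expose their contents in DDR/XML output):

```
# Credentials pulled from fields rather than hard-coded
Re-Login [ Account Name: Settings::PSOSAccount ; Password: Settings::PSOSPassword ]
```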

      1. I know you know this, Jason, but for the benefit of those who may not… to put this into perspective, XML and DDR output can only be generated by users with full access privileges. Nonetheless, the vulnerability exists, and your recommendation makes sense.

        1. That is true. The main reason I think it’s important is because we generally do not handle DDR/XML files with the same level of protection as we do with the database files. For one thing, database files can be encrypted at rest. Moreover, database files tend to live on the server, whereas DDR/XML files usually end up on local hard drives or shared locations. In addition, if you use a tool to convert FileMaker clipboard XML, you might wind up copy/pasting code into insecure locations without realizing it contains sensitive information. This is one reason I find it so annoying that we have to hard-code credentials in the Import Records and Execute SQL script steps when connecting to an ODBC data source. https://community.claris.com/en/s/idea/0873w000001QAtJAAW/detail

  2. What if the file you want available server side is needed via a relationship, rather than a script call?

    So, Script:A in File#1 does PSoS[ Script:B in File#1 ],
    which in turn does PS [ Script:C in File #2 ],
    but Script:C requires data via relationships from File#3, or even File#4 via File#3.

    Will the Re-Login trick work to cascade the credentials all the way to File#4?

    1. Hi Eric, I don’t see why not… you do the re-login in File #1 via PSOS, and the rest of the scripts run server side via Perform Script. I think you’ll be okay as long as the re-login credentials are valid in all the files. Please report back with your results.

  3. Using a Re-Login may address the initial problem, but it introduces another. This particular issue may not matter to some, but it’s a fairly major issue for most of the systems I work with. The issue created by this particular workaround is that modifications made as a result of the PSOS will reflect the scripted credentials rather than those of the originating User.

    Here’s the workaround I’ve used to ensure that all required files are open when a multi-file PSOS is called:
    • Each file includes an “ExitScript” script, which contains only one step: Exit Script [ Text Result: Get ( ScriptParameter ) ]
    • PSOS scripts with external file dependencies are always run by a “launching” script.
    • The launching script first runs the ExitScript in each required file, then performs the PSOS.

    This opens any required files with the current User’s credentials, in the background, without triggering any “Start” scripts.
    The PSOS could be followed by corresponding Close File steps, but I haven’t found this to be necessary.
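Heather’s pattern, sketched as a launching script (file and script names are placeholders, not the commenter’s actual code):

```
# File #1: client-side launching script (hypothetical)
# Each Perform Script opens the target file in the background
# with the current user's credentials, without firing any "Start" scripts
Perform Script [ "ExitScript" from file: "File 2" ]
Perform Script [ "ExitScript" from file: "File 3" ]
Perform Script on Server [ Wait for completion: On ; "Multi-file script" ]
```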

    1. Hi Heather,

      Thank you for taking the time to write up such a detailed explanation, and you make a very valid point re: scripted rather than original credentials.

    It seems to me that however we approach this there are going to be tradeoffs, and the method you suggest would not be viable, for example, in a scenario where, for reporting purposes, certain users are allowed to retrieve selective information server side from files they aren’t allowed to open client side.

    Re: the example you mention, I think I would be inclined to stick with my approach, enhanced with Jason’s recommendation: accept a generic account name in modification fields, but mitigate this by passing the original account name and other identifying information as part of a script parameter, and by documenting PSOS calls, with the original account name, etc., in a dedicated log table. That way the actual modifier could be identified, if necessary.


  4. Thank you very much for your new approach to PSOS. Thanks to it, I have created a simple synchronization solution for FileMaker Go.

    File 1 (local file stored on FileMaker Go)

    File 2 (stored on FileMaker Server)

    I have created 2 simple scripts in File 2:

    *** script 1: adds a new record using the script parameter (a delimited parameter carrying values from the File 1 record)

    *** script 2: runs ExecuteSQL to retrieve the records (the parameters are PersistentID, AccountName, PrimaryKey, …) and passes the ExecuteSQL result back as the script result.

    (Actually in my real solution it’s more complex than I’m describing here)
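Script 2 as described might be sketched like this (the table name, field names, and parameter handling are my guesses, not the commenter’s actual code):

```
# File #2: "script 2" (hypothetical reconstruction)
Set Variable [ $persistentID ; Value: GetValue ( Get ( ScriptParameter ) ; 1 ) ]
Set Variable [ $account ; Value: GetValue ( Get ( ScriptParameter ) ; 2 ) ]
Set Variable [ $result ; Value:
    ExecuteSQL ( "SELECT id, payload FROM SyncTable
                  WHERE PersistentID = ? AND AccountName = ?" ;
                 "" ; "" ; $persistentID ; $account ) ]
Exit Script [ Text Result: $result ]
```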

    My FMS 19 could not even register the connection when File 1 called File 2’s script.

    Again, Kevin, thank you very much for your approach.
