Environment Variables in Storage Roots

Hi all,

I’ve searched a bit on this topic but haven’t found a great answer yet, though I did see that this was potentially added to the roadmap.

I am currently setting up a project with a storage root that uses an environment variable that all users will have access to ($PROJECT_PATH).

I can get this to work correctly in tank by modifying some method returns, which I’ll list below, but SG then seems to have an issue resolving those files once they are created. Before I go too far down a rabbit hole, is there a better way of doing this? Right now it’s just two files, but I’m now looking at potential changes to the FolderIOReceiver class and then some.

Here are the method changes I’ve made (a rough illustration of the pattern follows the list):

  • config\install\core\python\tank\pipelineconfig.py

    • Change line 794 to “project_roots_lookup[root_name] = os.path.expandvars(project_root.current_os)”
  • config\install\core\python\tank\folder\folder_types\project.py

    • Change line 84 to “return os.path.expandvars(self._storage_root_path)”
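
Both changes follow the same idea, so here is a quick standalone illustration (not tk-core code) of what os.path.expandvars does to a root that embeds $PROJECT_PATH; the server path and show name below are made up:

import os

# Hypothetical values, purely for illustration.
os.environ["PROJECT_PATH"] = r"\\server\projects"

raw_root = r"$PROJECT_PATH\my_show"            # path as tk-core sees it before the change
resolved_root = os.path.expandvars(raw_root)   # -> \\server\projects\my_show

print(resolved_root)

On Windows, os.path.expandvars also understands the %PROJECT_PATH% form, so either style of reference should resolve.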

The error I then receive when attempting to create task folders from tank:

tank Task #### folders

ERROR: Critical! Could not update SG with folder data. Please contact support.
Error details: API batch() request with index 0 failed. All requests rolled
back. API create() CRUD ERROR #6: Create failed for [Attachment]: Path
ABSOLUTE_PROJECT_PATH doesn’t match any defined Local
Storage.

The folders are created on disk, but it looks like the io receiver can’t resolve the absolute paths back to a storage root in SG.
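
My read of that CRUD error (just my interpretation of the message) is that SG tries to match the registered file path against the paths defined on its Local Storage entities, and a fully expanded absolute path no longer falls under a storage whose path is still stored as $PROJECT_PATH. Something like this toy comparison, with made-up names and paths:

# Rough sketch of the mismatch as I understand it; names and paths are invented.
local_storage_windows_path = r"$PROJECT_PATH"             # as defined on the Local Storage in SG
created_path = r"\\server\projects\my_show\seq\shot_010"  # absolute path registered during folder creation

# Any prefix-style match fails while one side still holds the unexpanded variable:
print(created_path.lower().startswith(local_storage_windows_path.lower()))  # False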


Kudos to you for trying to solve this unfortunate shortcoming of SG.
I hope you get somewhere.

I tried modifying <project>/install/core/python/tank/util/storage_roots.py
in the function _get_storage_roots_metadata()
and had some temporary success.

I inserted a bit of code to parse the value of each *_path key in the roots.yml file for an environment variable:

def _get_storage_roots_metadata(storage_roots_file):
    """
    Parse the supplied storage roots file

    :param storage_roots_file: Path to the roots file.
    :return: The parsed metadata as a dictionary.
    """

    log.debug("Reading storage roots file from disk: %s" % (storage_roots_file,))

    try:
        # keep a handle on the raw metadata read from the roots file
        roots_metadata = (
            yaml_cache.g_yaml_cache.get(storage_roots_file, deepcopy_data=False) or {}
        )  # if file is empty, initialize with empty dict

        # resolve any environment variables stored in '*_path' keys.
        # roots.yml nests the per-OS path keys under each root name, so walk
        # each root definition and swap in the value of a matching
        # environment variable when one is found.
        for root_name, root_info in roots_metadata.items():
            if not isinstance(root_info, dict):
                continue
            for key, value in root_info.items():
                if key.endswith("_path") and value:
                    resolved_path = os.environ.get(value)
                    if resolved_path:
                        root_info[key] = resolved_path

    except Exception as e:
        raise TankError(
            "Looks like the roots file is corrupt. "
            "Please contact support! "
            "File: '%s'. "
            "Error: %s" % (storage_roots_file, e)
        )

    log.debug("Read metadata: %s" % (roots_metadata,))

    return roots_metadata
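
For reference, a quick way to sanity-check that lookup outside of tk-core. It assumes the *_path values in roots.yml hold the bare environment variable name (e.g. windows_path: PROJECT_PATH); the dictionary below just stands in for the parsed file, and the path is made up:

import os

os.environ["PROJECT_PATH"] = "/mnt/projects"  # made-up value for the test

# Stand-in for the metadata parsed from roots.yml; each *_path value names
# an environment variable rather than holding a literal path.
roots_metadata = {
    "primary": {
        "linux_path": "PROJECT_PATH",
        "mac_path": "PROJECT_PATH",
        "windows_path": "PROJECT_PATH",
    }
}

for root_info in roots_metadata.values():
    for key, value in root_info.items():
        if key.endswith("_path") and value in os.environ:
            root_info[key] = os.environ[value]

print(roots_metadata["primary"]["linux_path"])  # -> /mnt/projects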

But since this file lives in the <project>/install folder rather than <project>/config, I’m unsure whether it’s one of the files in danger of being overwritten by normal SG behavior.
I’ll be following this post.

From a thread in March of 2020

Hi Logan,

Thanks for the additional information! I’ll check those storage roots out as well to see if it gets us any closer.

We’ve made a bit of progress and are able to get folder creation to work and upload correctly to SG. The one issue we’re hitting now is a database concurrency issue when launching DCC software from toolkit:

TankError: Could not create folders on disk. Error reported: Database concurrency problems: The path '$PROJECT_PATH\foo' is already associated with SG entity <Foo entity>. Please re-run folder creation to try again.

Getting here was done by changing the following lines to expand the path with os.path.expandvars (see the sketch after the list):

  • install\core\hooks\process_folder_creation.py line 118
  • install\core\python\tank\path_cache.py line 263
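
As a toy illustration (not tk-core code) of the mismatch we seem to be hitting: the path cache entry still carries the literal $PROJECT_PATH, while the freshly computed path is fully expanded, so the two never compare equal until both sides are run through expandvars. The server path below is made up:

import os

os.environ["PROJECT_PATH"] = r"\\server\projects"  # made-up value

cached_path = r"$PROJECT_PATH\foo"        # roughly what the path cache appears to hold
computed_path = r"\\server\projects\foo"  # roughly what folder creation computes

print(cached_path == computed_path)                                          # False
print(os.path.expandvars(cached_path) == os.path.expandvars(computed_path))  # True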

This error might be related to some of the information mentioned in the second link; I’ll keep poking around at it this week.