Bootstrap process reading out-of-date config caches

We use the Toolkit Manager bootstrapping process to get our sgtk instance when running farm jobs. Our Pipeline Configuration descriptor defines a git repo and a specific commit hash, and we update this hash regularly to release changes to our hooks, templates, etc.
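
For context, our farm-side bootstrap looks roughly like the sketch below. This is only a sketch: the script user name, API key, site URL, plugin id, engine name and project id are placeholders, not our real values.

```python
# Rough sketch of the farm-side bootstrap. All names/ids below are placeholders.
import sgtk

authenticator = sgtk.authentication.ShotgunAuthenticator()
user = authenticator.create_script_user(
    api_script="farm_bootstrap",                 # placeholder script user
    api_key="<api_key>",
    host="https://<studio>.shotgunstudio.com",   # placeholder site URL
)

mgr = sgtk.bootstrap.ToolkitManager(sg_user=user)
mgr.plugin_id = "basic.desktop"                  # matches the cached config path below
engine = mgr.bootstrap_engine("tk-shell", entity={"type": "Project", "id": 123})

tk = engine.sgtk  # templates etc. come from the cached config described below
```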

If we update the Pipeline Configuration descriptor on the ShotGrid website and refresh our ShotGrid Desktop project/configuration, the new changes get checked out into the bundle_cache folder as expected.

However, when we use the Toolkit Manager, sgtk just reads the descriptor already on disk. This file remains out of date:
C:\Users\<user>\AppData\Roaming\Shotgun\<sg_user>\p<pid>c<cid>.basic.desktop\cfg\config\core\pipeline_configuration.yml

When we query e.g. our templates, they match the commit hash pinned in that yml, which is out of date.

I can’t see an option to force the bootstrap to reload its caches, but the Toolkit Manager does appear to have knowledge of the server (via the script user), so it should be able to?

What am I missing?

Is this a problem we shouldn’t be having in the first place? It’s causing real trouble for our render farm: our only workaround right now is a pre-render script that deletes the cached config, forcing the bootstrap to fetch the config from ShotGrid again (roughly the sketch below).
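
Something like this (a sketch only; the placeholders mirror the cached path quoted above and will differ per user, project and OS):

```python
# Pre-render cleanup sketch: delete the cached config so the next bootstrap
# re-downloads it from ShotGrid. Placeholders mirror the path quoted above.
import os
import shutil

cached_config = os.path.join(
    os.environ["APPDATA"],          # C:\Users\<user>\AppData\Roaming
    "Shotgun",
    "<sg_user>",
    "p<pid>c<cid>.basic.desktop",
)

if os.path.isdir(cached_config):
    shutil.rmtree(cached_config)
```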

This doesn’t feel optimal and adds an unnecessary slowdown for most jobs. Can anyone advise?

I think tank synchronize_folders might solve your problem. We have had similar issues where the folder schema is out of date.

We’re running a distributed config, so we can’t run tank commands directly. Do you know if there is an equivalent?

@joejf please check synchronize_filesystem_structure; it is the API equivalent of that tank command.
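
For example, something like this after bootstrapping (a sketch, assuming engine is the object returned by ToolkitManager.bootstrap_engine()):

```python
# Sketch: the API counterpart of "tank synchronize_folders", assuming
# "engine" is the object returned by ToolkitManager.bootstrap_engine().
tk = engine.sgtk
tk.synchronize_filesystem_structure()  # refreshes the local folder/path cache
```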
