I am working on a SG pipeline app that will be available as a toolkit bundle through SG desktop.
Be aware that we are using a distributed config and most of our users are remote, so we do not have control over their workstations.
I am stuck at the point where I want to include third-party Python modules in my app. Let's say I would like to import the `yaml` module. In a usual Python script, you would import it like this: `import yaml`. As I am running this app through SG Desktop, it will run through the shipped Python interpreter. Since the `yaml` module is part of the shipped Python, I am importing it this way:
from tank_vendor import yaml
But what if I want to use a module that is not part of SG Desktop shipped Python?
- I have seen toolkit bundle apps that contain a local copy of all the modules they need. This seems like a lot of bad practices at once, for obvious reasons.
- I assume a workaround would be to run `pip install` within the app code to install all required modules, but once again, this seems like quite a bad idea.
- Is there a clean, built-in way to do this? I am especially thinking of a module that would be required by two separate apps. My guess is that there is a way to list the necessary modules somewhere in the pipeline configuration.
You can just create a folder inside the config root called `python` (or use whatever structure you want in there). Then, in the `pipeline_configuration_init` core hook, you can run some code to include that folder in the `PYTHONPATH`.
Make sure to check whether the folder is already in the Python path, because `pipeline_configuration_init` runs a bit more than once per SG Desktop session and we don't want duplicates.
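A minimal, stdlib-only sketch of that duplicate-safe path setup. In a real config this logic would live in the `execute()` method of the `pipeline_configuration_init` core hook; the `python` folder name is the convention suggested here, not something Toolkit mandates:

```python
import os
import sys


def add_config_modules_to_path(config_root):
    """Prepend <config_root>/python to sys.path, skipping duplicates."""
    modules_dir = os.path.join(config_root, "python")
    # pipeline_configuration_init can fire more than once per SG Desktop
    # session, so only insert the folder if it is not already on the path.
    if modules_dir not in sys.path:
        sys.path.insert(0, modules_dir)
    return modules_dir
```

After this runs, any module dropped into that folder resolves like a normal import, from every app sharing the config.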
Thank you for your answer, Ricardo. This would be an interesting solution for sharing modules between all our apps without shipping duplicates.
I am also interested in knowing whether there is a valid solution where we do not ship any module we did not write ourselves. My idea would be a clean way to `pip install` the modules we need. Are there any good reasons not to do that?
I know this would delay the first launch of each app that needs additional modules, and slightly delay every app launch if the requirements are checked each time.
I did some manual tests yesterday (as opposed to automated ones in the config/code) and this works as expected, but the thing I am not sure about is where to trigger those `pip install` calls from.
I assume we could also, theoretically, use the `pipeline_configuration_init` core hook to trigger the `pip install` of a module. Am I correct?
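For what it's worth, a common pattern is to shell out to pip through the current interpreter rather than importing pip internals. A hedged sketch, where the requirement list and config-local target folder are illustrative assumptions (`--target` keeps the shipped interpreter's `site-packages` untouched):

```python
import subprocess
import sys


def build_pip_command(requirements, target_dir):
    """Build a pip invocation that installs into a config-local folder."""
    return [
        sys.executable, "-m", "pip", "install",
        "--target", target_dir,
        *requirements,
    ]


def install_requirements(requirements, target_dir):
    # check_call raises CalledProcessError if pip fails, so the hook can
    # surface installation problems instead of continuing silently.
    subprocess.check_call(build_pip_command(requirements, target_dir))
```

The target folder would then need to be added to `sys.path`, e.g. from the `pipeline_configuration_init` core hook mentioned earlier.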
In some ways it makes sense to install the dependencies into ShotGrid's own Python, e.g.
`C:\Program Files\Shotgun\Python3\Lib\site-packages` on Windows. One issue is that, by default, you need Administrator rights to write to this directory.
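A small stdlib-only sketch of how a hook could detect that situation up front, instead of letting pip fail with a permissions error (`purelib` is the interpreter's `site-packages` directory):

```python
import os
import sysconfig


def site_packages_writable():
    """Return True if the interpreter's site-packages directory is writable."""
    purelib = sysconfig.get_paths()["purelib"]
    return os.access(purelib, os.W_OK)
```

If this returns `False` (the typical case under `C:\Program Files` without elevation), installing into a user-writable, config-local folder via pip's `--target` flag is the safer fallback.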
One random idea: have a look at `./install/core/python/tank/commands/cache_apps.py` (which downloads the ShotGrid bundles defined in the environment configs). You could potentially add your own command to install all dependencies. It would be nice if the bundle itself could declare which dependencies it needs installed.
Edit: I just saw this other thread, which has tons of related info. Tools like `rez` are one way to do this, but for just the Python modules you want to ship with the config, I would include them in a folder. It's the easiest way, and it allows you to set that folder as a source root in your IDE very easily (so you get code completion, etc.).
One idea that I have always wanted to try out is adding third-party libraries to a custom Toolkit framework. That framework could then be uploaded to SG and maintained like any other custom Toolkit app, and it could be used anywhere in SGTK-land.
This could prove tricky, since Toolkit likes to import Python modules in a certain way, but depending on how you set it up, there are ways around that…
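One such way, sketched here: the framework vendors the library, and apps pull it in through Toolkit's `import_framework` mechanism. The framework name `tk-framework-thirdparty` is a hypothetical example, and the framework would also need to be declared in the consuming app's `info.yml`:

```python
def get_vendored_yaml():
    """Return the yaml module vendored by a (hypothetical) custom framework."""
    # sgtk is only importable inside a running Toolkit environment,
    # so the import lives inside the function rather than at module level.
    import sgtk

    return sgtk.platform.import_framework("tk-framework-thirdparty", "yaml")
```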
I'd only recommend this approach if you really just want to maintain some third-party Python libraries. If you are going to deploy plug-ins, programs, or custom studio code, I would still recommend…
Hope that helps a little bit!