Importing third party modules in toolkit bundle

Hi

I am working on a SG pipeline app that will be available as a toolkit bundle through SG desktop.
Be aware that we are using a distributed config and most of our users are remote, so we do not have control over their workstations.

I am stuck at the point where I want to include third party Python modules in my app. Let’s say I would like to import the yaml module.
In a typical Python script, you would import it like this:

import yaml

As I am running this app through SG Desktop, it runs through the shipped Python. Since the yaml module is bundled with it, I import it this way:

from tank_vendor import yaml

But what if I want to use a module that is not part of SG Desktop shipped Python?

  • I have seen toolkit apps that contain a local copy of all the modules they need. This seems like a lot of bad practices at once, for obvious reasons.
  • I assume a workaround would be to use pip install within the app code to install all required modules, but once again, this seems like quite a bad idea.
  • Is there a clean and built-in way to do this? I am especially thinking of a module that would be required by two separate apps. My guess is that there is some way to list the necessary modules in the pipeline config.

You can just create a folder inside the config root called python (or whatever structure you want in there; I have config/resources/python).
Then, in the pipeline_configuration_init core hook, you can run some code to add that folder to the Python path.

Make sure to check whether the folder is already on the Python path, because pipeline_configuration_init runs more than once per SG Desktop session and we don't want duplicates.
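A minimal sketch of that duplicate-safe path setup, written as a helper you could call from the pipeline_configuration_init core hook. The config/resources/python layout and the function name are assumptions for illustration, not Toolkit API:

```python
import os
import sys


def add_config_python_path(config_root):
    """Insert <config_root>/resources/python into sys.path exactly once.

    Intended to be called from the pipeline_configuration_init core hook,
    passing in the pipeline configuration's root folder.
    """
    python_folder = os.path.join(config_root, "resources", "python")
    # The hook can run more than once per SG Desktop session, so guard
    # against inserting the same entry twice.
    if python_folder not in sys.path:
        sys.path.insert(0, python_folder)
    return python_folder
```

After this runs, a plain `import yaml` (or any other module dropped into that folder) resolves from the shared config location.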

Thank you for your answer, Ricardo. This would be an interesting solution to share modules between all our apps without shipping duplicates.

I am also interested in knowing whether there is a valid solution where we do not ship any modules we did not write ourselves. My idea would be a clean way to pip install the modules we need. Are there any good reasons not to do that?
I know this would delay the first launch of each app that needs additional modules, and slightly delay every app launch if the requirements are checked each time.

I did some manual tests yesterday (as opposed to automating it in the config/code), and this works as expected, but the thing I am not sure about is where to trigger those pip installs from.

I assume we could also theoretically use the pipeline_configuration_init core hook to trigger the pip install of a module. Am I correct?

In some ways it makes sense to install the dependencies in Shotgun’s Python, e.g. C:\Program Files\Shotgun\Python3\Lib\site-packages on Windows. One issue is that you need to be Administrator to write in this directory by default.
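One way around the Administrator issue is to pip install into a user-writable `--target` folder and add that folder to `sys.path`, rather than writing into Shotgun's own site-packages. A hedged sketch of that approach (the function name and folder argument are made up for illustration):

```python
import subprocess
import sys


def ensure_module(module_name, target_dir):
    """pip install module_name into target_dir if it is not already importable."""
    try:
        __import__(module_name)
        return  # already available, nothing to do
    except ImportError:
        pass
    # Use the same interpreter that is running Toolkit, and --target so we
    # never need write access to the admin-protected shipped site-packages.
    subprocess.check_call(
        [sys.executable, "-m", "pip", "install", "--target", target_dir, module_name]
    )
    if target_dir not in sys.path:
        sys.path.insert(0, target_dir)
```

Calling something like this from pipeline_configuration_init would only incur the install cost on first launch, since pip runs only when the import fails; whether running pip implicitly at startup is acceptable for remote users is a separate policy question.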

One random idea - have a look at ./install/core/python/tank/commands/cache_apps.py (which downloads ShotGrid bundles, defined in the env config). Potentially you can add your own command to install all dependencies. It would be nice if the bundle can itself declare what dependencies it needs installed.

Edit: just saw this other thread that has tons of related info. There are tk-cpenv, shotgun-rez, etc.


tk-cpenv and rez are ways to do this, but for Python modules that you just want to ship with the config, I would include them in a folder. It's the easiest way, and it allows you to set that folder as a source root in your IDE very easily (so you get code completion, etc.).