Running background process

Hello!

Is there a good way to run a periodic background process while ShotGrid Desktop is running?
I was thinking about starting an async process with time.sleep in an endless loop when ShotGrid Desktop launches, but I’m not sure where to put the corresponding code, or whether there’s a better way to do it.

How frequently does it need to run, and what sort of thing are you trying to achieve?


@Ricardo_Musch Let’s say I’d like to run a custom Python script every 5 minutes (like a cronjob, basically).

I’m not sure if that’s possible; I’m not an expert on threading.
If it is, you could launch that process from a core hook, depending on when it should launch and what details you need.

bootstrap.py is the earliest core hook that is run.

Alternatively, you could create a cronjob-style script that runs every 5 minutes as a separate application, but launched from bootstrap.py.
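A minimal sketch of that "separate application launched from bootstrap.py" idea might look like the following. The `launch_background_updater` helper and the updater script path are hypothetical; the point is detaching the child process so it can outlive the bootstrap step:

```python
import subprocess
import sys

def launch_background_updater(script_path):
    """Launch a detached Python process running `script_path`.

    `script_path` is a hypothetical updater script that does its own
    periodic loop. The flags below detach the child from the parent so
    it is not tied to the bootstrap step that spawned it.
    """
    kwargs = {}
    if sys.platform == "win32":
        # Detach from the parent console/process group on Windows.
        kwargs["creationflags"] = (
            subprocess.CREATE_NEW_PROCESS_GROUP | subprocess.DETACHED_PROCESS
        )
    else:
        # Start a new session on POSIX so the child survives the parent.
        kwargs["start_new_session"] = True
    return subprocess.Popen([sys.executable, script_path], **kwargs)
```

The returned `Popen` handle can simply be discarded; the child keeps running on its own.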

However, what exactly is it you want it to do?

Because if it’s SG related, it may already be covered by the SG Event Server or Webhooks.


@Ricardo_Musch Yes, currently I implemented this in the bootstrap core hook (however it runs twice every time for some reason).
It’s not SG related. I’m creating a function that runs in the background and updates our third-party packages, so when a user wants to start working they don’t have to wait for all the package updates to download, because we already downloaded them in the background.
This is mainly to improve productivity.

Bootstrap runs every time toolkit is loaded (as far as I know).

That’s once for starting SG Desktop, then once for loading the pipeline config.
And then once in the DCC software when it’s bootstrapping into sgtk.

Is there not a way you could push the packages to the machine from a server?
Or use a render manager that runs in the background to update the local packages?

You could also launch a looping external process that runs in the bg and does the checking.
Before launching check if the process already exists so there isn’t another one?
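That "check if the process already exists" step is often done with a pid file. This is a hedged sketch; the file name is hypothetical, and the liveness check shown is POSIX-only (on Windows, `os.kill` would terminate the target rather than probe it, so a different check is needed there):

```python
import os
import tempfile

# Hypothetical location for the pid file of our updater process.
PID_FILE = os.path.join(tempfile.gettempdir(), "package_updater.pid")

def _pid_running(pid):
    """Best-effort liveness check. POSIX-only: signal 0 probes existence
    without sending anything. On Windows, os.kill would TerminateProcess,
    so use a platform-specific check there instead."""
    try:
        os.kill(pid, 0)
    except OSError:
        return False
    return True

def already_running():
    """Return True if a previously recorded updater pid is still alive."""
    try:
        with open(PID_FILE) as f:
            pid = int(f.read().strip())
    except (OSError, ValueError):
        return False
    return _pid_running(pid)

def register_this_process():
    """Record our own pid so later launches can detect us."""
    with open(PID_FILE, "w") as f:
        f.write(str(os.getpid()))
```

Calling `already_running()` before spawning the updater avoids stacking up duplicates every time bootstrap fires.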

In general I would be careful putting these unrelated tasks into your toolkit setup, as it makes it slower.

Ideally these tasks should be handled by pushing from a server, pulling via a scheduled task (Windows), or handled by the farm software or the like.


@Ricardo_Musch
Of course there are ways to handle this, but bringing in new, separate solutions always complicates the system, especially for something as small as this function call.
Does it make toolkit slower, though? Since it runs as a separate thread, I’d think it doesn’t matter; it’s detached from the main flow of the toolkit. I might be wrong, I’m no expert in multithreading either.
I think this would be nice, because this upgrade makes sure the DCCs get their dependencies correctly, so I think it’s very much related to the whole SG infrastructure.
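For the separate-thread approach, one common pattern is a daemon thread driven by an `Event`. This is only a sketch (`start_periodic_task` and the 5-minute default are illustrative, not from any SG API); a daemon thread won’t keep ShotGrid Desktop alive on exit, and the event gives you a clean way to stop the loop early:

```python
import threading

def start_periodic_task(task, interval_seconds=300, stop_event=None):
    """Run `task()` every `interval_seconds` in a daemon thread.

    Returns (thread, stop_event); setting the event stops the loop.
    The daemon flag means the loop dies with the host process.
    """
    stop_event = stop_event or threading.Event()

    def _loop():
        # Event.wait returns True once the event is set, ending the loop;
        # until then it acts as an interruptible sleep.
        while not stop_event.wait(interval_seconds):
            try:
                task()
            except Exception:
                # Never let the background task take down the host app.
                pass

    thread = threading.Thread(target=_loop, daemon=True)
    thread.start()
    return thread, stop_event
```

Using `Event.wait` instead of `time.sleep` means shutdown doesn’t have to wait out a full 5-minute interval.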

What sort of package manager are you using?

I think I’ll keep using rez.

I use CPENV myself, and in that situation, before app launch, tk-cpenv will resolve the packages it needs and check which it needs to download.

Normally this takes just a second, but if there is a particularly large package to download (e.g. a Nuke gizmo repo) then it may take 20 seconds or so.
But that’s only a one-time thing.

What kind of packages are you distributing that make it very heavy?

Yes, only a few seconds, but that adds to the actual loading time of the DCC, and I would like that to be as fast as possible. This small background update could shave off seconds every time something is launched. And it’s not just the download time, it’s the package checking and evaluation time too. I know it’s not much, but it could make artists more relaxed and satisfied with their tools, in my opinion… or maybe I’m just creating problems for myself :smiley:


I’m not sure what OS you are running on or what the general IT structure of your facility is, but in a Windows domain environment I would do the following:

  1. Create a Group Policy Object with a Scheduled Task

  2. Assign that scheduled task the required triggers (e.g. every hour)

  3. Make it run a Python script that checks for packages and copies them

  4. Assign it to a bunch of machines

This keeps this function outside the Toolkit pipe, which in my opinion is better in this case.
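The "checks for packages and copies them" script from step 3 above might, assuming a simple file-share layout, look like this sketch (both root paths and `sync_packages` are hypothetical):

```python
import filecmp
import shutil
from pathlib import Path

# Hypothetical server share and local cache locations.
SERVER_ROOT = Path(r"\\server\packages")
LOCAL_ROOT = Path(r"C:\pipeline\packages")

def sync_packages(server_root=SERVER_ROOT, local_root=LOCAL_ROOT):
    """Copy any file that is missing locally or differs from the server copy.

    Returns the list of destination paths that were updated, so the
    scheduled task can log what it did.
    """
    copied = []
    for src in server_root.rglob("*"):
        if not src.is_file():
            continue
        dst = local_root / src.relative_to(server_root)
        # shallow=True compares size/mtime only, which is cheap and works
        # because copy2 below preserves the source mtime.
        if not dst.exists() or not filecmp.cmp(src, dst, shallow=True):
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            copied.append(dst)
    return copied
```

Run hourly by the scheduled task, the second and later runs copy nothing unless the server side changed.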

If you really want to trigger this from Toolkit, then maybe the context_change core hook is a place for it, since it gets triggered a lot.
But again, make sure it doesn’t hold that up, or Toolkit will start becoming slow (e.g. clicking a task in the Workfiles UI will fire off your sync).
