Easy way to copy all Toolkit configurations from one project to another

Hi all,
we modified some configurations in one project in the SG Toolkit:
Added Cinema 4D
Changed slate publishing in Nuke
Added camera publishing in Maya
Added OBJ publishing
and more…

What would be the easiest way to copy all of these, without modifying everything again on a fresh Toolkit install?

Would copying the folder, changing roots.yml, and unregistering and registering the project again work?

Many thanks


How do you have your configuration set up currently? Is it a distributed config or a centralized config?

If you’re using the setup wizard, it will allow you to pick a different project as the source. If you’re using distributed configs, it’s probably just as easy to copy the config, depending on how you have it set up.

Usually though you would store and track your changes to the master config in something like git, and that would be your source (separate from the project). This is good practice as you see the history of your changes and roll back if necessary.
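The git approach can be sketched roughly like this (all paths and file names below are made up for illustration; point it at your actual config folder instead):

```shell
# Hypothetical demo: turn a config folder into a git repository (one-time setup).
CONFIG=/tmp/master_config_demo          # stand-in for your real config path
mkdir -p "$CONFIG" && cd "$CONFIG"
git init
git config user.email you@example.com && git config user.name "You"
echo "nuke slate settings" > env.yml    # stand-in for a real config file

git add . && git commit -m "Initial snapshot of master Toolkit config"

# Later, after a change:
echo "tweaked" >> env.yml
git commit -am "Tweak Nuke slate publishing"

# Inspect the history, and roll back if a change breaks something:
git log --oneline
git revert --no-edit HEAD               # undo the last commit with a new commit
```

The key benefit is exactly what is described above: every change has a commit message and a diff, and any change can be reverted cleanly instead of restoring a backup copy.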


It's centralized, on a shared server folder.


OK, so then picking a project when running the advanced setup wizard would work for you.

However, as I mentioned, I think it would be better to have a config in git that you use as the source and update. Just make sure to ignore the following files when committing to git:

The same would apply if you were copying from one config to the other, just don’t copy those files, as they are specific to the project.


OK, so copy and paste to another folder, delete those files, and set up the path on the project in Shotgun, no?

Thanks Philip!


I would copy the config to another folder, delete those files, and then turn it into a git repo and push to something like GitHub.
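A rough sketch of that sequence, using a local bare repository as a stand-in for a GitHub remote (all the paths, and the project-specific file shown, are hypothetical examples — substitute the actual list of files for your setup):

```shell
SRC=/tmp/project_a_config        # hypothetical: existing centralized project config
MASTER=/tmp/master_config        # where the git-tracked copy will live
REMOTE=/tmp/config_remote.git    # local stand-in for a GitHub remote

# Demo scaffolding (replace with your real config folder):
mkdir -p "$SRC/core"
echo "install paths" > "$SRC/core/install_location.yml"  # example project-specific file
echo "apps and engines" > "$SRC/env.yml"

# 1. Copy the config to a new folder.
cp -r "$SRC" "$MASTER"

# 2. Delete the project-specific files (example only; use the actual
#    list of files that are specific to your project).
rm -f "$MASTER/core/install_location.yml"

# 3. Turn it into a git repo and push it to the remote.
git init --bare "$REMOTE"
cd "$MASTER"
git init
git config user.email you@example.com && git config user.name "You"
git add . && git commit -m "Project-agnostic master config"
git remote add origin "$REMOTE"
git push origin HEAD
```

With a real GitHub repository you would use its URL as `origin` instead of a local path.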

Then in the advanced setup wizard you would pick the git option, or you could pick the path option.


Hi Philip

This thread holds interest to me too.

I’m interested in switching over to Git-based management of our pipeline configuration, but in all honesty I’ve found the documentation on the topic somewhat confusing.

I’d really appreciate a few specific pointers from you, based on our current setup:

  • We have a ‘dummy’ project on our server; sgtk_dev
  • It is a centralized project.
  • We have pipeline configurations attached to this project (we do our dev under those configurations).
  • Once dev is ready to push, we run push_configuration back to the primary pipeline configuration on sgtk_dev.
  • At this point, the only thing updated is our dummy (sgtk_dev) development project.
  • The next step is bespoke:
  • We run a Windows batch file, per-project, that literally copies and pastes the pipeline config from our dummy project to the chosen production project.
  • The above process makes date-stamped backups of all files prior to the copy. It is robust.
  • All of our projects are created with the centralized setup method.
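For comparison, a bash equivalent of that per-project roll-out step might look something like this (the real version described above is a Windows batch file; the paths and files here are made up for illustration):

```shell
DEV=/tmp/sgtk_dev_config          # hypothetical: dummy project's primary config
PROD=/tmp/prod_project_config     # hypothetical: target production project config
STAMP=$(date +%Y%m%d_%H%M%S)

# Demo scaffolding (replace with real config folders):
mkdir -p "$DEV" "$PROD"
echo "old settings" > "$PROD/env.yml"
echo "new settings" > "$DEV/env.yml"

# 1. Date-stamped backup of the production config before touching it.
cp -r "$PROD" "${PROD}_backup_${STAMP}"

# 2. Copy the dev config over the production one.
cp -r "$DEV/." "$PROD/"
```

This preserves the "robust, date-stamped backup" property, but unlike git history it cannot show a line-by-line diff between any two roll-outs.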

While this approach has worked really well, I wanted to see if there’s merit in shifting over to something different. For the foreseeable future, only 2 people will touch the configuration, which is why I’ve settled on a bespoke, hand-crafted roll-out approach. I’m happy with the process generally speaking, as it is very quick to iterate code changes.

Also, by pushing per-project, it mitigates the risk of a system-wide bug being introduced. Still, version control in the traditional sense has benefits that I’ve not leveraged in this workflow. I see a primary benefit in granular detail on code changes (which I simply don’t have).

If you can spare some thoughts it’d be great!

Thanks, clinton.


Hey Clinton,

your setup sounds similar to something I do.

To use Git in this instance I would create a git repository for each of your dev pipeline configs and the primary.

That way you can track changes in your various dev branches and properly backup and push commits to your primary branch and version them accordingly.

It just helps you keep track of changes in a better way than doing it all by hand.

I also usually set up betas for each project (so I can test in a specific project env) and put those on GitHub too, just to make sure I’m tracking changes for myself and the team.

I think it’s a bit different for everyone in the way they manage Shotgun pipelines: some studios do many small projects and therefore need a “generalized” pipeline setup that they can quickly update and push out to all projects, versus studios that run only one or a few projects at a time.


Hi Ricardo

This is a great suggestion; a halfway house between no code management and everything being under source control.

I like it and I think it’ll work well for the setup I have going.

Thanks for your thoughts!


Agreed, this is the way to go.

Our central git repo of the project config is bare: it contains only the history, with no working directory. This guarantees it will not be touched directly; it can only be pushed to.

Each project has a checkout of that repo (which is how it works in the wizard when you select the git option).

Changes can be performed in any checkout and pushed to the central one.
Usually this will happen in my sandbox checkout (which is local on my disk), which is pointed to by the dev pipeline configuration, and pushed.
Then each project can pull the changes, which gives you a great deal of control of the distribution - you can freeze changes to a project which is in a critical phase, for instance.
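The layout described above could be sketched like this (local stand-in paths instead of a shared server; the branch name and files are illustrative):

```shell
CENTRAL=/tmp/central_config.git   # bare repo: history only, no working tree
SANDBOX=/tmp/my_sandbox           # local dev checkout (dev pipeline configuration)
PROJECT=/tmp/project_x_config     # a project's checkout of the config

# Central bare repo — never edited directly, only pushed to.
git init --bare "$CENTRAL"
git -C "$CENTRAL" symbolic-ref HEAD refs/heads/main   # pin the default branch name

# Sandbox checkout: where changes are actually made.
git clone "$CENTRAL" "$SANDBOX"
cd "$SANDBOX"
git config user.email you@example.com && git config user.name "You"
git checkout -B main
echo "publish settings v1" > env.yml
git add . && git commit -m "Initial config"
git push origin main

# Each project gets its own checkout of the central repo.
git clone "$CENTRAL" "$PROJECT"

# A change is made in the sandbox and pushed...
echo "publish settings v2" > env.yml
git commit -am "Update publish settings"
git push origin main

# ...and each project pulls only when it is ready; a project in a
# critical phase can simply hold off pulling (a "freeze").
git -C "$PROJECT" pull
```

The per-project pull is what gives the distribution control mentioned above: pushing to the central repo changes nothing in production until a project checkout chooses to pull.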

Managing this is not always trivial, but it is much more workable and visible than a collection of copy scripts.
A major upside is the history of changes you get, where you can see the difference between any two points. This can make tracking bugs down significantly easier.