We are aiming to ingest all the turnover we receive from clients into Shotgun as Published Files.
The question came up whether Shotgun could still run into performance issues as the amount of data grows.
I am under the impression that this used to be an issue around 2012/2013 but is no longer the case.
Anything I should consider in this case?
Both our infrastructure and our application code have changed drastically since that time period, with a strong emphasis on performance. I would dare say ingesting your turnovers should be fine.
One thing to keep in mind is that if you add fields on entities, these new fields aren’t indexed by default in the database. This isn’t done automatically because we don’t know whether these new fields will be used in filters, and indexing has a database cost associated with it, which makes indexing everything counter-productive.
This could lead to some read performance issues if you have a lot of custom fields and they are used heavily in filtering, grouping, and/or sorting. If you do experience this, I suggest you contact support so we can profile your data and requests and see if we can implement some mitigation measures in line with your data and query profiles.
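To make the "used in filters" point concrete, here is a minimal sketch of the kind of query where an unindexed custom field matters. The filter syntax follows the `shotgun_api3` Python API; the custom field name (`sg_vendor`), project id, and values are hypothetical placeholders, not anything from your site.

```python
# Each Shotgun filter is a [field, relation, value] triple. Filtering on
# an unindexed custom field like the hypothetical "sg_vendor" below is
# exactly the kind of query that can slow down as the data set grows.
filters = [
    ["project", "is", {"type": "Project", "id": 123}],  # hypothetical id
    ["sg_vendor", "is", "Acme VFX"],  # hypothetical custom field
]
fields = ["code", "path", "created_at"]

# With a connected Shotgun handle, this would run as:
# import shotgun_api3
# sg = shotgun_api3.Shotgun("https://yourstudio.shotgunstudio.com",
#                           script_name="ingest_script", api_key="...")
# published_files = sg.find("PublishedFile", filters, fields)
```

If a query like this is one you run constantly, that custom field is a good candidate to mention to support when asking about indexing.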
Thanks for the quick reply!
I wasn’t planning to add many custom fields, so we should be fine then!
Just wanted to double-check to cover myself.
Still, if your turnovers are image sequences I’d suggest adding those as movies or one record per sequence instead of one record per image… but I’m sure you’re way ahead of me on that one.
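The one-record-per-sequence suggestion above can be sketched as follows. The path convention (a `%04d` frame token) is a common one, and the field values and ids here are hypothetical; the `create` call follows the `shotgun_api3` Python API.

```python
# One PublishedFile record for the whole image sequence: the %04d token
# stands in for the frame number, so a 240-frame plate is still a single
# record rather than 240 of them.
publish_data = {
    "project": {"type": "Project", "id": 123},  # hypothetical id
    "code": "sh010_plate_v001",                 # hypothetical publish name
    "path": {
        "local_path": "/turnover/sh010/plate/sh010_plate_v001.%04d.exr"
    },
    "description": "Client turnover plate, frames 1001-1240",
}

# With a connected Shotgun handle, this would run as:
# published_file = sg.create("PublishedFile", publish_data)
```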
( Yeah, I’m being a bit facetious. It’s Friday, gotta have a bit of fun. )
On a more serious note, the indexing caveat applies to some default fields too, as not all of them are indexed. So don’t hesitate to bring up specific queries or pages you feel are inordinately slow with the support team.
Great, good to know indexing can be turned on for extra fields if needed!