Are there any file storage limitations on hosted sites?

Hi, community!
We would like to use Shotgrid to allow outsourced artists to upload ABC caches to the site and download 'em afterwards to our local storage. As those would be simulation caches, chances are, they are about 3-15 Gb each. Are there storage limits for hosted Shotgrid sites? We would cleanup “synced” chaches to keep the SG instance as performant as possible.

Hi Dietmar, I’m not aware of any advertised storage limits. SG stores all uploaded files (anything that can be represented as a file entity/attachment) in an S3 bucket, and AFAIK anything uploaded there is persistent; there’s no mechanism for deleting it (at least none that SG exposes). Since the transfer protocol is HTTP, it’s slow, so it’s not ideal for impatient artists or time-sensitive uploads of very large files (depending on the size of your alembics). That’s just as well, since from what I can tell SG politely discourages using this technique for storing or transferring large digital production assets, even though nothing really prevents you from doing so.

If you were going to attempt this at scale for a large (remote) artist pipeline, you might want to consider a file transfer solution like FTP/SFTP, Signiant, or Aspera. You can also refer to this excellent thread from @philip.scadding on how to implement a remote publishing workflow like the one you’ve described: Sharing published files via a cloud storage solution
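For reference, this is roughly what that HTTP-based upload/download path looks like through the Python API (shotgun_api3). It's just a sketch: the site URL, script credentials, Version id, and file paths are placeholders, and at 3-15 GB per cache the transfer time is exactly where the speed concern above kicks in:

```python
# Rough sketch of pushing a cache through the hosted site's HTTP upload path.
# Site URL, script credentials, Version id, and paths below are placeholders.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",  # placeholder site URL
    script_name="cache_transfer",            # placeholder script name
    api_key="xxxx",                          # placeholder API key
)

# Upload the ABC cache; it travels over HTTPS and ends up as an Attachment
# (backed by S3 on hosted sites). Returns the new Attachment id.
attachment_id = sg.upload("Version", 1234, "/tmp/sim_cache_v001.abc")

# Later, pull it back down to local storage over HTTPS.
sg.download_attachment(
    {"type": "Attachment", "id": attachment_id},
    file_path="/mnt/local_storage/sim_cache_v001.abc",
)
```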


Thanks! Exactly what I was looking for. We are not using SG integrations, but the article you’ve linked is very helpful.