Batching Webhooks

Wondering if the discussion regarding a batching operation on Webhooks came up in discovery while you were designing them. It’s not uncommon for an event service to get logjammed with multiple events of the same type, e.g. a user went to a page, selected 100 tasks, and changed the status. You would expect all the webhook deliveries to be sent at roughly the same time. Having a batching attribute when configuring the hook would certainly clean up the event log.

Another bonus to having batch attributes and operations would be simplifying serverless integrations and lowering overall cold start times. I’d gladly wait a few extra seconds for my webhook to process a batch update on 100 tasks rather than wait through cold starts on 100 separate Lambda jobs. Complexity also becomes an issue: when you flood Lambda with requests like this, you are probably going to need provisioned concurrency and an SQS queue to burst appropriately. As a result, the democratization of serverless is lost, and your time, effort, and costs to implement skyrocket. Certainly, I can see scenarios where the jobs would need to be separated due to Lambda compute times. I may have jumped the curve with this discussion, but I’m hoping you can catch up with me. This could be the type of feature that makes Webhooks and serverless from SG more accessible to the masses.
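For illustration, here’s the kind of handler a batched delivery would enable: one Lambda invocation (one cold start) processing every task update in the batch, instead of one invocation per task. The payload shape and field names below are hypothetical, not anything SG has published:

```python
def lambda_handler(event, context):
    """Hypothetical Lambda handler for a batched webhook delivery.

    One invocation processes every task update in the batch,
    rather than paying a cold start per individual event.
    """
    updated = []
    for delivery in event.get("deliveries", []):
        # Apply the status change for each task in the batch.
        updated.append(delivery["entity_id"])
    return {"updated_count": len(updated)}
```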

4 Likes

Hi Romey,

This is very useful feedback. I’ll ask our experts if there is anything they can share.

Loney

2 Likes

Thinking about this a bit more. If SG is unable to introduce batching operations, maybe I can configure our AWS webhook broker with a Step Function to gather all the requests and reprocess them in bulk, rather than one at a time. However, my concern here is that I use the session_uuid to update the website when the changes happen. I fear that if I batch-process all the events, I would lose the ability to auto-update the page. Thoughts?
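One way the buffering approach could keep the per-session page updates: group the gathered events by session_uuid before reprocessing, so each session’s refresh can still be dispatched. A rough sketch, with the event shape assumed purely for illustration:

```python
from collections import defaultdict

def group_by_session(events):
    """Group buffered webhook events by session_uuid so page-update
    notifications can still be dispatched per session after batching.
    Events without a session_uuid are collected under the None key."""
    groups = defaultdict(list)
    for event in events:
        groups[event.get("session_uuid")].append(event)
    return dict(groups)
```

Each group could then trigger one page refresh per session, rather than one per event.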

Hi Romey,

Wanted to say thanks for requesting this, and following up with more detail recently.
We’re still in the vetting phase, but we’ve made some promising progress on offering batched event payloads.
A quick outline of some current considerations:

  • An additional config checkbox per webhook to “Allow batched deliveries”.
  • A JSON structure for the batched delivery payload, accounting for payload size limitations.
  • A limit on the number of events per delivery payload.
  • A possible increase to the HTTP request timeout (currently 6 seconds for all HTTP requests), dependent on the number of events included.
  • Throttling algorithm accommodations for a mix of batched and non-batched webhook configurations.

During this phase, we’d welcome perspectives from anyone in this community in this thread.

Hi Zoe,

Thanks for the follow-up. It felt like my request went into the abyss. I’d certainly like to test this first hand. I’m guessing the current functionality is not in a releasable state, but per your notes I think it’s on track. It would be very helpful if you could share some example responses and/or payloads here so we can better visualize what the resulting payload would look like. Having the flexibility to determine the size of the chunks would be great.

Of note, we use a notification platform called Sentry.io to help with error detection and notifications from AWS. It’s pretty awesome. It would be great if Webhooks gave us the ability to inject our own notification service like Sentry. This seems particularly relevant here: what happens if we accidentally make the size/number of events too large? Is the issue then with SG sending batches to AWS, or with AWS?

Thank you for sharing some progress on the board. We are eager for this functionality, and need it to assist with large event-based operations. Thank you.

Romey

1 Like

Checking in and seeing how this effort is going. Any updates as to when we might see something in the form of a demonstration?

Hi!
We are actively working on this. In fact, we’re in the final dev phase so it should be available very soon!

Hi all, sharing a preview of the format we’re implementing to support batched deliveries.

Non-batched-deliveries Webhooks

  • timeout allowance is 6 seconds per delivery
  • i.e. the webhook endpoint must respond to each request within 6 seconds

Batched-deliveries Webhooks

  • timeout allowance is the greater of 6 seconds or 1 second per event in the batch
  • throttling limits still apply (1 minute of webhook endpoint response time per minute per Shotgun site, across all webhooks)

A word of caution: if you choose to use batched deliveries, we recommend designing your callback endpoint to guarantee it always responds much faster than 1 second per event. Otherwise you will be at increased risk of timeouts and webhook failures when batches are large.
For webhooks that take on the order of 1 second per event to respond, there is no significant benefit to batching, because in that case webhook response time, not delivery overhead, is the main performance factor.
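The timeout rules above can be expressed as a small helper. This is just a sketch of my reading of those rules, not official SG code:

```python
def timeout_allowance_seconds(event_count, batched):
    """Timeout allowance per the rules above: non-batched deliveries
    always get 6 seconds; batched deliveries get the greater of
    6 seconds or 1 second per event in the batch."""
    if not batched:
        return 6
    return max(6, event_count)
```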

Comparison of Webhook Delivery Formats

Non-batched-deliveries Webhook Message Body (always 1 delivery):

{
  "data":{
    "id":"119.110.0",
    "event_log_entry_id":479004,
    "event_type":"Shotgun_Asset_Change",
    "operation":"update",
    "user":{"type":"HumanUser","id":24},
    "entity":{"type":"Asset","id":1419},
    "project":{"type":"Project","id":127},
    "meta":{
      "type":"attribute_change",
      "attribute_name":"code",
      "entity_type":"Asset",
      "entity_id":1419,
      "field_data_type":"text",
      "old_value":"Cypress test asset for Webhooks deliveries",
      "new_value":"Revised test asset for Webhooks deliveries"
    },
    "created_at":"2021-02-22 17:40:23.202136",
    "attribute_name":"code",
    "session_uuid":null
  },
  "timestamp":"2021-02-22T17:40:27Z"
}

Batched-deliveries Webhook Message Body (may contain 1 to 50 deliveries):

Note that the delivery format for each event is identical, whether batched or unbatched. However, in the batched case, the events are inside an array (even if there is only 1 event in the batch).

{
  "timestamp":"2021-02-22T18:04:40.140Z",
  "data":{
    "deliveries":[
      {
        "id":"170.141.0",
        "event_log_entry_id":480850,
        "event_type":"Shotgun_Asset_Change",
        "operation":"update",
        "user":{"type":"HumanUser","id":24},
        "entity":{"type":"Asset","id":1424},
        "project":{"type":"Project","id":132},
        "meta":{
          "type":"attribute_change",
          "attribute_name":"code",
          "entity_type":"Asset",
          "entity_id":1424,
          "field_data_type":"text",
          "old_value":"Cypress test asset for Webhooks deliveries",
          "new_value":"Revised test asset for Webhooks deliveries"
        },
        "created_at":"2021-02-22 18:04:39.198641",
        "attribute_name":"code",
        "session_uuid":null
      },
      {
        "id":"170.141.1",
        "event_log_entry_id":480851,
        "event_type":"Shotgun_Asset_Change",
        "operation":"update",
        "user":{"type":"HumanUser","id":24},
        "entity":{"type":"Asset","id":1424},
        "project":{"type":"Project","id":132},
        "meta":{
          "type":"attribute_change",
          "attribute_name":"description",
          "entity_type":"Asset",
          "entity_id":1424,
          "field_data_type":"text",
          "old_value":null,
          "new_value":"Some other *description*"
        },
        "created_at":"2021-02-22 18:04:39.212032",
        "attribute_name":"description",
        "session_uuid":null
      }
    ]
  }
}
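Given the two formats above, an endpoint can normalize both into a single code path. A hedged sketch (the function name is mine, and it assumes only the payload shapes shown above):

```python
import json

def extract_events(request_body):
    """Return a list of event dicts from a webhook request body,
    whether or not batched deliveries are enabled.

    Batched payloads carry events under data.deliveries; non-batched
    payloads carry a single event directly under data."""
    payload = json.loads(request_body)
    data = payload["data"]
    if "deliveries" in data:
        return data["deliveries"]
    return [data]
```

This lets the same downstream handler run per event regardless of whether the “Allow batched deliveries” option is enabled on the webhook.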

The option to enable batched deliveries is now live on production. Take note of the change in request payload that will occur if you choose to enable this on an existing webhook!

Woot Woot!! Thank you. And we are just about to dive in and use it to help with custom AWS S3 uploads from our render farm.

1 Like

Actual documentation for batched deliveries is now up: https://developer.shotgunsoftware.com/e7890fc8

Hi everyone!
I was just wondering if anyone had tried batch webhooks and could report how they are working for you.

Thanks!

-Stéphane

Yes, we recently retooled some of our framework to accommodate batches.

1 Like