Following on from Stephen's debatching sample, I thought I'd mention a side-effect of envelope debatching that can cause problems if it isn't appreciated in the design. One of the strengths of envelope debatching over XPath debatching in an orchestration is that the resulting messages are processed in parallel, which can lead to performance gains. If an envelope containing 10 messages is sent through a debatching pipeline (e.g. XMLReceive), 10 messages are delivered to the MessageBox, and 10 orchestration instances are created.
If each debatched message is picked up by an orchestration that in turn consumes a web service, all 10 requests are made in parallel. That's fine when the envelope contains ten messages; not so fine when it contains 1,000. This approach can turn BizTalk into a very effective denial-of-service launch pad, as IIS on our (recovering) development server can testify.
We still favour this approach, but decided to mitigate the burst load by spacing the requests out over a configurable period of time. This is achieved by applying a random delay in the orchestration before the request is made, using a period stored in BTSNTSvc.exe.config. For example, to spread 1,000 requests over 5 minutes, the orchestration picks a random number between 0 and 300 seconds and applies the corresponding delay.
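A minimal sketch of the delay calculation as it might back an orchestration's Delay shape. The `RequestSpreadSeconds` key name and the `RequestSpreader` class are assumptions for illustration; the real setting lives in BTSNTSvc.exe.config under `appSettings`:

```csharp
using System;
using System.Configuration;

// Hypothetical helper called from an orchestration expression shape.
// Assumes BTSNTSvc.exe.config contains:
//   <appSettings>
//     <add key="RequestSpreadSeconds" value="300" />
//   </appSettings>
public static class RequestSpreader
{
    public static TimeSpan GetRandomDelay()
    {
        // Read the configurable spread period; fall back to 300 s if absent.
        string setting = ConfigurationManager.AppSettings["RequestSpreadSeconds"];
        int spreadSeconds;
        if (!int.TryParse(setting, out spreadSeconds))
        {
            spreadSeconds = 300;
        }

        // Pick a random offset in [0, spreadSeconds); the orchestration's
        // Delay shape is set to this TimeSpan before the web service call.
        Random random = new Random();
        return TimeSpan.FromSeconds(random.Next(0, spreadSeconds));
    }
}
```

The returned `TimeSpan` can be assigned directly to a variable consumed by the Delay shape, so the spread period can be retuned per environment without redeploying the orchestration.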
This works extremely well, and has allowed us to fine-tune our implementation to fit the environment, combining the power of parallel message processing with a more even spread of the load.
A couple of additional points worth noting:
1. As with all envelope debatching scenarios, the debatched messages are processed in no guaranteed order.
2. System.Random uses a time-based seed with roughly millisecond resolution, meaning that two orchestration instances that each create a new Random within the same millisecond (it does happen, apparently) will be seeded identically and pick the same delay. This can be mitigated by serving all the random numbers from a singleton, if it's really necessary.
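The singleton mitigation might be sketched as follows: a single shared Random instance, seeded once, so concurrent callers draw successive values from one sequence rather than each seeding a fresh instance from the same tick count. The class and member names are illustrative, and the lock is needed because Random is not thread-safe:

```csharp
using System;

// Illustrative singleton: all callers draw from one seeded sequence,
// so two requests arriving in the same millisecond still advance the
// sequence and get different delays.
public static class SharedRandom
{
    private static readonly Random instance = new Random();
    private static readonly object sync = new object();

    public static int Next(int minValue, int maxValue)
    {
        // Random is not thread-safe; serialize access across orchestrations
        // running in the same host process.
        lock (sync)
        {
            return instance.Next(minValue, maxValue);
        }
    }
}
```

An orchestration would then call `SharedRandom.Next(0, spreadSeconds)` instead of newing up its own Random, with the caveat that the singleton is only shared within a single BizTalk host instance process.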