When working with Azure Table Storage and needing to bulk insert a number of records, it's more efficient and reliable to insert a batch of records in one step rather than one at a time.

On a recent project we've been using table storage as a location to record the log for an import operation migrating information from one content management system into another. As each field was imported, a couple of log lines were recorded, which were useful for auditing and debugging. Rather than inserting each log line as it was generated, we batched them up and then loaded them into table storage in the finally block of a try/catch that sits around the import logic. In this way, we store all the log information in one go.

For a few imports though, we found this error being recorded in the application logs:

Error processing message, message: An error has occurred. Exception message: The maximum number of operations allowed in one batch has been exceeded.
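The cause is a documented limit: a single table storage batch can contain at most 100 operations, and all entities in a batch must share the same partition key. So before executing, the accumulated log entries need to be split into chunks of 100 or fewer. Here's a rough sketch of that pattern, assuming the Microsoft.Azure.Cosmos.Table SDK (the same API shape exists in the older Microsoft.WindowsAzure.Storage package). The ImportLogEntity type and WriteLogs method are illustrative names rather than the project's actual code:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.Azure.Cosmos.Table;

// Hypothetical log entity; the real import code will have its own.
public class ImportLogEntity : TableEntity
{
    public ImportLogEntity() { }

    public ImportLogEntity(string importId, string rowKey)
        : base(importId, rowKey) { }

    public string Message { get; set; }
}

public static class ImportLogWriter
{
    // Table storage rejects batches of more than 100 operations.
    private const int MaxBatchSize = 100;

    public static void WriteLogs(CloudTable table, IReadOnlyList<ImportLogEntity> entities)
    {
        // All entities in a single batch must share the same partition key,
        // so group by partition key first, then chunk each group into
        // batches of at most 100 operations.
        foreach (var partition in entities.GroupBy(e => e.PartitionKey))
        {
            foreach (var chunk in partition
                .Select((entity, index) => new { entity, index })
                .GroupBy(x => x.index / MaxBatchSize, x => x.entity))
            {
                var batch = new TableBatchOperation();
                foreach (var entity in chunk)
                {
                    batch.Insert(entity);
                }

                table.ExecuteBatch(batch);
            }
        }
    }
}
```

With this in place, an import that generates several hundred log lines results in a handful of batch calls rather than one oversized batch that the service rejects.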