How to Automate the Backup of Azure Sentinel Tables to Long-term Storage Using Cloud Shell

Azure Sentinel customers with data-retention policies that require keeping data longer than Log Analytics allows often want to know how to move their Azure Sentinel tables to long-term storage. In a recent blog post, Matt Lowe talked about how to Move Your Azure Sentinel Logs to Long-Term Storage with Ease. That method uses an Azure Playbook to accomplish the task.

I also recently wrote about how to Export and Backup Azure Sentinel Tables locally to a .csv Using PowerShell.

But a more recent functionality addition to the Azure Monitor module allows you to automate the export using Cloud Shell.

The full details around this are located at: Manage data export rules for log analytics workspace.

Here’s how this works…

  • Step 2: Run Cloud Shell in Azure and create an Export Rule for your Azure Sentinel instance’s Log Analytics workspace using the following script…
az monitor log-analytics workspace data-export create -g <YourSentinelResourceGroup> --workspace-name <YourSentinelWorkspaceName> -n <GiveYourExportRuleaName> --destination <YourBlobCreatedbytheStorageAccount> --enable -t SecurityEvent

In the Cloud Shell script above, I’m choosing to back up only the SecurityEvent table. If I wanted to back up all tables, I’d replace -t SecurityEvent with --all true
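For reference, here’s the same create call sketched with the placeholders pulled out into shell variables (every name below is a hypothetical stand-in, not from my environment). The -t/--tables parameter also accepts a space-separated list, so one rule can cover several tables without exporting everything:

```shell
# All names below are hypothetical placeholders -- substitute your own.
RG="YourSentinelResourceGroup"
WS="YourSentinelWorkspaceName"
RULE="SentinelTableExport"
DEST="/subscriptions/<YourSubscriptionId>/resourceGroups/$RG/providers/Microsoft.Storage/storageAccounts/<YourStorageAccount>"

# Echoed here for safety; drop the leading 'echo' to run it in Cloud Shell.
echo az monitor log-analytics workspace data-export create \
  -g "$RG" --workspace-name "$WS" -n "$RULE" \
  --destination "$DEST" --enable -t SecurityEvent Syslog
```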

BTW: You need to use the FULL destination in the script value, which means you need to obtain the full destination path from the blob storage. That path is essentially the Storage Account Resource ID, and it can be located in the Properties of the Storage Account you created in Step 1. Just copy the path to the clipboard and paste it into the Cloud Shell script.

Getting the Storage Account Resource ID
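If you’d rather not click through the portal, the same Resource ID can be read from the CLI. A minimal sketch, with a resource group and storage account name of my own invention:

```shell
# Hypothetical names -- substitute your own resource group and account.
RG="YourSentinelResourceGroup"
SA="yoursentinelstorage"

# Reads the full Storage Account Resource ID (the value --destination needs):
#   az storage account show -g "$RG" -n "$SA" --query id -o tsv
# The returned ID has this shape:
DEST="/subscriptions/<YourSubscriptionId>/resourceGroups/$RG/providers/Microsoft.Storage/storageAccounts/$SA"
echo "$DEST"
```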

Creating and enabling this Export Rule automates the process: the SecurityEvent table (or whichever tables you choose) is backed up to the Blob storage container every hour after the Export Rule is created. Note that the first time the new Export Rule runs, it backs up the entire Sentinel table (or tables). Each subsequent run is an incremental backup (just the new data).

Hourly backups of the SecurityEvent table

The SecurityEvent table will continue to be backed up to Blob storage until you issue another command to delete the Export Rule:

az monitor log-analytics workspace data-export delete --name <YourExportRuleName> --resource-group <YourSentinelResourceGroup> --workspace-name <YourSentinelWorkspaceName>
Delete the Export Rule

NOTE: Deleting the Export Rule does not delete the backups.

Additionally, make sure you check out some of the other capabilities on the az monitor log-analytics workspace data-export page. You can quickly update your original Export Rule, too. This saves time when you simply want to make adjustments, without needing to delete the original rule and generate a second, replacement rule.
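As a sketch of what that looks like (the rule and table names here are hypothetical), an update can change the table list or temporarily disable the rule in place:

```shell
# Hypothetical names -- substitute your own.
RG="YourSentinelResourceGroup"
WS="YourSentinelWorkspaceName"
RULE="SentinelTableExport"

# Echoed for safety; drop the leading 'echo' to run in Cloud Shell.
# Change which tables the existing rule exports:
echo az monitor log-analytics workspace data-export update \
  -g "$RG" --workspace-name "$WS" -n "$RULE" -t SecurityEvent Syslog
# Or pause the rule without deleting it:
echo az monitor log-analytics workspace data-export update \
  -g "$RG" --workspace-name "$WS" -n "$RULE" --enable false
```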

With the table (or tables) backed up to the Blob container, you can download the stored data through the Azure portal and open it in Visual Studio (or whatever other tool you use to read JSON data).

Download JSON
Looking at the backed-up data in Visual Studio
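The download can also be scripted instead of done through the portal. A hedged sketch (the storage account name is mine, and since the export creates its own containers, list them first rather than guessing at names):

```shell
# Hypothetical storage account name -- substitute your own.
SA="yoursentinelstorage"

# Echoed for safety; drop the leading 'echo' to run in Cloud Shell.
# See which containers the export created:
echo az storage container list --account-name "$SA" --output table
# Then pull a container's blobs down for local inspection:
echo az storage blob download-batch --account-name "$SA" \
  --source "<YourExportContainer>" --destination ./sentinel-backup
```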

P.S. In this example I’ve chosen to export to Blob storage, but you can use this same method to export to an Event Hub Namespace or an Event Hub, which lends itself to some additional, automated functionality for exporting Azure Sentinel data to third-party SIEMs, if you think about it.

[Want to discuss this further? Hit me up on Twitter or LinkedIn]
