A few months back I started using Azure Tables as a quick, cheap, and simple database for a project at work. I was recently refactoring my main PowerShell script, which spits data out into an Azure Table for later Power BI reporting. Thankfully, even though it was close to midnight, my years of “ops management brain” kicked in before I pushed the “go” button on the refactored script: it dawned on me that I should be testing my updated script against a copy of my table rather than the only copy of the data I had.
It being late and me being tired, it didn’t occur to me to just try Azure Storage Explorer, which lets you easily copy a table from one storage account to another. Instead, I turned to PowerShell. What I came up with is a simple PowerShell function that creates a new empty table, reads the contents of your source table into an array, and then pushes that array into the new empty table. Using PowerShell 7 parallel processing, it works pretty quickly… granted, my table only has about 2,000 rows. With that said, these PowerShell snippets require PowerShell 7; click here for install instructions.
I am planning on some later automation (because I now need a good reason for having done this via PowerShell), so I created a SAS token at the storage account level and used that for authentication. That said, I figure others may just use this ad hoc with their Azure AD login to create the storage account context needed – so that is how the below is written up: no SAS token, just log in and go.
First, in a PowerShell session, make sure you have the AzTable module installed.
This is a completely separate/independent module from the Az PowerShell module. To install it:
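Assuming you are installing from the PowerShell Gallery, something like this should do it (the -Scope parameter is optional):

Install-Module AzTable -Scope CurrentUser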
Here is the Magic Function
Function Copy-AzTableToNewTable {
    # The function name is just a placeholder – call it whatever you like
    param (
        $SubscriptionName, $SAResourceGroupName, $StorageAccountName,
        $SourceTable, $NewDestinationTable
    )
    # Build a storage context from your logged-in Azure session
    Select-AzSubscription $SubscriptionName
    $StorageAccount = Get-AzStorageAccount -ResourceGroupName $SAResourceGroupName -Name $StorageAccountName
    $ctx = $StorageAccount.Context
    # Create the new (empty) destination table
    New-AzStorageTable -Name $NewDestinationTable -Context $ctx
    # Grab the CloudTable objects the AzTable cmdlets work against
    $SourceCloudTable = (Get-AzStorageTable -Name $SourceTable -Context $ctx).CloudTable
    $DestinationCloudTable = (Get-AzStorageTable -Name $NewDestinationTable -Context $ctx).CloudTable
    # Read every row of the source table into an array
    $SourceCloudTableContent = Get-AzTableRow -Table $SourceCloudTable
    # Push each row into the new table in parallel (PowerShell 7 feature)
    $SourceCloudTableContent | ForEach-Object -ThrottleLimit 500 -Parallel {
        $DestinationCloudTableUpdate = $using:DestinationCloudTable
        $_ | Update-AzTableRow -Table $DestinationCloudTableUpdate
    }
}
Run the Function and Supply Parameters
It’s hopefully pretty self-explanatory, but after loading the function and logging into Azure (Add-AzAccount), you would run it like this:
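(The function name here matches the placeholder used above, and the parameter values are just examples – swap in your own subscription, resource group, storage account, and table names.)

Copy-AzTableToNewTable -SubscriptionName "My Subscription" -SAResourceGroupName "my-resource-group" -StorageAccountName "mystorageaccount" -SourceTable "MySourceTable" -NewDestinationTable "MySourceTableCopy"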
You should then see a bunch of streaming information scroll across your console as it creates the new table and then uploads all the data into it. Be careful not to overwrite or append to an existing table (i.e., make sure your new table name is unique).
I must admit this feels a tad like a “Rube Goldberg” machine, as Storage Explorer would have been a much better fit for my use case. That said, I enjoyed the puzzle and the learning experience.
As a side note, with a little modification you could easily use this function to create a new Azure Table and upload the contents of really -any- PowerShell object array generated in code. I tend to use these kinds of arrays frequently in the projects I work on, and reading/writing to Azure Tables is a nice alternative to using a local CSV file.
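For instance, a minimal sketch along these lines would push an arbitrary array of objects into a new table – assuming you already have a storage context ($ctx) like the one built inside the function, and that $MyObjects, the table name, and the PartitionKey/RowKey choices are placeholders to adapt to your own data:

# Assumes $ctx is an existing storage context, as created inside the function above
New-AzStorageTable -Name "MyObjectTable" -Context $ctx
$TargetCloudTable = (Get-AzStorageTable -Name "MyObjectTable" -Context $ctx).CloudTable
foreach ($obj in $MyObjects) {
    # Flatten the object's properties into a hashtable for the new row
    $props = @{}
    $obj.PSObject.Properties | ForEach-Object { $props[$_.Name] = $_.Value }
    # Add-AzTableRow (AzTable module) inserts a new row; pick PartitionKey/RowKey values that make sense for your data
    Add-AzTableRow -Table $TargetCloudTable -PartitionKey "MyData" -RowKey ([guid]::NewGuid().ToString()) -Property $props | Out-Null
}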
Cheers!