Using PowerShell to Write Data to Azure Table Storage

As part of my continued quest to over-engineer and complicate things, I decided to update a script I’d recently written that performs a regular speed test of my Internet connection, so that it stores the results in Azure Table Storage rather than a local CSV file.

This was the first time I’d used Azure Table Storage, and I was pleasantly surprised by how easy it was to integrate into my script. After creating the Storage Account via the Azure Portal and creating a table (named “perf”), it was simply a case of doing the following.

Step 1 – Install the PowerShell Modules for Azure Table Storage

Install-Module Az      # Provides the Az.Storage cmdlets (New-AzStorageContext, Get-AzStorageTable)
Install-Module AzTable # Provides the table row cmdlets (Add-AzTableRow, Get-AzTableRow)

Step 2 – Connect to the Azure Storage Table

Before attempting to connect to the table, I needed to obtain the Access Key. To do this via the Azure Portal, I selected the Storage Account, then Access Keys, hit Show Keys, and took a copy of the key.
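(The key can also be retrieved without the portal; a quick PowerShell sketch, assuming you’re already signed in via Connect-AzAccount and substituting your own resource group and account names for the placeholders:)

$Keys = Get-AzStorageAccountKey -ResourceGroupName "my-resource-group" -Name "mystorageaccount" # Both names here are placeholders
$Key = $Keys[0].Value # Storage accounts have two keys; either will do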

I then needed to connect to the table using PowerShell (using the key obtained above). To do this, I ran the following:

$StorageAccountName = "Storage Account Name" # Enter the name of the storage account e.g. "brendgstorage" (storage account names are lowercase)
$Key = "Access Key" # Use the Access Key obtained via the Azure Portal
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $Key # Connect to the Storage Account
$Table = (Get-AzStorageTable -Context $StorageContext | Where-Object {$_.Name -eq "perf"}).CloudTable # Connect to the perf table
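(Incidentally, with the $StorageContext in hand, the table itself could also have been created from PowerShell rather than the portal – a one-line sketch:)

New-AzStorageTable -Name "perf" -Context $StorageContext # Creates the "perf" table in the storage account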

Once this completed without errors, I inspected the $Table variable, which confirmed that I had successfully connected to the “perf” table.
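For a firmer check than eyeballing the variable, a minimal sketch along these lines works (using the variables defined above):

# $Table is $null if Get-AzStorageTable matched no table named "perf"
if ($null -eq $Table)
{
    Write-Error "Table 'perf' not found in storage account '$StorageAccountName'"
}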

Step 3 – Update Speed Test Script

I then needed to incorporate the above into the Internet test script and add logic to write the output of the Speedtest CLI tool to the Azure Table (perf) rather than to a CSV file.

The updated script can be found below. It first connects to the table named perf, then enters a continual loop that runs speedtest-cli and adds the output to the perf table using Add-AzTableRow, repeating every 5 minutes. As I like to live dangerously, there’s no logic to handle failures 😀.

As this is a super-simple table, I’m using a single partition key (“1”) and Ticks as the row key. I also manually specify the data type for each of the properties, as the default behaviour of Add-AzTableRow is to add everything as a String. I used Double as the data type for the Ping, Download and Upload properties to enable me to query the data numerically – for example, to show all entries where Ping was greater than 50 (ms), as in the sketch below.
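Here’s a minimal sketch of such a query using the AzTable module’s Get-AzTableRow with an OData filter (it assumes the $Table variable from Step 2):

Get-AzTableRow -Table $Table -CustomFilter "Ping gt 50.0" # All rows where Ping exceeded 50 ms; only works because Ping is stored as Edm.Double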

I do some string manipulation to pull out the values from the output of speedtest-cli ($SpeedTest), as --simple returns the results as three lines of text (Ping, Download and Upload), which PowerShell captures as an array of strings, one element per line.
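To make that concrete, here’s the shape of the --simple output (the numbers are made up for illustration) and how the split pulls each value out:

# speedtest-cli --simple emits three lines, e.g.:
#   Ping: 23.456 ms
#   Download: 94.32 Mbit/s
#   Upload: 18.75 Mbit/s
$SpeedTest = /usr/local/bin/speedtest-cli --simple
$SpeedTest[0].Split(" ")[1] # "23.456" – the second space-delimited token of the Ping line

With that in mind, the full updated script follows.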

$StorageAccountName = "Storage Account Name"
$Key = "Key"
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $Key
$Table = (Get-AzStorageTable -Context $StorageContext | Where-Object {$_.Name -eq "perf"}).CloudTable

# Loop forever: run a speed test, write the results to the perf table, wait 5 minutes
while ($true)
{
    $PartitionKey = "1"
    $Time = Get-Date
    $SpeedTest = /usr/local/bin/speedtest-cli --simple
    # The "@odata.type" entries tell the Table service to store Ping, Download and Upload as Edm.Double rather than the default String
    Add-AzTableRow -Table $Table -PartitionKey $PartitionKey -RowKey (Get-Date).Ticks -Property @{
        "DateTime"            = $Time
        "Ping@odata.type"     = "Edm.Double"
        "Ping"                = $SpeedTest[0].Split(" ")[1]
        "Download@odata.type" = "Edm.Double"
        "Download"            = $SpeedTest[1].Split(" ")[1]
        "Upload@odata.type"   = "Edm.Double"
        "Upload"              = $SpeedTest[2].Split(" ")[1]
    }
    Start-Sleep -Seconds 300
}

The script has been running for a few days now, and I used Storage Explorer within the Azure Portal to view the table and confirm that data is being successfully collected. This made me realise that the DateTime property I add is redundant, as the Timestamp property stores this automatically on insert.
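As a quick sketch of that (again using the AzTable cmdlets, and assuming the $Table variable from earlier), reading a row back shows the service-populated timestamp:

# TableTimestamp is set by the Table service on insert, which is what makes my custom DateTime property redundant
Get-AzTableRow -Table $Table -PartitionKey "1" -Top 1 | Select-Object TableTimestamp, Ping, Download, Upload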

Understanding the Table service data model (REST API) – Azure Storage | Microsoft Docs was a useful reference document as I got to grips with Azure Table Storage.