• Using PowerShell to Write Data to Azure Table Storage

    As part of my continued quest to over-engineer and complicate things, I decided to update a script I’d recently written that performs a regular speed test of my Internet connection to store the results in Azure Table Storage rather than a local CSV file.

    This was the first time I’d used Azure Table Storage, and I was pleasantly surprised by how easy it was to integrate into my script. After creating the Storage Account via the Azure Portal and creating a table (named “perf”), it was simply a case of doing the following.

    Step 1 – Install the PowerShell Module for Azure Table Storage

    Install-Module Az
    Install-Module AzTable
    

    Step 2 – Connect to the Azure Storage Table

    Before attempting to connect to the table, I needed to obtain the Access Key. To do this via the Azure Portal, I selected the Storage Account and then Access Keys, and then hit Show Keys and took a copy of the key.

    I then needed to connect to the table using PowerShell (using the key obtained above). To do this, I ran the following:

    $StorageAccountName = "Storage Account Name" # Enter the name of the storage account e.g. "BrendgStorage"
    $Key = "Access Key" # Use the Access Key obtained via the Azure Portal
    $StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $Key # Connect to the Storage Account
    $Table = (Get-AzStorageTable -Context $StorageContext | where {$_.name -eq "perf"}).CloudTable # Connect to the Perf table
    

    Once this completed (without errors), I verified the $Table variable. This confirmed that I had successfully connected to the “perf” table.

    Step 3 – Update Speed Test Script

    I then needed to incorporate the above into the Internet test script and add logic to add the output of the Speedtest CLI tool to the Azure Table (Perf) rather than a CSV file.

    The updated script can be found below. In lines 1-4 it connects to the table named perf; it then runs a continual loop that runs speedtest-cli and adds the output to the perf table using Add-AzTableRow, repeating every 5 minutes. As I like to live dangerously, there’s no logic to handle failures 😀.

    As this is a super-simple table, I’m using a single partition key (“1”) and Ticks as the row key. I also manually specify the data type for each of the properties, as the default behaviour of Add-AzTableRow is to add everything as a String. I used Double as the data type for the Ping, Download and Upload properties to enable me to query the data – for example, to show all entries where Ping was greater than 50 (ms).
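    As a sketch of such a query (assuming the $Table variable from Step 2 is still connected), the AzTable module’s Get-AzTableRow cmdlet accepts an OData filter string, and the numeric comparison only works because the properties were stored as Edm.Double:

    # Assumes $Table holds the CloudTable object obtained in Step 2.
    # The filter is an OData expression; "gt" is greater-than, and the
    # literal must be numeric to compare against a Double-typed property.
    $SlowPings = Get-AzTableRow -Table $Table -CustomFilter "Ping gt 50.0"

    # Each returned row exposes the stored properties as members.
    $SlowPings | Select-Object TableTimestamp, Ping, Download, Upload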

    I do some string manipulation to pull out the values from the output of speedtest-cli ($SpeedTest), as it returns one line each for the Ping, Download and Upload results (which PowerShell captures as an array of strings).
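    As a sketch of that parsing (the sample lines below are illustrative values, not captured output – --simple prints one “Label: value unit” line per metric):

    # Illustrative sample of what speedtest-cli --simple returns.
    $SpeedTest = @(
        "Ping: 23.417 ms",
        "Download: 74.32 Mbit/s",
        "Upload: 18.90 Mbit/s"
    )

    # Splitting each line on spaces and taking the second token isolates the number.
    $Ping     = $SpeedTest[0].Split(" ")[1]  # "23.417"
    $Download = $SpeedTest[1].Split(" ")[1]  # "74.32"
    $Upload   = $SpeedTest[2].Split(" ")[1]  # "18.90"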

    $StorageAccountName = "Storage Account Name"
    $Key = "Key"
    $StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $Key
    $Table = (Get-AzStorageTable -Context $StorageContext | where {$_.name -eq "perf"}).CloudTable
    
    while ($true)
    {
        $PartitionKey = "1"
        $Time = Get-Date
        $SpeedTest = /usr/local/bin/speedtest-cli --simple
        # Build the entity properties, forcing Ping/Download/Upload to Edm.Double
        $Properties = @{
            "DateTime" = $Time
            "Ping@odata.type" = "Edm.Double";     "Ping" = $SpeedTest[0].split(" ")[1]
            "Download@odata.type" = "Edm.Double"; "Download" = $SpeedTest[1].split(" ")[1]
            "Upload@odata.type" = "Edm.Double";   "Upload" = $SpeedTest[2].split(" ")[1]
        }
        Add-AzTableRow -Table $Table -PartitionKey $PartitionKey -RowKey (Get-Date).Ticks.ToString() -Property $Properties
        Start-Sleep -Seconds 300
    }
    

    The script has been running for a few days now. I used Storage Explorer within the Azure Portal to view the table and confirm that data is being successfully collected. This made me realise that the DateTime property I add is redundant, as the Timestamp property stores this automatically on insert.
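    To sketch how this can be seen from PowerShell (assuming the $Table variable from Step 2), the AzTable module surfaces the service-maintained timestamp as TableTimestamp on returned rows:

    # Fetch a single entity; -Top limits the number of rows returned.
    $Row = Get-AzTableRow -Table $Table -Top 1

    # TableTimestamp is set by the Table service on insert/update,
    # so the DateTime property I add duplicates it.
    $Row.TableTimestamp
    $Row.DateTime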

    Understanding the Table service data model (REST API) – Azure Storage | Microsoft Docs was a useful reference document as I got to grips with Azure Table Storage.

  • The Joys of Unreliable Internet

    I’ve had a strange issue with my Internet for the last few months. It’s rock solid during the day and I have no issues at all; however, from around 8pm onwards it becomes unreliable – ping times go through the roof or I lose connectivity intermittently. This used to occur one night a week or so, but for the past couple of weeks it has been happening 2-3 times a week, which is seriously affecting my Netflix consumption 😀.

    I have an FTTP connection, and there doesn’t appear to be a fault with the fibre connection into my property, as the fibre connection light on the ONT is green when the Internet grinds to a halt. I reported this to my ISP, who requested I contact them when the issue is active so that they can perform some diagnostics.

    I decided to collect some data on the issue to help me identify any patterns, and also as evidence for my ISP. As I mentioned in a previous post, I have a lot of spare Raspberry Pis, so I decided to put one of them to good use!

    I connected the Pi directly via Ethernet to my router and wrote a quick and dirty PowerShell script that uses the Speedtest CLI Python script written by Matt Martz to perform a speed test of my connection every 5 minutes. Yes, you read that correctly – you can run PowerShell on the Raspberry Pi; here is a guide on how to set this up. I used PowerShell to call the Python script for no other reason than I’d never done it before, so it seemed like a good experiment.

    Below is the script that I ran. It uses the Speedtest CLI to perform a test every 5 minutes and writes the output to a CSV file.

    while ($true)
    {
        $Time = Get-Date
        $SpeedTest = /usr/local/bin/speedtest-cli --simple
        $Time.ToString() + "," + $SpeedTest[0] + "," + $SpeedTest[1] + "," + $SpeedTest[2]  >> /home/pi/Share/SpeedTest.csv 
        Start-Sleep -Seconds 300
    }
    

    Here is what the output looks like in Excel. I’m going to collect data for a few days before I crack open Power BI and do some analysis of the data.

  • More Raspberry Pi and Container Goodness!

    Next up in my quest to learn more about running containers on a Raspberry Pi was to test a container that I created a while ago when I was ramping up on Flask and Python. It’s a basic container that generates a list of 8 random exercises (from a pool of 26). I was in the process of putting together a new workout regime at the time, so this seemed like a perfect way to build something with practical use in my life.

    The first thing I needed to do was download the repo to my Raspberry Pi. To do this, I ran the following command (from within the directory I wanted to temporarily store the downloaded repo).

    git clone https://github.com/brendankarl/Containers

    Once I had the repo downloaded locally on my Pi, I needed to change into the directory that houses the specific container I was interested in (WorkoutGenerator), as the repo contains others.

    cd Containers/WorkoutGenerator

    I then needed to create the image using docker build; the command below references the Dockerfile within the repo and names the image “workoutgenerator”.

    docker build -t workoutgenerator ./

    Once this command completed, I could run the image, exposing port 80 on the container to the Raspberry Pi so that I could access it from within my network. If you are interested in exposing things externally, ngrok is a great free tool that I’d recommend taking a look at.

    docker run -d -p 80:80 workoutgenerator

    A quick peek into Docker using the Visual Studio Code extension and I could see the container running and the images added to support this.

    Finally, I launched a browser and hit the IP address of the Raspberry Pi to check that everything was running correctly – voila it was working!

    I refreshed the page a couple of times to verify that the exercises were updated.

    That’s enough containers for me today… I need a workout 😂.

  • Running Docker on a Raspberry Pi

    I’ve been playing around with Docker and containers for the last year or so, primarily by running Docker Desktop on my Windows 10 device and experimenting with Azure Container Instances. I even shared one of the containers that I created on GitHub – https://github.com/brendankarl/Containers, a super-advanced Workout Generator app 😀.

    As I have more Raspberry Pis than I care to admit, I’m always looking for new ways to use them and reduce the guilt I feel when I see them abandoned on my desk.

    I’d read that you could run Docker on a Raspberry Pi; however, I’d never got round to playing around with this… and to be honest, I expected it to be a bit of a palaver.

    I was pleasantly surprised how easy it was to get Docker installed and my first container running on a Pi – it took a mere six commands!

    sudo apt-get update && sudo apt-get upgrade
    curl -sSL https://get.docker.com | sh
    sudo usermod -aG docker ${USER}
    sudo pip3 install docker-compose
    sudo systemctl enable docker
    sudo docker run -d -p 80:80 hypriot/rpi-busybox-httpd

    This installs Docker and Docker Compose, enables Docker to start automatically on boot, and runs the https://github.com/hypriot/rpi-busybox-httpd image – a straightforward way to verify that Docker is working correctly (by running a lightweight web server). Once these commands finished executing, I launched a browser, connected to the IP of my Pi and was greeted with this – success!
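    As a quick sketch of how the same check can be scripted rather than eyeballed in a browser (assuming PowerShell is installed on the Pi, as in my speed test setup, and the port 80 mapping above):

    # List running containers started from the busybox httpd image.
    docker ps --filter "ancestor=hypriot/rpi-busybox-httpd"

    # Request the test page from the Pi itself; a StatusCode of 200 means
    # the container is serving traffic on the mapped port.
    (Invoke-WebRequest -Uri "http://localhost" -UseBasicParsing).StatusCode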

    As a side note, Visual Studio Code with the Remote Development and Docker extensions is a great way to do remote development and manage Docker on a Raspberry Pi from Windows or Mac.