• Running a Flask Web App within a container – lesson learnt! 🫙

Over the years I’ve created a few web apps in Python using the Flask Framework; a good (or not so good) example can be found here.

I typically host these within an Azure Web App. Recently I was experimenting with container instances within OCI (Oracle Cloud Infrastructure), so I set about porting one of my web apps to run within a container. I ran into a small issue that I thought I’d document here (mainly for my future self).

When deploying a Python web app to an Azure Web App that uses Flask, I use the following code to run the web app (at the bottom of the Python application.py file, which contains the code for the Flask web app):

    if __name__ == "__main__":
        app.run()
    

This works like a charm locally (running the web app on http://localhost:5000), and when published to an Azure Web App it runs on port 443 (https), for example https://(webappname).azurewebsites.net.

When running this app within a container, it failed miserably and the site didn’t render ☹️. Looking at the logs within OCI, this was because the container was listening on port 5000 (as it would typically do when running locally).

It turned out that I needed to update the application.py file to configure the port to listen on, overriding the default of port 5000 (as below – I used port 80/http to keep things simple).

    if __name__ == "__main__":
        app.run(host='0.0.0.0', port=80)
    

    This did the trick and my web app worked correctly! It looks like Azure (via Gunicorn) does some magic under the hood and overrides the default behaviour of listening on port 5000.
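
    A tidier option than hard-coding the port is to read it from an environment variable. This is just a minimal sketch of that idea – the PORT variable name is my assumption here (check what your container platform actually injects), with Flask’s default of 5000 as the fallback for local runs:

    ```python
    import os

    def resolve_port(default=5000):
        """Return the port the web app should bind to.

        Container platforms often inject the target port via an environment
        variable (PORT is an assumption here), so read it and fall back to
        Flask's default of 5000 for local development.
        """
        return int(os.environ.get("PORT", default))

    # In application.py this would then be used as:
    # if __name__ == "__main__":
    #     app.run(host="0.0.0.0", port=resolve_port())
    ```

    This way the same application.py runs locally, in an Azure Web App and in a container without edits.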

    Whilst I ran into this issue using OCI, the same would have happened running Docker locally, or on Azure, GCP or AWS – it was an issue with me, rather than OCI 🤦‍♂️.

  • Creating a Function in the Oracle Cloud (OCI) to help me stay fit 🏃‍♂️

    I’ve recently stepped out of my Microsoft comfort zone and have been experimenting with AWS, GCP and OCI. One of my favourite features of Azure is Azure Functions.

    I wrote an Azure Function during the pandemic as I needed a way to automagically generate a workout routine, as I could no longer attend my favourite circuit class – the code for this can be found here 🏋️‍♂️.

    This is an HTTP triggered Azure Function App that generates a list of exercises for a workout (from a pool of 26 different exercises); pass the query string exercises=(number) to the Function App URL to specify how many exercises you’d like included in the workout and the function app will work its magic 🪄.
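
    The core of the workout generator boils down to picking a number of distinct exercises at random. This is a simplified sketch of the idea rather than the actual function code – the exercise names below are illustrative (the real pool has 26):

    ```python
    import random

    # Illustrative pool -- the real function draws from 26 exercises.
    EXERCISES = [
        "press-ups", "squats", "burpees", "lunges", "plank",
        "star jumps", "sit-ups", "mountain climbers",
    ]

    def build_workout(count):
        """Pick `count` distinct exercises at random, capped at the pool size."""
        count = max(1, min(count, len(EXERCISES)))
        return random.sample(EXERCISES, count)
    ```

    The HTTP trigger then just parses the exercises query string parameter and returns the list.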

    As this is fairly simple, I thought I’d have a go at adapting it to run as a function within OCI. I put together a short video that walks through the process of creating a function app in OCI, deploying the code and then testing it. The walkthrough video can be found below, and the Python code used can be found here.

    I was pleasantly surprised at how straightforward this was; despite a few small hiccups, I managed to get this all done in less than a couple of hours ⏱️.

  • Using AI to play Typing of the Dead 🧟

    Some time ago I shared a Python script that I’d written that could complete the first level of Super Mario Land 🎮.

    Since then I’ve been thinking of other games that I could try to automate playthroughs of. One game that I’d never played (until recently) is Typing of the Dead: Overkill, which is basically House of the Dead but instead of shooting enemies manually, you type a word to kill them, improving your typing skills whilst playing – who needs Mavis Beacon ⌨️!

    As this is fairly simple in nature, it made me think that I could probably do something as follows to automate playing the game (using Python, of course):

    • Take a screenshot of the game (using PyAutoGUI)
    • Pass this screenshot to the Azure AI Vision service (using the OCR capabilities) to extract the text
    • Automate typing the text, again using PyAutoGUI
    • Rinse and repeat
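
    The steps above can be sketched roughly as follows. This is just a sketch of the approach, not the proof-of-concept script itself – the screenshot, OCR and typing steps are injected as plain callables, so the Azure AI Vision OCR call and the PyAutoGUI calls can be slotted in (or swapped for a local OCR engine later):

    ```python
    import time

    def automate_typing(screenshot, extract_text, type_text, delay=0.1):
        """One pass of the loop: grab a frame, OCR it, type the words back.

        screenshot, extract_text and type_text are injected callables --
        e.g. pyautogui.screenshot, an Azure AI Vision OCR call, and
        pyautogui.write respectively. Returns the words typed, for testing.
        """
        frame = screenshot()            # take a screenshot of the game
        words = extract_text(frame)     # extract the on-screen words via OCR
        for word in words:
            type_text(word)             # type each word to kill the enemy
        time.sleep(delay)               # brief pause, then rinse and repeat
        return words
    ```

    The real script would simply call this inside a while True loop.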

    I grabbed a copy of the game from Steam and set about putting my master plan into action!

    I managed to create a proof of concept for this (script available here), however it simply wasn’t performant enough, so I kept dying ☹️.

    I think moving the OCR processing from Azure to my local device would make this a workable solution and is definitely something I’ll look at in the future when I have time.

    Before I threw in the towel however, I thought I’d try a low-tech approach, which, to my astonishment, worked really well and can effectively complete the game without any manual user input!

    This approach simply runs a loop that types every letter on the keyboard, column by column, and then repeats. It’s not pretty, but it does the job! I put in a 10-second pause so that the script can be launched before the game (I have two monitors and had the script running on my second monitor).

    import pyautogui
    import time

    # Give yourself 10 seconds to switch focus to the game window
    time.sleep(10)

    # QWERTY keyboard, column by column -- covers all 26 letters
    while True:
        pyautogui.write("qazwsxedcrfvtgbyhnujmikopl")
    

    Here is a short video of the script in action:

  • Using PowerShell to list all Power Automate Flows and the Connectors they are using

    I recently needed to export a list of all Power Automate Flows from a tenant, along with details of any connectors they were using to read/write data, such as SharePoint Online, Dataverse and Outlook. I needed this in preparation for a tenant-to-tenant migration to aid with planning 📃.

    I put together the PowerShell script below, which outputs a list of all Flows along with their state (enabled or disabled) and the connectors that they use to a CSV file.

    This script requires the Power Apps administration PowerShell module to be installed; instructions on how to install this can be found here. Simply update the $FilePath variable (which sets the location to write the CSV file to) and then you are good to go 👍.

    # Location to write the CSV export to
    $FilePath = "D:\FlowsExport.csv"

    # Get all Flows in the tenant
    $Flows = Get-AdminFlow

    # Write the CSV header row
    "Name" + "," + "Enabled" + "," + "Connectors" | Out-File $FilePath

    ForEach ($Flow in $Flows)
    {
        # Re-query each Flow to get its full details, including connection references
        $FlowInfo = Get-AdminFlow -FlowName $Flow.FlowName -EnvironmentName $Flow.EnvironmentName
        $Connectors = $FlowInfo.Internal.properties.connectionReferences

        # Build a pipe-separated list of connector display names
        $ConnectorDetails = ""
        $Connectors.PSObject.Properties | ForEach {
             $ConnectorDetails += $_.Value.DisplayName + " | "
        }

        # Strip commas from the display name so it doesn't break the CSV
        ($Flow.DisplayName.Replace(","," ")) + "," + $Flow.Enabled + "," + $ConnectorDetails | Out-File $FilePath -Append
    }
    

    The output looks like this (I’ve fancied it up a little in Excel):

    This script can also be found on GitHub.

  • Collecting hardware hashes for Windows Autopilot using PowerShell (unattended) ✈️

    I needed to manually register a number of devices with Windows Autopilot; to do this, I first needed to collect the hardware hashes of the devices.

    There is a wealth of documentation out there that describes how to do this, such as – https://learn.microsoft.com/en-us/managed-desktop/prepare/manual-registration-existing-devices#obtain-the-hardware-hash

    The challenge I had was that the PowerShell script provided requires end-user interaction, and I needed to run it unattended via a Group Policy Object (GPO). I managed to do this; below is my updated version of the script, which writes the hardware hash to a local file on the device (in my final solution, this writes the file to a fileshare).

    # Bootstrap the NuGet package provider so scripts can be installed
    Find-PackageProvider -Name 'Nuget' -ForceBootstrap -IncludeDependencies

    # Install the Get-WindowsAutoPilotInfo script from the PowerShell Gallery
    Install-Script -Name Get-WindowsAutoPilotInfo -Force

    # Collect the hardware hash and write it to a local CSV file
    powershell -ExecutionPolicy Unrestricted Get-WindowsAutoPilotInfo -OutputFile c:\hardwarehash.csv
    
  • Enabling RDP on a Windows Server using the Serial Console

    Another example of a complete edge case scenario, with little use to anyone – besides myself, when I need to refer back to this in the future after making the same mistake 😆.

    I was recently playing around with Azure Migrate and performed a test migration of a VM from On-Premises (a local Hyper-V server in my lab) to Azure ☁️.

    I’d provisioned a new VM within my Hyper-V server On-Premises, configured this as a web server and then migrated it to Azure (which was a lot simpler than I thought!). The one thing I forgot to do was enable RDP on the On-Premises VM prior to migration – I’d been using Hyper-V Manager to remotely access and configure the VM, so completely forgot to do this 🤦‍♂️.

    The result of this was that when the VM had been migrated to Azure, it didn’t have RDP enabled, so I was unable to access it. This is where the serial console came to the rescue, enabling me to configure RDP and get access to the migrated VM in Azure.

    Here are the steps that I followed:

    Step 1 – Enable the Serial Console

    The first thing I needed to do was enable the serial console; the steps required are documented here.

    You need to run EnableEMS against the VM to do this:

    Step 2 – Connect to the VM using the Serial Console

    • Select Serial Console within the Help section of the VM within Azure
    • Type cmd and press enter
    • Type ch -si 1 and press enter

    • Press any key

    • Input the credentials for the VM

    • If successfully authenticated, you should now have a command shell

    Step 3 – Enable RDP and Create a Firewall Rule to allow access

    Run the following commands within the command shell to enable RDP on the server and then configure Windows Firewall to allow inbound access.

    Enable RDP

    reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Terminal Server" /v fDenyTSConnections /t REG_DWORD /d 0 /f
    

    Configure Windows Firewall

    netsh advfirewall firewall set rule group="remote desktop" new enable=Yes
    
  • Unable to get nested virtualization working with Hyper-V on an Azure VM

    More test lab building….and more issues! This time I needed to build a Windows Server 2022 VM running Hyper-V, hosted in Azure; the plan was to use this VM to host other VMs (known as nested virtualization). I provisioned a VM using one of the Azure VM SKUs that supports nested virtualization, however when I attempted to install Hyper-V on the VM it failed with the following error: “Hyper-V cannot be installed: The processor does not have the required virtualization capabilities”.

    After much troubleshooting, I eventually figured out the problem: when provisioning the VM, I should have configured the Security type as Standard rather than Trusted launch virtual machines.

    Re-creating the VM using this setting enabled me to install Hyper-V and enjoy some nested virtualization goodness 💪.

  • Running a Windows 11 VM on Hyper-V 🪟

    I was recently building a lab environment for Microsoft Intune; as part of this, I needed to provision a Windows 11 machine to do some testing of Windows Autopilot. I decided to host this on Hyper-V (running on my Windows 11 desktop PC) rather than using a physical device to keep things simple (at least that was the idea!).

    I ran into an issue during installation and received the following error message – “This PC doesn’t meet the minimum system requirements to install this version of Windows”.

    To fix this, I needed to enable TPM support within the settings for the VM; it’s also worth noting that the VM should be created as a Generation 2 VM.

    Once I’d enabled this setting I was able to successfully install Windows 11 🎉.

  • Using Azure Bastion within an Azure Virtual WAN configuration 🌍

    I was recently working on an Azure deployment which used a Virtual WAN as the hub with a number of spoke Virtual Networks (VNets). Azure Bastion was to be deployed into one of the spoke VNets and the plan was that this single instance of Azure Bastion would provide the ability to RDP/SSH into VMs hosted in the other spoke VNets within the environment (which have been connected to the Virtual WAN hub). This saved deploying Azure Bastion into each VNet – which could have been quite costly 💷.

    It turns out that when Azure Bastion is deployed into an environment that uses a Virtual WAN rather than VNet peering to connect VNets together, it cannot connect to VMs hosted in VNets outside of the VNet where Azure Bastion has been deployed unless:

    • Azure Bastion Standard is provisioned ✅
    • IP-based connection has been enabled ✅

    This is documented here.

    Re-provisioning the Azure Bastion to use Standard rather than Basic and enabling IP-based connection fixed this:

    Once this had been done, I was able to connect to VMs in other VNets; however, I needed to use the IP address to connect – the process of connecting via IP address is documented here.

  • Reading the value of a SharePoint choice column within a Power Automate Flow 🤖

    I needed to read the value of a SharePoint choice column within a Power Automate Flow and then do something dependent on the value; in this case, I needed to send an e-mail if the Approval status column is changed to Approved for any of the items within the list.

    Due to the way that choice fields are returned to Power Automate, I couldn’t use a simple condition that checks the value of the Approval status column and then sends an e-mail if this is equal to Approved.

    I first needed to initialise a variable that reads the value of the Approval status column and then use this variable (ApprovalStatus) within the condition, as below:

    This Flow then sprang into life and started sending e-mails when an item was updated and the Approval status column was set to Approved.