• Creating a Generative AI Agent in less than 10 minutes.

    There’s been a lot of buzz about Generative AI Agents recently, so I thought that I’d take Oracle Gen AI Agents for a spin 🧠.

    In this short video (<10 minutes ⏱️), I walk through the full end-to-end process of creating a Gen AI agent within OCI that uses the power of an LLM and business data to provide contextually relevant answers to business questions, saving users time and reducing costs 👩‍💻.

  • Creating sample data in Oracle Autonomous Database using AI 🧠

    I’ve recently posted a short video on YouTube that walks through the process of creating and loading sample data into Oracle Autonomous Database using AI – this feature is fully baked into the product too!

    This will be a huge timesaver for me as I create a lot of demos 👨‍💻.

  • Upload a file to OCI Object Storage using a Pre-Authenticated Request (PAR) 📁

    OCI Object Storage has the notion of a Pre-Authenticated Request (PAR), which gives users access to a bucket or an object (file) without having to provide their sign-on credentials 🪪 – all that is needed is a single URL (which can have an expiration date/time set on it).

    I’ve used PARs to provide read access to specific objects (files) within a storage bucket, which has been a useful way to quickly (and relatively securely) share content.

    I recently needed to provide a user with the ability to upload content to a storage bucket using a PAR. To do this, I configured a PAR on a bucket as follows 🪣:
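    For anyone who prefers the command line, a write-enabled PAR can also be created with the OCI CLI along the lines of the command below – the bucket name, PAR name and expiry date are just placeholders:

    oci os preauth-request create \
      --bucket-name my-upload-bucket \
      --name upload-par \
      --access-type AnyObjectWrite \
      --time-expires "2030-01-01T00:00:00Z"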

    However, after creating the PAR on the bucket and getting the URL, I was at a loss as to how to upload files to the bucket. If I browsed to the URL in a browser, it simply listed the files within the bucket with no visual means to upload (I was expecting a nice upload button!).

    I couldn’t see any way to upload files using the OCI CLI either. After much head-scratching and experimentation, it turned out that the easiest way to upload a file is to use curl.

    Here is the command that I used:

    curl -v -X PUT --data-binary '@/Users/brendan/Downloads/MyFile.zip' PAR URL/MyFile.zip
    

    You need to include the path to the file to upload (after the @ sign), followed by the PAR URL provided by OCI and finally the name to give the uploaded file within the storage bucket.
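    To make the structure a little clearer, here is what the full command looks like with a made-up PAR URL – the token, namespace and bucket name below are placeholders, and the real URL is the one generated for you by OCI:

    curl -v -X PUT --data-binary '@/Users/brendan/Downloads/MyFile.zip' \
      "https://objectstorage.uk-london-1.oraclecloud.com/p/EXAMPLE_PAR_TOKEN/n/mynamespace/b/mybucket/o/MyFile.zip"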

    Running this command successfully uploaded the file to the bucket that the PAR had been created for – result!

  • Create a Machine Learning Model in Less Than 10 Minutes using Oracle AutoML ⏱️

    As this was quite a large topic for a blog post, I decided to record a video instead. In this video I go through the process of…

    1. Loading a sample dataset that contains information on employee attrition into an Autonomous Database 📊
    2. Creating a machine learning model using this data with Oracle AutoML 🧠
    3. Calling the machine learning model using a Python script 🐍

  • Connecting to OCI Object Storage using S3 Browser 🪣

    OCI Object Storage has an Amazon S3 compatible API, which got me thinking that I could likely connect to it using a GUI client, such as S3 Browser. After lots of trial and error I finally managed to configure S3 Browser to connect to OCI Object Storage.

    Below are the steps that I took to get this working:

    Step 1 – Obtain the Object Storage Namespace for the tenancy 🪣

    The Object Storage Namespace is required to figure out the REST endpoint URL used to connect to OCI Object Storage and can be obtained via the Cloud Console > Governance & Administration > Tenancy Details.

    My namespace is below and begins with lrdkvq
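    Alternatively, if you already have the OCI CLI configured, the namespace can be retrieved with a single command:

    oci os ns get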

    Step 2 – Create a secret key 🔑

    I then needed to create a secret key, which is used to authenticate to OCI Object Storage. A secret key can be created using the Cloud Console via Profile (icon in the top right) > My Profile > Customer secret keys > Generate secret key.

    Give the secret key a memorable name and remember to copy the key before closing the window as it will not be shown again.

    In the list of secret keys, hover over the Access Key section for the secret key that you have just created and copy this too.
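    If you prefer the command line, a customer secret key can also be generated with the OCI CLI, roughly as follows (the user OCID and display name are placeholders) – the response includes both the access key ID and the secret key:

    oci iam customer-secret-key create \
      --user-id <your user OCID> \
      --display-name "s3-browser"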

    Step 3 – Configuring S3 Browser ⚙️

    Launch S3 Browser and add a new account, selecting S3 Compatible Storage from the Account type dropdown.

    This then unlocks some additional options:

    For the REST Endpoint, take the namespace that you obtained in Step 1 and use it as the first part of the URL, followed by .compat.objectstorage.<REGION CODE>.oraclecloud.com. For example, my URL looks like this:

    lrdkvqz1i7g9.compat.objectstorage.uk-london-1.oraclecloud.com

    To obtain the region code for your OCI tenancy, use this reference.

    Then enter the Access Key ID and Secret Access Key obtained in Step 2. The Secret Access Key is the key that is only displayed once, and the Access Key ID is the Access Key obtained from the list of customer secret keys.

    Step 4 – Connect 🔌

    I then saved this configuration and connected 😀.

    This is a nice (and a little more user-friendly) way to interact with Object Storage without having to use the OCI Console / APIs.

  • Unable to Create a Mount Target in OCI ❌

    A customer contacted me a few days ago as they were unable to create a Mount Target within the File Storage service in OCI. They had two Mount Targets provisioned within their OCI tenancy and were attempting to create a third; when doing this they were receiving the error:

    “File System was created successfully but Mount Target creation failed because of error: “The following service limits were exceeded: mount-target-count. Request a service limit increase from the service limits page in the console. “. To enable access to the File System, associate it with an existing Mount Target by adding an Export to it.”

    Their PAYG OCI tenant had a limit of 2 x Mount Targets. However, when I looked at the documentation, the limit is 2 per tenant per Availability Domain, so in theory they could have up to 6 Mount Targets (as there are 3 x Availability Domains within the region they are using).

    It turned out that they were not given the opportunity to specify the Availability Domain when creating the Mount Targets, and the two previous Mount Targets they had created resided within Availability Domain 1. The reason for this is that they had created the Mount Targets while creating a File System, and when creating them this way, there is no option to specify the Availability Domain to create the Mount Target within (see screenshot below).

    To work around this they created the 3rd Mount Target manually via Storage > File Storage > Mount Targets (within the OCI Console), specifying Availability Domain 2 (UK-LONDON-1-AD-2).
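    (For reference, the same thing could also be done with the OCI CLI – the sketch below is indicative only, and the Availability Domain name, OCIDs and display name are placeholders.)

    oci fs mount-target create \
      --availability-domain "xxxx:UK-LONDON-1-AD-2" \
      --compartment-id <compartment OCID> \
      --subnet-id <subnet OCID> \
      --display-name MountTarget3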

    This was created successfully… they then created the File System, but this time selected an existing Mount Target (MountTarget3), rather than having the OCI Console automagically create a new Mount Target for them.

    This allowed them to successfully create a third Mount Target and File System 🙌.

  • Supercharge the OCI CLI using Interactive Mode 🏎️

    The OCI CLI is a fantastic tool that makes administering OCI a breeze! But with so many different commands and parameters it can sometimes be a little unwieldy 😩.

    I recently discovered that the OCI CLI has an interactive mode, which greatly simplifies using it – for me this has meant less time with my head stuck in the documentation and more time actually getting things done!

    Using interactive mode is a breeze – you simply launch it using oci -i:
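    oci -i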

    Once you’ve done this, start typing the name of the command you wish to use, and it will provide auto-complete suggestions. In the example below I typed ai, and it then suggested the relevant AI services in OCI that I can interact with.

    If I then select Vision for example, it provides a full list of all the actions available.

    If I wanted to use OCI AI Vision to analyse an image stored within object storage, I select the appropriate command (analyze-image-object-storage-image-details).

    It then provides details of all the parameters (those that are required are denoted by a *). I can then build up my command and run it… how cool!

    Hopefully this helps you to save as much time as it did me 😎.

  • Detect anomalies in data using Oracle Accelerated Data Science (ADS) 🧑‍🔬

    Some time ago I wrote about issues with the reliability of my Internet connection – The Joys of Unreliable Internet. One thing that came out of this was a script that I run every 30 minutes via a cron job on my Raspberry Pi, which checks the download speed of my Internet connection and writes this to a CSV file 🏃.
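    My own script isn’t reproduced here, but for anyone wanting to do something similar, a crontab entry along these lines (using the speedtest-cli Python package – the paths are placeholders) captures a reading every 30 minutes:

    # append a timestamped speed test result to a CSV file every 30 minutes
    */30 * * * * /usr/bin/speedtest-cli --csv >> /home/pi/Speedtest.csv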

    My Internet has been super-stable since I wrote this script – typical eh! However, I have a lot of data collected, so I thought that I’d attempt to detect anomalies in this data, for example does my Internet slow down on specific days/times 🐌.

    Here is what the CSV file that I capture the data in looks like – I have a column for datetime and one for the download speed.

    I noticed that the Oracle Accelerated Data Science (or ADS for short) Python module can detect anomalies in time series data, so it would be perfect for analysing my Internet speed data.

    The module can be installed using the following command:

    python3 -m pip install "oracle_ads[anomaly]"
    

    Once installed you can run the following to initialize a job.

    ads operator init -t anomaly
    

    This creates a folder within the current directory named anomaly with all of the files required to perform anomaly detection. I copied the CSV file with my Internet speed data into this folder (Speedtest.csv).

    I then opened the anomaly.yaml file within this directory – this contains the configuration settings for the job.

    I updated the template anomaly.yaml file as follows:

    I did the following:

    • Specified the name of the datetime column (Date)
    • Specified the target column, which includes the data points to be analysed (Speed)
    • Set the location of the file containing the data to analyse (Speedtest.csv)
    • I also specified the format of the datetime column (using standard Python notation) – full documentation on this can be found here.
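    Putting that together, the relevant part of my anomaly.yaml ended up looking roughly like the sketch below – the field names follow the template generated by ads operator init but may differ slightly between ADS versions, and the datetime format shown is just an example:

    kind: operator
    type: anomaly
    version: v1
    spec:
      datetime_column:
        name: Date
        format: "%d/%m/%Y %H:%M"  # example format – match it to your data
      input_data:
        url: Speedtest.csv
      target_column: Speed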

    I saved anomaly.yaml and then ran the following command to run the anomaly detection job:

    ads operator run -f anomaly.yaml
    

    Top tip – if you are running this on macOS and receive an SSL error, you’ll likely need to run Install Certificates.command, which can be found within the Python folder within Applications.
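    For example, depending on which Python version you have installed (the folder name below is just illustrative), it can be run from a terminal like this:

    open "/Applications/Python 3.12/Install Certificates.command"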

    The job took a few seconds to run (I only had 200KB of data to analyse), and it created a results folder within the anomaly folder. Within this are two files – a report in HTML format and a CSV file containing details of all of the anomalies detected.

    The report looks like this (the red dots are the anomalies detected).

    Full details of all anomalies can be found in the outliers.csv file, which also contains a score (the higher the number, the worse the anomaly).

    This identified several days (along with the timeslots) on which my Internet speed varied significantly from the average 📉.

    I’ll probably run this again in a few months to see if I can spot any patterns such as specific days or timeslots that download speed varies from the norm.

    Hope you all have as much fun as I did anomaly detecting! 🔎

  • Assessing the security posture of an OCI tenant 🔒

    I previously wrote about how ShowOCI can be used to automagically document the configuration of an OCI tenant.

    My next top tip is to run the OCI Security Health Check against your tenant. This tool compares the configuration of a tenant against the CIS OCI Foundations Benchmark and reports any issues that require remediation 🔍.

    In today’s risky world where security breaches are a regular occurrence, it’s critical that you assess your security posture on a regular basis and perform any required remediation to ensure that you are a step ahead of the attackers – this is where the OCI Security Health Check makes things a lot simpler for you (for your OCI workloads at least 😉).

    I ran this against my test tenancy using the Cloud Shell (it can also be run from a compute instance), with the following commands:

    Step 1 – Download and Unzip the Assessment Scripts ⬇️

    wget https://github.com/oracle-devrel/technology-engineering/raw/main/security/security-design/shared-assets/oci-security-health-check-standard/files/resources/oci-security-health-check-standard-260105.zip
    unzip oci-security-health-check-standard-260105.zip
    

    Should this link not work (e.g. if the assessment has been updated), the folder within the repo that should contain the Zip file can be found here.

    Step 2 – Run the Assessment 🏃

    cd oci-security-health-check-standard-260105
    chmod +x standard.sh
    ./standard.sh
    

    Step 3 – Inspect the Findings 🔎

    Within the directory that the script is run from, a folder is created that stores the output of the assessment:

    In my case this was brendankgriffin_20240712102613_standard. This directory contained the following files:

    To view these, I transferred them to my local machine using the download option within the Cloud Console.

    The tool provides instructions on how to download a Zipped copy of the assessment output – this is presented when the assessment tool finishes.

    I could then open the Zip file to review the findings. The first file I opened was standard_cis_html_summary_report.html, which contains a summary of the findings of the assessment.

    It didn’t take too much scrolling to start to see some red! ⛔️

    Clicking into the identifier of a finding (e.g. 6.2) provides additional background and context, which is useful for understanding the finding in greater detail and helping with remediation planning.

    Each finding includes a link to the respective CSV file, where you can get additional details on the affected resources/configurations – below you can see a list of the resources that I naughtily created in the root compartment 🤦‍♂️.

    My recommendation would be to run the Security Assessment regularly (e.g. monthly), to proactively identify and resolve any security issues.

    That’s all for now 👋.

  • Documenting an OCI tenant using ShowOCI 📜

    I was speaking to a customer recently who wanted to document the resources within their OCI tenancy. OCI Tenancy Explorer provides details of the services deployed within an OCI tenant, however there isn’t any way to export the information that this presents – ShowOCI to the rescue!

    ShowOCI is a reporting tool which uses the Python SDK to extract a list of resources from a tenant. Output can be printer-friendly text, CSV files or JSON files – this is an ideal way to document the resources deployed and the configuration within an OCI tenancy.

    This could potentially also be used as a low-tech way to do drift detection, e.g. take a baseline and compare over time to detect any drift (a rough sketch of this is shown at the end of this post).

    ShowOCI can be executed directly from within the Cloud Shell making it simple and quick to run 🏃.

    To execute ShowOCI from within a Cloud Shell (using an account with administrative permissions to the tenant), run the following commands (taken from here):

    Step 1 – Clone the OCI Python SDK Repo and Create a Symbolic Link

    git clone https://github.com/oracle/oci-python-sdk
    ln -s oci-python-sdk/examples/showoci .
    

    Step 2 – Change Dir to ShowOCI

    cd showoci
    

    Step 3 – Run ShowOCI: Outputting all resources to CSV files prefixed with “MyTenant”

    python3 showoci.py -a -csv MyTenant
    

    There are numerous other options for running ShowOCI – for example, you can get it to only include specific resource types such as compute. Some of these options are demonstrated here, and all options are presented when running showoci.py without any parameters.

    After the script had run, I had a number of CSV files within the showoci directory that contain details of my tenant.

    MyTenant_all_resources.csv contains high-level details of all resources within the tenant analysed (not all columns are shown):

    There is also a separate CSV file for each type of resource that provides further details, below is an excerpt from MyTenant_compute.csv which shows all of my compute instances (not all columns are shown).
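    Coming back to the drift-detection idea mentioned earlier, a very rough sketch of how that might look is below – take a baseline export, re-run ShowOCI later with a different prefix, and compare the resulting CSV files (the prefixes are just examples):

    python3 showoci.py -a -csv Baseline
    # ...re-run some weeks later...
    python3 showoci.py -a -csv Current
    diff Baseline_all_resources.csv Current_all_resources.csv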

    Happy tenant reporting!