Creating a Mood Detector using Lobe and a Raspberry Pi

I’ve recently been experimenting with Lobe, a remarkable app that democratizes AI by letting you build a Machine Learning (ML) model in less than ten minutes. The beauty is that this requires no ML or coding experience at all. You can find out more about it at Lobe Tour | Machine Learning Made Easy.

I’ve always been really interested in self-improvement and understanding more about myself. One aspect that particularly intrigues me is my mood throughout the workday: it can swing from elation to despair, and I’ve never quite figured out what the key drivers are (although I do have some ideas).

My love of overcomplicating the simple led me to develop an application that records my mood throughout the day using my Raspberry Pi 4 and its camera. The plan was for Lobe to analyse my mood from pictures captured by the Pi camera.

The Pi and its camera were already sat on my desk staring at me, so perfectly placed.

I wanted to be able to take a picture of myself using the Pi, have Lobe recognise my mood, and log the mood along with the date and time. Later I could analyse this data for specific patterns, correlating it with my work calendar for additional insight. I wanted to know – is it just me having a case of the Mondays, or are there specific times of the day, activities or projects that drive my mood?

To get started I headed over to www.lobe.ai, downloaded the Windows app (it’s also available for Mac) and used it to take some pictures of myself in two moods (positive = thumb up / negative = thumb down). I took the pictures using the webcam attached to my Windows 10 device, then tagged the images and let Lobe work its magic on training an ML model.

I then selected Use and was able to evaluate the model in real time (with a surprising level of accuracy!). Once I was happy with everything, I exported the model as TensorFlow Lite – the best option for a Raspberry Pi.

I then copied the TensorFlow Lite model (which is essentially a folder containing a handful of files) to my Raspberry Pi. The next step was to install lobe-python on the Pi by running the following:

wget https://raw.githubusercontent.com/lobe/lobe-python/master/scripts/lobe-rpi-install.sh
sudo bash lobe-rpi-install.sh

With everything up and running, I used the sample Python script available here to test Lobe with the model I had just created, using some sample images I had. That worked, so I moved on to creating a Python-based web application using the Flask framework.
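
If you’d rather not hunt down the sample script, a minimal test looks something like the sketch below. This assumes the exported model folder is named "Mood TFLite" and there is a test image called test.jpg in the same directory:

from lobe import ImageModel

# Load the TensorFlow Lite model exported from Lobe
model = ImageModel.load("Mood TFLite")

# Run a prediction against a sample image and print the results
result = model.predict_from_file("test.jpg")
print(result.prediction) # Top label, e.g. "Positive" or "Negative"
for label, confidence in result.labels:
    print(label, round(confidence * 100), "%")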

Here is the finished app in all its glory! All I have to do is launch the site and click Capture Mood. The Pi camera then takes a pic, runs it through the ML model created using Lobe, and confirms the mood detected (along with a button to capture the mood again). In the background it also writes the detected mood, date and time to a CSV file for later analysis.

Below is an example of the CSV output – that ten minutes sure was a real rollercoaster of emotions 😂.
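
When it comes to the later analysis, a few lines of pandas are enough to start spotting patterns by time of day. Here is a rough sketch, assuming the three unheaded columns are date, time and mood, and that the times are written as HH:MM:SS (the %X format used in the script is locale-dependent):

import pandas as pd

# Mood.csv has no header row; the app writes date, time, prediction
df = pd.read_csv("Mood.csv", names=["date", "time", "mood"])

# Bucket the captures by hour of day and count positive vs negative moods
df["hour"] = pd.to_datetime(df["time"], format="%H:%M:%S").dt.hour
print(df.groupby("hour")["mood"].value_counts())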

This is obviously quite rudimentary; I need to extend the model to detect additional moods. However, it was a useful exercise in getting to grips with Lobe and Flask.

The full solution can be found here (minus the ML model) – to save you a click, below is the Python code (MoodDetector.py):

from time import sleep
from picamera import PiCamera
from lobe import ImageModel
from flask import Flask, redirect, url_for, request, render_template
import csv
import datetime

app = Flask(__name__)

@app.route('/')
def button():
    return render_template('button.html') # Display the Capture Mood button; when clicked, redirect to /capturemood

@app.route('/capturemood') # Take a pic, analyse it and write the output to HTML and CSV
def capturemood():
    camera = PiCamera()
    camera.start_preview()
    sleep(2)
    camera.capture('mood.jpg') # Take picture using Raspberry Pi camera
    camera.close()
    model = ImageModel.load("Mood TFLite") # Load the ML model created using Lobe
    result = model.predict_from_file("mood.jpg") # Predict the mood of the mood.jpg pic just taken 
    now = datetime.datetime.now()
    date = now.strftime("%x")
    time = now.strftime("%X")
    moodCSV = open("Mood.csv", "a")
    moodCSVWriter = csv.writer(moodCSV) 
    moodCSVWriter.writerow([date,time,str(result.prediction)]) # Write the date, time and mood prediction to the Mood.csv file
    moodCSV.close()
    # Vary the HTML output depending on whether the prediction is positive or negative.
    if str(result.prediction) == "Negative":
        return """<div class="buttons"><p>"Mood is Negative"</p>
        <a href='/capturemood'><input type='button' value='Capture Mood'></a></div>"""
    elif str(result.prediction) == "Positive":
        return """<div class="buttons"><p>"Mood is Positive"</p>
        <a href='/capturemood'><input type='button' value='Capture Mood'></a></div>"""
    else:
        # Fallback so Flask always gets a response, even if the model returns an unexpected label
        return """<div class="buttons"><p>"Mood not recognised"</p>
        <a href='/capturemood'><input type='button' value='Capture Mood'></a></div>"""
if __name__ == "__main__":
    app.run(port=80,host='0.0.0.0')

…and here is the supporting render template that I created (button.html).

<html>
<body>
<div style='text-align:center'>
    <a href='/capturemood'><input type='button' value='Capture Mood' align='Center'></a>
</div>
</body>
</html>
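
One thing worth noting: because app.run binds to port 80, the script needs to be started with elevated privileges on the Pi (for example via sudo), or you can switch the port to something above 1024 such as 5000. With host set to '0.0.0.0', the page is then reachable from any device on the local network at the Pi’s IP address.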
