Closing out
This is the final entry in this blog series. In the last post I covered some applications and the Internet of Things features of the Steam Punk Hat, as well as all the features we built into it.
In this post I want to hand over the work, as well as share a final demo of the product we created.
To summarise, in 10 weeks we went through five iterations:
- Mapping movement
- Working with heart rate sensors
- Breath mapping
- Cloud dashboard design and feeding everything in
- Speaker and sounds from a DFPlayer Mini module
Every iteration had creative elements beyond the coding: a lot of brainstorming, designing and even 3D printing. The sensor readings were mapped to hardware outputs such as buzzer sounds, lights and servos. The main challenge was to make the mapping stable and reliable.
A new addition in this iteration was the DFPlayer Mini board, which lets you play sounds from an SD card and comes with a micro:bit extension.
What do we have in the end?
A smart wearable device with applications across many sectors, such as fashion, healthcare and hobby electronics. It is a wearable hat equipped with a speaker, a moving bellows, a cloud interface and a steampunk mask tethered to the hat. The mask has a microphone that maps the user's breath to a steam engine sound and the movement of the bellows.
The product also includes an ear clip with a pulse sensor and heart monitor that tracks the user's actual heartbeat. This was intended to be embedded into a piece of clothing in the future.
How to set up the project
In this post I will only talk about the main brain on the hat, not the pulse sensor implementation, as I did not get that working successfully; in the final version it was implemented by our lecturer, Jason.
To set up the project the way I have, you will need:
- Micro:bit MQTT cloud board (see images further down)
- DFPlayer Mini board for music
- Micro:bit sensor board
- A micro:bit
- Speaker
- SD card
- Servo
- Wires
- Steampunk decorative elements (not necessary for the technical part; purely visual)
The good thing about this iteration is that it all runs on a single micro:bit that talks to the cloud.
With this iteration there are only three things you need to connect to the micro:bit cloud board, which looks like this:
This board is what enables us to connect to the internet, because it has an ESP32 on board. It has a lot more cool features, which you can see in the image above; I am interested in the GPIO, servo and Wi-Fi functionality. It also has a battery powering the entire system.
First connection:
This is the simple one: take the servo and connect its three wires to the S1 pins on the board. There are two servo headers, S1 and S2; I am using S1. Make sure that the black (ground) wires are connected together.
Second connection:
Here I am connecting the sensor board to the cloud board. I fitted this board in the mask and wired it through like a tether to the cloud board on the hat. For initial testing, you only need three connections to replicate this project.
The reason for using this board is that it has a better microphone than the micro:bit, and it can distinguish between breathing in and breathing out much more reliably.
The 3V pin is connected to any 3V pin on the micro:bit cloud board, the ground pin to any ground pin on the board, and the microphone pin to P2 on the cloud board.
Third connection:
This connection is the most challenging. There is a bit of setup involved with the DFPlayer Mini; first I'll show you how to connect it.
The spec for this module says it needs 5V, but I found that this depends on what speaker you use: with a small amplified speaker, like the Bluetooth one I used, you can get away with the 3V line on the cloud board.
The RX and TX pins connect to pins P13 and P14 on the cloud board for serial communication.
The way this tiny device works is that you give it an SD card with some music files on it, and it plays them, indexed by the order in which they were copied over.
This YouTuber provides a sample program that will essentially let you create an MP3 player from a micro:bit.
Programming
This section will walk you through the final version of the code that can be found here:
On Start:
I will be sharing the code rather than the blocks, as some of the On Start code is not picked up by the block editor.
let moving = 0
let stepCounter = 0
let movingP = 0
let stepTime = 0
let lastStepTime = 0
let currentStepTime = 0
let oneShot = 0
let _3D = 0
let ZZ = 0
let YY = 0
let XX = 0
let z = 0
let y = 0
let x = 0
let headDown = 0
let breathOuttime = 0
let breatheOutEnd = 0
let songIndex = 0
let breatheOutStart = 0
let toggle = 0
microIoT.microIoT_WIFI("xxxx", "xxxx")
microIoT.microIoT_MQTT(
    "xxxx",
    "xxxx",
    "IOTAPPS/senseHat",
    microIoT.SERVERS.Global
)
microIoT.microIoT_add_topic(microIoT.TOPIC.topic_1, "IOTAPPS/stepState")
basic.showIcon(IconNames.Heart)
basic.clearScreen()
music.setVolume(127)
dfplayermini.connect(SerialPin.P13, SerialPin.P14)
dfplayermini.setVolume(25)
These are the variables that are initialised.
Most are self-explanatory; the ones at the top are to do with movement. oneShot is the step toggle that fires when someone takes a step. The x, y and z values are used by the accelerometer code in a function that creates a 3D acceleration vector value. The next variables are to do with the breath toggle and timings. Note that the code here has changed slightly and now only sends the breathe-out value.
Setting up the Internet and MQTT:
To set up the Wi-Fi, you need to provide your hotspot or Wi-Fi name and password, then your Beebotte API keys (or those for whichever MQTT platform you are using), and then give it its default topic 0.
For debugging I added a heart icon to show that it connected to the internet successfully; the board will also turn on a green LED when connected.
Setting up DFPlayer Mini:
The last part of On Start sets up the DFPlayer Mini: the serial pin connections and the volume.
Main Loop Code
This is where it all happens: the breath sensing, the movement detection and the sending of messages to MQTT.
basic.forever(function () {
    // Write the raw sound level to serial so it can be watched on a plotter
    serial.writeLine("" + Sensor.soundLevel(AnalogPin.P2) + ",")
    if (Sensor.soundLevel(AnalogPin.P2) < 23) {
        toggle = 1
    }
    if (Sensor.soundLevel(AnalogPin.P2) >= 25 && Sensor.soundLevel(AnalogPin.P2) <= 70 && toggle == 1) {
        // Breathing in: move the bellows servo back
        toggle = 0
        microIoT.microIoT_ServoRun(microIoT.aServos.S1, 0)
        led.toggle(0, 0)
    } else if (Sensor.soundLevel(AnalogPin.P2) >= 95) {
        // Breathing out: start the timer and push the bellows servo out
        breatheOutStart = input.runningTime()
        microIoT.microIoT_ServoRun(microIoT.aServos.S1, 75)
        led.toggle(4, 0)
    }
    songIndex = 1
    // While the breath-out level holds, keep cycling the train sounds
    while (Sensor.soundLevel(AnalogPin.P2) > 95) {
        dfplayermini.playFile(songIndex, dfplayermini.isRepeat.No)
        basic.pause(50)
        if (songIndex == 7) {
            songIndex = 1
        } else {
            songIndex = songIndex + 2
        }
    }
    breatheOutEnd = input.runningTime()
    breathOuttime = Math.round(breatheOutEnd - breatheOutStart)
The first part of the code focuses on the breathing. I write the value out to serial so I can see it on my plotter, and then I toggle based on levels. These are the levels I found best for my breathing. Essentially, we want to toggle when the level drops below 23, meaning we came down from a breathe-out peak and are about to breathe in. From about 25 to 70 the level is counted as a breathe in, and 95 and above as a breathe out. On both triggers I activate the servo and set it to a different angle so that the bellows alternates. On the breathe out I also set the breatheOutStart variable to the running time; this is later used to calculate the breathe-out time.
The while loop at the end checks whether the sound level is staying high, meaning the person is still breathing out, and it keeps incrementing and wrapping the song index to keep the train sounds playing.
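The song-index wrapping inside that while loop can be sketched in plain TypeScript (the name cycleSongIndex is mine, not part of the MakeCode project): starting from 1, the index walks through the odd-numbered files 1, 3, 5, 7 and then wraps back to 1.

```typescript
// Sketch of the song-index cycling used while the breath-out flatline holds.
// The index steps through the odd file numbers 1, 3, 5, 7 and wraps to 1.
function cycleSongIndex(songIndex: number): number {
    return songIndex === 7 ? 1 : songIndex + 2;
}
```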
The train sounds can be found here: https://drive.google.com/drive/folders/10utZi9UFfIeH449arsU7tO55-T4Z6Rfz?usp=sharing
At the end of the while loop a time reading is taken and the initial breathe-out start is subtracted from it, giving us a breathe-out time only when the user is actually breathing out; when breathing in, the time is very small because the code never enters the while loop.
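The threshold logic can be boiled down to a small plain-TypeScript state machine (the classifyBreath name and the Breath type are mine; the numbers 23, 25 to 70, and 95 are the ones used on the hat):

```typescript
// Sketch of the breath thresholds as a tiny state machine.
// The toggle re-arms when the level drops below 23, so each
// breathe in only fires once per breath cycle.
type Breath = "in" | "out" | "none";

function classifyBreath(level: number, state: { toggle: number }): Breath {
    if (level < 23) {
        state.toggle = 1;           // came down from a peak, re-arm
    }
    if (level >= 25 && level <= 70 && state.toggle === 1) {
        state.toggle = 0;
        return "in";                // on the hat: servo to 0 degrees
    } else if (level >= 95) {
        return "out";               // on the hat: servo to 75, start timer
    }
    return "none";
}
```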
    if (input.rotation(Rotation.Pitch) <= 40) {
        headDown = 1
    } else {
        headDown = 0
    }
This small bit of code checks the pitch of the micro:bit, measuring whether the person is slouched or not.
    // Magnitude of the 3D acceleration vector
    x = input.acceleration(Dimension.X)
    y = input.acceleration(Dimension.Y)
    z = input.acceleration(Dimension.Z)
    XX = x * x
    YY = y * y
    ZZ = z * z
    _3D = XX + YY
    _3D = _3D + ZZ
    _3D = Math.sqrt(_3D)
    _3D = Math.round(_3D)
    // One-shot step detection: count a step when the magnitude enters
    // the band, re-arm when it drops back below 1250
    if (_3D > 1250 && _3D < 2300 && oneShot == 0) {
        currentStepTime = input.runningTime()
        if (lastStepTime != 0) {
            stepTime = currentStepTime - lastStepTime
            if (stepTime > 1000) {
                movingP = 0
            } else {
                movingP = 1
            }
        }
        lastStepTime = input.runningTime()
        oneShot = 1
        stepCounter += 1
    } else if (_3D < 1250) {
        oneShot = 0
    }
    if (stepCounter > 3) {
        moving = 1
    }
This is the movement part of the code. The top part squares the acceleration in all three directions, adds them and takes the square root to get the magnitude of the 3D acceleration vector. Next, some logic picks out a step from the movement of the micro:bit; the many tests that led to this are described in the earlier iterations of this project. The step toggle, step time and moving indicators are all updated here.
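Stripped of the MakeCode APIs, the same step logic can be sketched in plain TypeScript (the names magnitude3D and StepTracker are mine; the band 1250 to 2300 is the one used on the hat):

```typescript
// Combine the three accelerometer axes into one magnitude,
// as the XX + YY + ZZ / sqrt code on the micro:bit does.
function magnitude3D(x: number, y: number, z: number): number {
    return Math.round(Math.sqrt(x * x + y * y + z * z));
}

// One-shot step detector: a step counts when the magnitude enters the
// 1250..2300 band, and the detector re-arms once it drops below 1250.
class StepTracker {
    stepCounter = 0;
    private armed = true;

    update(mag: number): void {
        if (mag > 1250 && mag < 2300 && this.armed) {
            this.armed = false;
            this.stepCounter += 1;
        } else if (mag < 1250) {
            this.armed = true;
        }
    }
}
```

The re-arming is what stops one long spike from being counted as several steps, which is the same job oneShot does in the MakeCode version.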
    // Publish each reading to MQTT as a small JSON string
    microIoT.microIoT_SendMessage("{\"" + "posValue" + "\":" + ("" + input.rotation(Rotation.Pitch)) + ",\"ispublic\":true}", microIoT.TOPIC.topic_0)
    microIoT.microIoT_SendMessage("{\"" + "stepCounter" + "\":" + ("" + stepCounter) + ",\"ispublic\":true}", microIoT.TOPIC.topic_0)
    microIoT.microIoT_SendMessage("{\"" + "stepToggle" + "\":" + ("" + oneShot) + ",\"ispublic\":true}", microIoT.TOPIC.topic_0)
    microIoT.microIoT_SendMessage("{\"" + "posture" + "\":" + ("" + headDown) + ",\"ispublic\":true}", microIoT.TOPIC.topic_0)
    microIoT.microIoT_SendMessage("{\"" + "stepTime" + "\":" + ("" + Math.round(stepTime)) + ",\"ispublic\":true}", microIoT.TOPIC.topic_0)
    microIoT.microIoT_SendMessage("{\"" + "moving" + "\":" + ("" + movingP) + ",\"ispublic\":true}", microIoT.TOPIC.topic_0)
    microIoT.microIoT_SendMessage("{\"" + "b_out_time" + "\":" + ("" + Math.round(breathOuttime)) + ",\"ispublic\":true}", microIoT.TOPIC.topic_0)
})
The final bit of the code completes everything by sending all the messages out to MQTT. Pay attention to the format of the string: each message is sent as a JSON string so that it can be parsed easily by MQTT dashboards.
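Each message built above follows the same shape. A small plain-TypeScript helper (the mqttPayload name is mine) shows the format and confirms it is valid JSON:

```typescript
// Sketch of the payload shape the hat publishes:
// {"<key>":<value>,"ispublic":true}
function mqttPayload(key: string, value: number): string {
    return "{\"" + key + "\":" + value + ",\"ispublic\":true}";
}
```

Because the result parses as JSON, a dashboard like Beebotte can pick out each field by its key without any custom parsing.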
This wraps up the programming part of the blog.
Once you understand the code and have the devices set up, the whole project should run without any issues. Remember to use the code link provided, as that version works with Beebotte.
Final Demo
Future improvements:
There is definitely great scope for this project moving forward. This was the very first Steam Punk Hat created in the applied robotics lab, and it was mainly there to experiment with the possibilities and the direction of the project. In the future it would be nice to integrate the pulse sensor onto the hat, and to figure out a way to add Bluetooth or phone controls to the micro:bit, as the cloud board extension currently disables Bluetooth comms.
It would also be nice to add more of the steampunk gear aesthetic: we 3D printed gears as part of one iteration but did not include them in the final design, and I think they would be a nice addition in the future.
Conclusion:
In conclusion, I see this project as a big success. Looking back, we started from zero and were fortunate to work with George, who gave us initial ideas and guided us, explaining the meaning of steampunk, what perambulation is in the steampunk world, and the idea that you can walk and breathe and, based on your breathing, the hat will make steam train sounds. It was a fun project to work on; it developed my design skills, my teamwork skills, and my electronics and programming skills, because when it comes to microcontrollers there is a different way of designing a system compared to a mobile application.
Thanks,
Dominik Wawak, 20089042 2023.