Hello, my name is Mrs. Holborow, and welcome to Computing.
I'm so pleased you've decided to join me for the lesson today.
In today's lesson, we'll be using predictions from a model to control a programme.
Welcome to today's lesson from the unit "Machine learning using the micro:bit."
This lesson is called "Building a data-driven application."
And by the end of today's lesson, you'll be able to build a simple application that uses predictions from a machine learning model to control programme flow.
Shall we make a start?
We'll be exploring these keywords in today's lesson.
Recognition point, recognition point: the level of certainty that causes an input to match a class.
Programme flow, programme flow: the order in which a computer programme executes instructions.
Today's lesson is broken down into two parts.
We'll start by using predictions from a model to control a programme.
We'll then move on to adapt recognition points to refine a programme.
Let's make a start by using predictions from a model to control a programme.
In this lesson, we're going to be using CreateAI.
Go to oak.link/createai.
You need to pick up where you left off, so you can either open your last session or continue the session you saved last lesson.
Note, if you don't have an existing session, you can use the data sample file provided as an additional resource for this lesson.
In this lesson, we're going to be using machine learning code blocks.
Click the blue Edit in MakeCode button.
A show icon event is included in each on ML start block.
So you can see we have the examples of a rocket, a duck, and the Pac-Man character.
Download your programme to your micro:bit.
Test that the programme shows the correct icon when the model predicts each movement has started: your good movement, your bad movement, and your null movement.
If we want to change the icons, we need to go to Edit in MakeCode.
Press the blue button to load the MakeCode editor.
You can see, we now have the three icons for the good, the bad, and the null action.
We need to download this to our micro:bit.
If we hit Download, we'll get a message which asks us whether we want to use the same micro:bit that we were using previously or a different micro:bit.
If we select the same micro:bit, we need to connect the micro:bit using the USB cable.
We then hit Next.
We then have to select the micro:bit.
And it will download the programme to the micro:bit for us.
We can now disconnect the micro:bit from the USB cable and test to see if it works.
Here's the micro:bit.
So at the moment, we have the null action, which is showing the Pac-Man icon.
If I start to move the micro:bit around, you can see that the icon has changed to the duck, which is the bad action.
If I replicate the shot, you can see that the good icon is displayed.
The icons displayed are not really related to the class names, and that's not very helpful.
Alter the icons so they provide more meaningful feedback to the user about their movement.
What icons could you use instead?
At the moment, the icons aren't particularly helpful, so let's change the icons to something that's gonna help the user to understand whether it was a good or bad movement.
So I'm gonna change the top one to be a smiley face, so this is for the good class.
For the bad class, I'm going to change it to an unhappy face.
And then for the null class, I'm going to change it to an X.
I can then re-download the programme to my micro:bit and test it again.
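If it helps to see the same logic as text, here's a minimal sketch in MakeCode JavaScript. The ml.onStart handlers and ml.event names are assumptions based on the block labels CreateAI generates from the good, bad and null class names, so they may look slightly different in your editor; basic.showIcon is a standard MakeCode call.

```typescript
// Sketch only: the "ml" namespace and event names are assumed from the
// CreateAI block labels; your editor generates them from your class names.
ml.onStart(ml.event.Good, function () {
    // Good movement predicted: show a smiley face
    basic.showIcon(IconNames.Happy)
})
ml.onStart(ml.event.Bad, function () {
    // Bad movement predicted: show a sad face
    basic.showIcon(IconNames.Sad)
})
ml.onStart(ml.event.Null, function () {
    // No recognised movement: show an X
    basic.showIcon(IconNames.No)
})
```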
Izzy says, "I added a smiley face when the good class was predicted by the model."
That's a great idea, Izzy.
Well done.
Time to check your understanding.
Why was the smiley face icon a more suitable output for the good class than the default icon?
Is it A, it represents something positive, B, it uses fewer LEDs, or C, icons need to be changed in every programme?
Pause the video whilst you have a think.
Did you select A?
Well done, I knew you'd get this right.
The smiley face was used because it represents something positive.
The micro:bit LEDs are one of its outputs.
What other output could the micro:bit use to inform the user of a good or bad movement?
Maybe pause the video here whilst you have a think.
Ah, Jacob's got a great idea, "The micro:bit V2 has a speaker, so it can also play sound."
So you can see, when I shake this micro:bit, it makes a sound.
(micro:bit whirring) Andeep says, "You could have a happy sound for good."
True or false?
The LEDs are the most suitable output for this type of application.
Pause the video whilst you have a think.
Did you select false?
Well done.
It might be difficult to see the response on the micro:bit LEDs whilst moving during an activity.
Think about if you're using your micro:bit to check the stride of somebody running.
They're not going to want to keep checking the micro:bit each time they take a step.
The LEDs should be blank if a sound is used instead.
Is this true or false?
Pause the video whilst you have a think.
Did you select false?
Well done.
Some users may be hearing-impaired and could be disadvantaged if the application only plays sound.
Displaying an icon on the LEDs in addition to the sound would make the application more accessible for a wider range of users.
Okay, we're moving on to our first task of today's lesson, and you're doing a fantastic job so far, so well done.
I'd like you to add relevant sound outputs to your application to provide feedback to users on good and bad movements.
You can use blocks from the Music group.
So that's the group that's represented with the red background.
Experiment until you find sounds that work for your users.
Pause the video whilst you complete the task.
To help the user, I can also add sound along with the LED output.
If I click on the Music tab, I can get a series of different sounds.
So, I can drag in play tone.
Let's test that and see what that sounds like.
(medium-pitched tone beeps) Okay, so we can decide, does that sound like a good sound or not?
(low-pitched tone beeps) That sounds potentially like a bad sound.
(high-pitched tone beeps) That might be a good sound, so I'm going to select that.
I can then do something for the bad class but this time change the sound.
(low-pitched tone beeps) So you can now see, I have three different tones for the three different classes.
So, if I change to good, (high-pitched tone beeps) we have the good sound, (low-pitched tone beeps) we have the bad sound, (medium-pitched tone beeps) and we have the null sound.
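As a rough sketch of the same idea in MakeCode JavaScript, you could pair each class with a different tone. Again, the ml.onStart handlers and event names are assumptions matching the "on ML ... start" blocks; music.playTone, Note and BeatFraction are standard MakeCode.

```typescript
// Sketch only: ml.onStart is an assumed name for the "on ML ... start" block.
ml.onStart(ml.event.Good, function () {
    basic.showIcon(IconNames.Happy)
    // Higher pitch for a good movement
    music.playTone(Note.A, music.beat(BeatFraction.Whole))
})
ml.onStart(ml.event.Bad, function () {
    basic.showIcon(IconNames.Sad)
    // Lower pitch for a bad movement
    music.playTone(Note.C, music.beat(BeatFraction.Whole))
})
```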
Did you manage to find some sounds for your good and bad movements?
Did you have to experiment a bit to find sounds that worked with your users?
Some sounds are more appropriate than others.
You may need to think about the meaning users could attach to a sound or its volume in a particular environment.
Okay, we're now moving on to the second part of today's lesson, where we're going to adapt recognition points to refine a programme.
You have created LED and sound outputs for when the model predicts a movement matches a class.
By default, a prediction must reach a certainty level of 80% to cause an input to match a class.
CreateAI refers to this as a recognition point.
80% is the default, but you can change the recognition point.
So if you wanted your model to be more certain, then you can increase it to something like 90 or 95%.
Once a recognition point is reached, the is ML [ ] detected block will return True.
So here's the block.
This one is using the good class, so the block reads "is ML good detected".
If the recognition point is reached, this block will be True.
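In MakeCode JavaScript, that check might look something like the sketch below; ml.isDetected is an assumed name matching the "is ML good detected" block.

```typescript
// Sketch only: ml.isDetected is an assumed name for the "is ML good detected"
// block, which returns true once the recognition point for that class is reached.
basic.forever(function () {
    if (ml.isDetected(ml.event.Good)) {
        basic.showIcon(IconNames.Happy)
    }
})
```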
You can also use the certainty, in percent, of the model's predictions as a value in your application code.
So this block can be dragged in to different sections of code.
Let's have a look at an example.
You could use these blocks to control programme flow.
You could also use it to control the volume or pitch of a sound or the brightness of the LEDs.
So in this example, we have if is ML good detected, so has a good movement been detected?
If it has, play the happy sound in the background.
But then use the certainty block inside the set volume block.
So, if we're more certain, the volume is going to be louder.
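Here's a rough sketch of that example in MakeCode JavaScript. The ml.isDetected and ml.certainty names are assumptions matching the blocks described above; the certainty percentage (0 to 100) is scaled to the micro:bit's volume range (0 to 255), and the happy sound is played using standard MakeCode music calls, which may be arranged differently in your editor.

```typescript
// Sketch only: ml.isDetected and ml.certainty are assumed names for the
// "is ML good detected" and "ML good certainty (%)" blocks.
basic.forever(function () {
    if (ml.isDetected(ml.event.Good)) {
        // Scale 0-100% certainty to the 0-255 volume range:
        // the more certain the model, the louder the sound
        music.setVolume(Math.map(ml.certainty(ml.event.Good), 0, 100, 0, 255))
        // Play the happy sound without blocking the rest of the program
        music.play(music.builtinPlayableSoundEffect(soundExpression.happy),
            music.PlaybackMode.InBackground)
    }
})
```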
You're now going to use machine learning code blocks.
Experiment by changing your programme to make use of the certainty value of the prediction.
Time to check your understanding.
What controls the playback of the music in this code?
Is it A, the detect block is False, B, the volume value is set using a mathematical formula, or C, the certainty percent that the input matches the good class is higher than the recognition point?
Pause the video whilst you have a think.
Did you select C?
Well done.
The sound is played back when the certainty percentage is higher than the recognition point.
True or false?
You must set the recognition point for the code to work.
Pause the video whilst you have a think.
Did you select false?
Well done.
The default recognition point is 80%, so if you don't set it, it will just default to this value.
Okay, we're moving on to our final set of tasks for today's lesson.
I'd like you to experiment using the machine learning blocks in your code to build your application.
I'd like you to really consider making your device as accessible to a wider range of users as possible.
So think about how you can use both the LEDs and sounds, and also think about the recognition points.
Pause the video whilst you complete the task.
Okay, we've come to the end of today's lesson, and you've done a great job, so well done.
Let's summarise what we have learned in this lesson.
Programme events can be triggered when recognition points are met.
The programme flow can be changed by using blocks that detect if a recognition point has been met, or by using the certainty values.
I hope you've enjoyed today's lesson, and I hope you'll join me again soon, bye.