Hello, my name is Miss Willow, and I'm going to be your teacher for today's lesson.
Today's lesson is called What Happens After Reporting a Concern? And it fits into the unit Our Online Lives: How Do I Report and Find Support for Things I See Online?
During this lesson, we're going to be talking a little bit about peer pressure and bullying, so we recommend that you have an adult with you for the duration of this lesson.
If at any point you do feel worried or uncomfortable, it's really important that you close the screen and that you go and speak to a trusted adult.
Okay, let's make a start.
By the end of today's lesson, you'll be able to explain what happens after reporting online content and how your actions contribute to a safer online community.
Before we get started with today's lesson, we need to go over some ground rules.
These help to make sure that everyone feels safe and comfortable throughout today's lesson.
Laura says that we need to listen to others.
"It's okay to disagree with each other, but we should listen properly before we make any assumptions or before we decide how to respond.
When we disagree with someone else, it's important to challenge the statement and not the person themselves." Andeep says we need to respect each other's privacy.
"We can discuss examples, but we shouldn't use any names or descriptions that could identify anyone, including ourselves." If we want to share a story or an experience, we can refer to someone as "my friend".
This means we're not going to give any information away that could identify someone.
Izzy says that we can choose our level of participation.
"Everyone has the right to choose not to answer a question or to join in with discussion, and we should never put anyone on the spot," as this can make people feel uncomfortable.
And finally, Jacob reminds us that there should be no judgement.
"We can explore our beliefs and any misunderstandings about a topic without fear of being judged." We're now going to have a look at the keywords for today's lesson.
These are going to come up quite a few times in today's lesson, so it's really important that we know what each of these words means.
First of all, we have community guidelines.
In this context, community guidelines are rules set by online platforms to keep users safe. We also have enforcement, which in this context means actions taken to remove or restrict harmful content or users.
As we go through today's lesson, keep an eye out for these keywords and when you spot them, see if you can remember what they mean.
Today's lesson is split into two learning cycles.
Our first learning cycle is called what do platforms do when I report something? And our second learning cycle is called why might reports not always lead to action? Let's make a start on our first learning cycle.
What do platforms do when I report something? Community guidelines are the rules that websites and apps set to protect users and to make sure that online content is respectful and safe.
Different platforms have different community guidelines, but they often cover topics like hate speech and discrimination, threats, nudity, bullying, and other harmful behaviour.
Understanding these community guidelines helps you to decide what should be reported.
If you see something that breaks these community guidelines, it's important to report it to the platform that you've seen it on.
When reporting, you typically need to identify which specific community guideline you think has been violated.
You can report various types of content, including posts, images, videos, or comments.
You can also report a user's account if they're impersonating another person or consistently posting harmful content.
Let's do a check for understanding to see how you're doing so far.
Can you remember what community guidelines are? A, a list of usernames, B, rules to keep people safe online, or C, a school's behaviour policy.
What are community guidelines? Talk to the people around you or have a think to yourself.
Well done if you said that B is correct.
Community guidelines are rules to keep people safe online.
Well done if you said the same thing.
If the content breaks the community guidelines, the platform will take enforcement action.
This might include removing the content, giving a warning to the user who posted it or restricting the account through a temporary suspension.
This means that they're unable to use the app or website for a certain amount of time.
The account could also be permanently banned, which means the user is unable to use that platform again.
If you don't see something change immediately after reporting it, that doesn't mean that your report didn't help.
Reports build patterns and may lead to action later on.
For example, if multiple different people report the same content or the same account, the platform is able to build up evidence and may take enforcement action against the post or user later on.
Therefore, it's always better to report something than not.
Let's do another check for understanding.
Which of these are enforcement actions that a platform might take after a report? A, removing the content, B, giving the user a badge, or C, banning or suspending the account? What do you think? Pause the video, talk to the people around you or have a think to yourself.
Well done if you said that A and C are correct.
Some enforcement actions that a platform might take after a report could be removing the content from the platform or banning or suspending the account.
This means that they're unable to use the platform anymore.
Well done if you said the same thing.
Websites and apps try to keep people safe, but they don't always get it right or issue enforcement actions fast enough.
That's why it's also important to take other steps to protect yourself, such as taking screenshots of harmful content as a way to document evidence.
If you are not sure how to take a screenshot on your device, you can ask a trusted adult for help.
You can also block or mute the person responsible for posting the content to reduce your chances of seeing more similar content.
When you block someone online, they're unable to contact you or see your content, and if you mute someone, you don't see their content in your feed, but they won't be able to tell that you've muted them.
You can also tell a trusted adult, like a parent or teacher, about what you have seen, as they can help support you.
Remember that a trusted adult is any adult that we know offline that we trust to keep us safe.
Police Officer Kofi says, "Remember, online platforms don't usually notify users if someone has blocked them and most of the time reports stay anonymous, so the person whose post you reported won't know who's made the report." Let's do another check for understanding, but this time I'd like you to decide if this statement is true or false.
You only need to report harmful content to the online platform, then it will be resolved.
Pause the video and have a think to yourself or talk to the people around you.
Do you think this is true or false? Well done if you said that this is false, but why? You might have said that whilst reporting harmful content is worthwhile, it doesn't always lead to immediate action.
Therefore, it's good to also block or mute the person who posted the content and to tell a trusted adult what's happened.
Well done if you said this or something similar.
What happens once a report is made? Number one, once you click Report and select the reason for your complaint, for example, which community guideline has been violated, your report is automatically sent to the platform's moderation team.
You'll usually receive a confirmation message that your report has been received.
Next, the platform's automated systems may do a quick initial review of the content that you've reported.
Some obvious violations like certain types of spam or clearly inappropriate images might be removed immediately at this stage by the automated system.
Number three, if the content isn't automatically removed by the automated system, human moderators for the platform will examine your report more carefully.
They will look at the reported content in detail, they'll check it against the platform's community guidelines, and remember that every platform might have slightly different community guidelines.
They will consider the context of the post or comment, so what's happening around it and why it was posted, and they'll also review any previous posts by that user to see if this kind of behaviour happens regularly.
Next, the moderation team will decide on and carry out one of several enforcement actions if the content is deemed to break community guidelines.
Number five, the platform will send you a notification about their decision, usually within a few hours or days.
This might tell you whether enforcement action was taken and what type of action was taken.
Those specific details may be limited and you may not be told everything that's happened.
Six, if you disagree with the platform's decision, most online platforms allow you to submit an appeal, explaining why you think that they made the wrong decision.
You can also provide additional context or evidence if you feel that it's relevant and appropriate.
You can also request a second review of the reported content.
Let's do another check for understanding.
Can you correct the order of the reporting process? Number one, initial review, number two, your report is submitted, number three, decision making, number four, you receive feedback, number five, human moderation review, and six, potential appeals process.
Can you correct the order of this reporting process? Pause the video, talk to the people around you or have a think to yourself.
Okay, let's see what your reporting process should look like.
You now should have the order, two, your report is submitted, one, initial review, five, human moderation review, three, decision making, four, you receive feedback, and six, potential appeals process.
Well done if your reporting process looks like this.
You've done a brilliant job so far, well done.
It's now time to move on to our practise task.
For this task, I'd like you to create a flow chart to show what online platforms do when you report something online.
Pause the video, and in a few minutes, we'll go through what you might have said.
Okay, your flowchart should look a little bit like this.
First, your report is submitted. Next, there's an initial review, and a decision is made about whether the community guidelines have been violated and whether enforcement action needs to be taken.
After the initial review, there can also be a human moderation review.
Here, a decision can be made also about enforcement action.
Next, you'll receive feedback and have the opportunity to appeal if you wish to.
Well done if your flow chart looks a little bit like this, it is time to move on to our second learning cycle and you've done a brilliant job so far, well done.
Our next learning cycle is called why might reports not always lead to action? Police Officer Kofi tells us, "Not all reports lead to immediate results.
This might be because the platform doesn't think that the content breaks their rules or their community guidelines, or they might need more time to review it or because the report needs more evidence.
That doesn't mean that your report was ignored." Some online platforms may use automated systems to sort through large numbers of reports.
These systems aren't perfect.
They might miss harmful content or allow something that breaks the rules to stay online.
Human moderators can also help, but they can make mistakes too or they can take time to respond.
Let's do a check for understanding to see how you're doing with this learning cycle.
Why might a report not lead to action straight away? A, because nothing ever happens when you report something, so there's no point, B, because the platform is ignoring you as they get so many reports a day, C, because more evidence may be needed to determine if it violated community guidelines, or D, because the platform's automated system has categorised it incorrectly? Why might a report not lead to action straight away? Talk to the people around you or have a think to yourself.
Well done if you said that C and D are correct.
Sometimes a report may not lead to action straight away because more evidence may be needed to determine if it violated community guidelines or because the platform's automated system has categorised it incorrectly.
Well done if you got this correct.
Sophia is watching short videos on a popular platform.
She sees a video where a group of people make sexist jokes, saying that girls aren't as smart as boys and they shouldn't be in leadership roles.
Sophia knows that this is wrong and she reports the video for breaking the platform's community guidelines.
A few days later, the video is still up.
She feels frustrated and she wonders if her report made any difference, as it doesn't seem like any enforcement actions have been taken.
But Sophia's report is still really important.
Sometimes platforms take time to review reports, especially if the issue is harder to judge or it's not always obvious.
It's possible that the platform is still investigating or waiting to see if others report it too.
Even though no action has been taken yet, Sophia still did the right thing.
Reporting content sends a message, helps create a record, and can lead to action later on, especially if other people report the same thing or the user continues to post inappropriate or harmful content.
In the meantime, there are still things that Sophia can do.
She can block the account so that she doesn't see any more videos from that user, she could choose a different platform where she feels more comfortable and safe, and she could talk to a trusted adult about what she saw and how it made her feel.
Trusted adults can support us if we need support with something that we've seen online.
All of these steps help Sophia to stay safe and supported, even when the platform hasn't taken any action yet.
Let's do another check for understanding.
I'd like you to list three things that Sophia can do alongside reporting the content to the platform.
Pause the video, talk to the people around you or have a think to yourself.
You might have said, number one, she could block the account, number two, she could use a different platform, or number three, she could tell a trusted adult for support.
Well done if you said these three things.
It's time to move on to your final practise task.
Well done for your brilliant hard work so far.
I'd like you to read the scenario and answer the following questions.
Jack saw a post that included mean comments about a celebrity and thought that it was unkind and hurtful.
He reported it to the platform, but the post itself stayed up.
Number one, what should Jack do next to protect himself and respond safely? And number two, why is it still the right decision for Jack to report the post? Pause the video and we'll go through some potential answers in a few minutes.
Okay, let's see what you might have said.
For question number one, what should Jack do next to protect himself and respond safely? You might have said something like, Jack could block the account that posted the mean comments so he doesn't see any more upsetting content.
He could also take a screenshot in case he needs to report it again or show it to an adult or someone else that he trusts.
Most importantly, Jack should talk to a trusted adult, like a parent or teacher, so he gets support and advice on what to do next.
For the second question, why is it still the right decision for Jack to report the post? You might have said that even though the post wasn't removed, reporting it helped create a record for the online platform.
Furthermore, if other people also report the same account or content, the platform might take action in the future.
Jack's report shows that the content is harmful and helps the platform to learn what users are concerned about.
Reporting also shows Jack made a positive choice to keep the online space safer.
Well done if you said something similar to this.
We're now going to summarise the key learning from today's lesson.
In today's lesson, we have learned that reports are reviewed through an investigation, usually by moderators or automated systems. Platforms use community guidelines to decide what content is harmful.
Enforcement actions include removing content or banning users, but platforms don't always get it right.
Reporting harmful content still matters as it helps keep online spaces safer for everyone.
And finally, we've learned if your report doesn't lead to action, you can still protect yourself by blocking the account, taking screenshots, and talking to a trusted adult like a parent, carer or a teacher at school.
Well done for your hard work in today's lesson.
In today's lesson, you might have found you've got some worries or some questions, and if you do, it's really important that you seek support from a trusted adult.
There's also some resources on the screen that are there to help you too.
Well done for your hard work today.
I hope to see you in another lesson soon.