Algorithmic radicalisation
Also known as:
- social media algorithms
- social media feeds
- recommended content
What is the risk?
Digital risk factors associated with a child or young person's interests and activities
An algorithm is a set of rules designed to produce an outcome or solve a problem.
On popular social media platforms like Instagram and TikTok, algorithms respond to a user's data to control the content they see. For example, a content algorithm might show more sports videos to someone who has previously watched or liked sports content. This is intended to keep users engaged and on the platform.
Social media algorithms can be very accurate at predicting what a user likes. This can be based on:
- content watched
- content skipped
- content shared or reposted
- channels or accounts followed
- likes or comments
Small interactions with risky or harmful content, like violent or misogynistic videos, can mean that over time a user gradually sees more and more similar content.
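The feedback loop described above can be sketched as a toy model. This is purely illustrative, assuming a simple engagement-score approach; the `ToyFeed` class, topic names, and interaction weights are invented for this example and do not represent any platform's real system:

```python
import random

# Illustrative sketch only: each interaction nudges a per-topic score,
# and the feed then samples topics in proportion to those scores.
class ToyFeed:
    def __init__(self, topics):
        # Every topic starts with the same small baseline score.
        self.scores = {topic: 1.0 for topic in topics}

    def record_interaction(self, topic, weight):
        # Hypothetical weights: e.g. 0.5 for watching, 1.0 for a like
        # or share, -0.5 for skipping quickly. Scores never drop to zero,
        # so no topic disappears entirely.
        self.scores[topic] = max(0.1, self.scores[topic] + weight)

    def recommend(self, n, rng=random):
        # Sample n topics, weighted by accumulated engagement.
        topics = list(self.scores)
        weights = [self.scores[t] for t in topics]
        return rng.choices(topics, weights=weights, k=n)

feed = ToyFeed(["sport", "comedy", "news", "harmful"])
# A handful of small interactions with one topic...
for _ in range(5):
    feed.record_interaction("harmful", 1.0)
# ...makes that topic several times more likely to be recommended.
print(feed.scores["harmful"] / feed.scores["sport"])  # prints 6.0
```

Even in this simplified model, a few likes are enough to skew future recommendations heavily toward one topic, which is the dynamic the bullet points above describe.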
Risks and motivations
Risks
Self-harm content
Content that promotes self-harm and suicide is illegal. Platforms have a legal responsibility to remove this content. However, a child or young person may see self-harming behaviours in content that hasn’t been removed. For example, videos recommending unhealthy diets.
Repeatedly seeing self-harm content might mean that a child or young person becomes less shocked or frightened by it over time. Self-harm content can be upsetting, distressing, and potentially dangerous if copied.
Extremism and radicalisation
Algorithms might expose a child or young person to extreme content. This could involve content that covers specific religious, political, right-wing, left-wing or single-issue beliefs or views.
Over time, seeing this content could put a child or young person at risk of radicalisation.
Upsetting and dangerous content
A child or young person might be recommended upsetting or dangerous content. For example, video footage of fights or dangerous viral challenges. Exposure to this content can make a child or young person feel frightened or anxious. They might also try to mimic dangerous behaviours.
Misinformation and disinformation
Specific algorithm recommendations can mean that a child or young person is less likely to see different views and perspectives. This is sometimes called a filter bubble. This can impact how a child or young person thinks about the world.
Algorithms can also help spread misinformation (false information shared without intent to mislead) and disinformation (false information spread deliberately).
Motivations
A child or young person might engage with content recommended by an algorithm because it:
- involves one of their interests or hobbies
- is funny or entertaining
- is something they agree with
- isn’t something they have seen before
What you can do
You may be worried about the content that a child or young person is viewing or engaging with online.
Talking with them is one way to find out more about the content they are seeing and to discuss possible harms.
You could speak about:
- motivations for viewing the content
- how it makes them feel
- some risks, for example, disinformation
- speaking to a trusted adult if unsure or worried
In more specific cases it may be helpful to familiarise yourself with some of the signs of radicalisation or signs of self-harm.
If you have any immediate concerns for the safety or wellbeing of a young person contact the emergency services by dialling 999.
If you think that a young person is at risk, follow your safeguarding procedure and read our safeguarding guidance.
Support
Part of a child or young person’s recovery can involve minimising the chances of future harm. It can be helpful to explore reporting or blocking functions on platforms that a child or young person uses. Knowing how to hide or report content can help them feel more in control online.
You could also explore ways of filtering out age-inappropriate or adult content. Filters can be applied to specific devices or home WiFi routers. You can find content restrictions on platforms too. For example, YouTube’s restricted mode.
Recovery may involve advice and support from specialist organisations, for example, The Samaritans, or Prevent.
- How to support young people who encounter upsetting content online (Childnet) – Website
- Understanding self-harm and how to help (The Children's Society) – Website
- Content promoting self-harm, suicide and eating disorders (NSPCC) – Website
- How to deal with misinformation (Internet Matters) – Website
- Protecting children from radicalisation (NSPCC) – Website
Read more about algorithmic radicalisation
- How do people feel about news selected by algorithms on social media (Reuters Institute) – Website
- Children’s experiences of legal but harmful content online (NSPCC) – File
- Research into risk factors that may lead children to harm online (Revealing Reality) – Website
- Exploring the role of the Internet in radicalisation and offending of convicted extremists (HM Prison and Probation Service) – Website