I had just finished Netflix’s hit series Adolescence, and I couldn’t shake the pit in my stomach.
The final scene replayed in my mind—not just because of its emotional weight, but because of how real it felt. Jamie Miller, a 13-year-old boy, commits an unthinkable act after descending into a world of online radicalization, echoing headlines that are becoming disturbingly familiar. What haunted me most wasn’t the violence—it was how algorithm-driven platforms helped lead him there. Adolescence doesn’t just tell a story; it’s a chilling mirror held up to our digital society, exposing how algorithms and social media can quietly distort identities, values, and even morality itself.
The Algorithm as a Silent Architect
How many times have you mentioned something at dinner—a new skincare brand, an obscure vacation spot, or that oddly specific pair of retro sneakers—only for it to show up on your Instagram feed the next morning? You didn’t search it. You didn’t even type it. And yet, there it is, staring back like your phone eavesdropped on your soul.
It didn’t read your mind. But the algorithm came disturbingly close.
Platforms like TikTok, Instagram, and YouTube collect thousands of data points: how long you linger on a video, what you save, who you follow, what your friends like. (Despite the popular eavesdropping theory, there’s no solid evidence that apps secretly listen through your mic; the behavioral data alone is enough to feel like mind reading.) This behavioral profile is then fed into machine learning systems such as collaborative filtering and deep neural networks. These algorithms aren’t just recommending what you like—they’re showing you what will keep you scrolling, regardless of whether it’s good for you.
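To make the mechanism concrete, here is a toy sketch of collaborative filtering, the technique named above. Every name and number is invented for illustration; real platforms use vastly richer signals and deep neural models. Notice what the system optimizes: predicted watch time, not truth, safety, or well-being.

```python
import math

# Rows: users, columns: videos. Values: seconds of watch time (engagement).
# Hypothetical data for illustration only.
watch_time = {
    "alice": {"gym_tips": 40, "cooking": 55, "debate_clip": 2},
    "bob":   {"gym_tips": 50, "debate_clip": 60, "rage_bait": 70},
    "carol": {"gym_tips": 45, "cooking": 5},  # the user we recommend for
}

def cosine(u, v):
    """Cosine similarity, with the dot product taken over shared videos."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[k] * v[k] for k in shared)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

def recommend(target, data):
    """Score videos the target hasn't seen by similar users' watch time.

    The objective is predicted *engagement*: a video that holds similar
    users' attention longest ranks first, whatever its content.
    """
    scores = {}
    for other, items in data.items():
        if other == target:
            continue
        sim = cosine(data[target], items)
        for video, secs in items.items():
            if video not in data[target]:
                scores[video] = scores.get(video, 0.0) + sim * secs
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("carol", watch_time))  # ['rage_bait', 'rage_bait'-like content first]
```

In this tiny example, Carol only ever watched gym and cooking videos, yet the top recommendation is the rage-bait clip, because a user who overlaps with her on fitness content watched it the longest. That is the "silent architect" in miniature.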
This is where the danger begins.
In Adolescence, Jamie’s radicalization doesn’t happen overnight. It’s slow, almost invisible—driven by algorithms that prioritize engagement over ethics. One video leads to another, each more extreme than the last. The line between curiosity and coercion blurs until what he’s consuming isn’t information anymore—it’s indoctrination.

From Gym Reels to Misogyny: How Algorithms Funnel Teens into the Manosphere
In today’s digital landscape, a casual scroll through motivational gym videos or self-improvement content can swiftly lead users into the darker realms of the internet. As depicted in Netflix’s Adolescence, there’s a subtle yet perilous algorithmic trajectory: one moment, viewers engage with content on discipline or fitness; the next, they’re exposed to figures like Andrew Tate, who “red-pill” audiences into championing toxic ideologies under the guise of “self-improvement.”
This isn’t just speculation. A 2024 study by University College London found that TikTok’s algorithm can expose teens to misogynistic content within five days of engaging with general male self-help content. What starts as fitness advice quickly mutates into toxic masculinity.
And this trajectory isn’t just anecdotal—it’s part of a wider phenomenon: digital extremism. According to a report by Ipsos, algorithms amplify divisive content by design, creating what they call “radicalization pathways.” These filter bubbles reinforce biases, feed outrage, and polarize users into isolated belief systems—especially dangerous for teens still forming their sense of self.
This feedback loop doesn’t just mirror what we want—it shapes what we want, often without our awareness.
More Than a Mirror: A Masterclass in Algorithmic Education
This is what makes Adolescence so critical. It’s not just a show—it’s a syllabus. It lays bare the invisible influence of algorithmic recommendation systems and how they manipulate attention for profit. Through Jamie’s story, it teaches:
- Surveillance capitalism (a term coined by Shoshana Zuboff): the monetization of personal data at the cost of personal well-being.
- Reinforcement learning: the machine learning approach behind feeds that continually refine what keeps you hooked.
- Filter bubbles and algorithmic bias: systems that feed you content reinforcing your worldview—sometimes pushing you toward extremism without you even realizing it.
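The reinforcement-learning idea above can be sketched with a minimal epsilon-greedy bandit: the system tries content categories, measures watch time, and shifts the feed toward whatever holds attention best. The categories and reward numbers below are pure assumptions for illustration; real recommender systems are far more sophisticated, but the objective is the same shape.

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

categories = ["fitness", "self_help", "outrage"]
# Assumed average watch time (seconds) per category; for illustration,
# outrage content is given the strongest grip on attention.
true_reward = {"fitness": 30, "self_help": 35, "outrage": 80}

counts = {c: 0 for c in categories}   # how often each category was served
value = {c: 0.0 for c in categories}  # running estimate of its reward

def serve(epsilon=0.1):
    """Mostly exploit the current best category, occasionally explore."""
    if random.random() < epsilon:
        return random.choice(categories)
    return max(categories, key=value.get)

for step in range(1000):
    c = serve()
    reward = random.gauss(true_reward[c], 5)      # noisy watch time
    counts[c] += 1
    value[c] += (reward - value[c]) / counts[c]   # incremental mean update

# After enough feedback, the feed is dominated by whichever category
# maximizes attention; no term anywhere rewards the viewer's well-being.
print(max(counts, key=counts.get))
```

The bandit has no notion of what the categories mean. It simply discovers that outrage earns the most seconds and serves more of it, which is the "radicalization pathway" dynamic in its simplest form.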
The result? A generation navigating adolescence while being shaped—emotionally, cognitively, and politically—by forces they can’t see.
Beyond the Screen: Real-World Consequences
Adolescence is already making an impact. Following its release, conversations around digital safety surged. In the UK, it aligned with the rollout of the Online Safety Act, while global think tanks like the World Economic Forum continue pushing for ethical digital governance.
Because let’s be clear: the dystopia isn’t theoretical. It’s already here. Internal research from Meta showed Instagram harms teenage girls’ mental health. TikTok has been caught serving pro-eating disorder content to vulnerable users. YouTube’s algorithm has been studied for its role in political radicalization.
The platforms know. And still, the algorithm rolls on—fueled by our attention, no matter the cost.
Can We Escape the Loop?
Adolescence doesn’t just expose the problem—it hints at solutions. Characters question the system. Some choose to unplug. Others resist the dopamine-fueled scroll. And that’s the key.
Real resistance will take more than regulation. It requires:
- Algorithm transparency – Platforms must explain how their systems work.
- Media literacy education – So teens can identify manipulative content.
- Ethical tech design – Prioritizing user well-being over engagement metrics.
But the most powerful resistance might be personal: choosing intention over impulse. Choosing to understand rather than to passively consume.
The Final Scroll
In a sea of trending dramas, Adolescence stands out not just for its emotional depth, but for its uncomfortable truth: your feed isn’t just feeding you content—it’s shaping your worldview. The algorithm is not your friend. It’s not neutral. It’s not just curating your experience. It’s curating you.
So the next time you find yourself mindlessly scrolling, ask: Is this what I want to see? Or is this what the algorithm wants me to become?
Because that’s the real cliffhanger. And it’s not just on Netflix.
