Still in the Room · Cleo A. Solberg
Social media and kids' mental health. TikTok algorithms. Screen time. The evidence — and what parents can actually do.
Read Still in the Room →
Teen mental health declined sharply after 2012 — the year smartphone ownership crossed 50% among American teenagers. Rates of depression, anxiety, and self-harm rose across every Western country, faster among girls than boys. The scientific debate is about the size of the effect; the direction is not contested. The mechanism: social comparison at scale, delivered algorithmically, at the exact developmental moment when adolescent identity is most fragile. A thirteen-year-old doesn't just compare herself to her class anymore — she compares herself to a curated highlight reel of hundreds of peers, algorithmically selected to maximize emotional activation.
TikTok is not a social network like Facebook. It is primarily a recommendation engine that happens to include social features. Its algorithm optimizes for engagement — time on platform — not for your child's wellbeing. It learns, within a few sessions, what emotional triggers are most effective for each specific user. For adolescents whose social identity is still forming, this means content that produces inadequacy, aspiration, and the fear of missing out. Not because the algorithm is malicious — because engagement optimization on the adolescent brain produces exactly those feelings, and the algorithm discovered this.
Sleep disruption mediates a significant portion of social media's mental health effects — this finding is replicated across multiple studies. A phone in the bedroom at night is a different risk than a phone used in the living room. Moving phones out of bedrooms is the single parental intervention with the strongest and most consistent evidence base. It is also practically the most achievable. Before addressing screen time in general, app restrictions, or content filtering — start here.
The research on adolescent development is consistent: teenagers who feel surveilled by parents who don't trust them develop secretive behavior — the opposite of what monitoring was designed to prevent. Parental monitoring apps may produce visibility without producing safety. The parents who navigate this best are those who maintain relationship quality: present without being intrusive, authoritative rather than authoritarian. The goal isn't knowing everything your child is doing online. It's being someone they'll come to when something goes wrong.
Children who understand that algorithms optimize for their engagement — not their happiness — engage with platforms differently. This is one of the most actionable findings in digital literacy research. Teaching a child to ask 'why did this appear in my feed?' and 'what is this designed to make me feel?' builds a protective cognitive frame. It doesn't make them immune. But it puts them in a fundamentally different relationship with the platform than children who don't know the system exists.
Still in the Room
The complete guide — stage by stage, from first device to teenage independence.
Get on Amazon →