Is This Tech's Big Tobacco Moment?
Instagram and YouTube Go On Trial For Harming Youth
Hi Readers,
I spent time this week at the Common Sense Summit on Kids and Families in San Francisco. The panel I moderated, “Doing Better For Boys,” was terrific, and it was great to meet several Teen Health Today readers in the crowd!
In today’s newsletter, I’m sharing reflections on one of the big questions being discussed at that conference: “Have tech companies intentionally caused harm to our youth, and should they have to pay for that harm?” I promise it will give you a lot to think about.
Big love,
Christopher
A quick request: If you know anyone in Charlottesville, Virginia, please let them know that I’ll be doing a “Talk To Your Boys” discussion and book signing at St. Anne’s-Belfield School on March 31. It's free and open to the public. RSVP here.
Big Tech On Trial
One of the most talked-about panels at the Common Sense Summit was “Is This Tech’s Big Tobacco Moment?,” which featured California Attorney General Rob Bonta and New Mexico Attorney General Raúl Torrez in conversation with Dylan Byers.
At the time of this panel, both AGs were waiting on results in cases they filed against tech companies. The verdicts in New Mexico and California have since come in, and they are generating a lot of reaction. Here’s what Common Sense Media’s Jim Steyer had to say about the California decision:
“This verdict is a powerful recognition of what Common Sense Media and families across the country have known for years: Social media companies deliberately design their platforms to keep kids hooked, consequences to their mental and physical health be damned. The momentum for change is no longer building. It’s here.
Social media giants would never have faced trial if they had prioritized kids’ safety over engagement. Instead, they buried their own research showing children were being harmed, and used kids and society as guinea pigs in massive, uncontrolled, and wildly profitable experiments. Now, executives are being held to account.
This verdict, along with other recent court rulings, should embolden lawmakers in California and across the country to use their authority to force real change in how these companies design and operate their products. We must keep pushing, advancing, and enforcing stronger laws for social media and AI youth safety.”
To add some context to this news, I’m sharing an essay below from a trusted expert on technology and youth, Devorah Heitner, PhD, the author of Growing Up in Public: Coming of Age in a Digital World.
SPRING SALE!
Get full access to every issue of Teen Health Today, plus our
full archives, for only $3.50 a month.
Instagram And YouTube On Trial
By Devorah Heitner, PhD
Two big verdicts were announced today, one in New Mexico and another in a Los Angeles court.
The New Mexico case won on the argument that the very design of Instagram is harmful to users. While the monetary judgment will barely touch a huge company like Meta, the focus of the argument could be a game-changer. New Mexico v. Meta Platforms centers on how features like endless scroll and algorithmic recommendations can contribute to harm, particularly for teens.
For years, companies like Meta Platforms have been protected by Section 230, a law from the 1990s that says platforms aren’t legally responsible for what users post. Within that legal framework, social media has been legally treated more like a neutral bulletin board than a newspaper—hosting content rather than curating or vetting it.
What makes the New Mexico case such a big deal is the argument that the platform itself—its structure, its incentives, its algorithms—was designed in a way that creates foreseeable harm. The app was found NOT to be a neutral bulletin board, but a bulletin board that prioritizes more harmful messages.
This legal reasoning will impact cases moving forward in other states, many of them asking the same question: Were these platforms designed in ways that push users—especially minors—toward harmful experiences?
The Los Angeles case took a different approach, arguing that YouTube, TikTok, Snap, and Meta’s apps caused specific mental health harm to specific individuals. Snap and TikTok settled before trial, so only Meta and Google went to trial. The jury didn’t fully agree on causation, and it is tough to isolate any single cause of mental health challenges: when the kids we care about suffer, it’s rarely due to just one thing. Yet the results will still affect these companies going forward.
Taken together, these trials are a great big warning signal to these massive, powerful companies. People are getting more and more skeptical and mistrustful of these apps.
What Might Change
If these legal approaches continue to gain traction, we may begin to see shifts in how platforms operate, especially for younger users: more friction in feeds instead of endless scrolling, changes to recommendation systems, and possibly stronger default safety in direct messaging and other settings.
Unfortunately, companies and parent activists may lean into age-verification instead of focusing on holding the companies accountable to the experience of ALL users. While stricter age-gating may seem like a solution to the harms demonstrated in these trials, it creates new problems for user privacy and the right to speech and assembly for kids.
It’s worth noting: companies may choose to make changes only for minors to limit liability. But from an ethical standpoint, the design of these platforms affects users of all ages. Shouldn’t safer, more intentional design be the standard for everyone?
A Big Tobacco Moment?
When I hear that social media is having a “big tobacco” moment, the analogy fits in some ways (quashing internal research about harms, for one). Yet the substance is different: tobacco is always harmful. Social media, on the other hand, can genuinely support connection, affinity, and learning. Many of us have seen that upside in our own lives.
For me, the most helpful part of that analogy is the cultural shift de-centering tobacco. Even with the rise of vaping, overall nicotine use is still significantly lower than it was when I was growing up in the 1980s.
It makes me wonder: if more people come to feel that social media is doing more harm than good, will they begin to step away?
Many adults—and some teens—are becoming more intentional. They’re reducing their usage, setting boundaries, or gravitating toward platforms that feel more positive and less draining. There’s a quiet recalibration happening.
Right now, it’s hard to imagine a full-scale shift away. These platforms are deeply woven into how we communicate, find affinity and connect with one another. But things could change a lot if we decide to prioritize other ways to connect and share information.
Talking About It
For parents and educators: ask young people if they have heard about these legal cases. You could discuss:
How do these platforms make us feel—and why?
What does healthy use look like?
What would we want them to be, if we could redesign them?
What responsibility do designers have for user behavior?
How do algorithms shape what we see—and how we feel?
Where is the line between engagement and manipulation?
These cases don’t settle those questions—but they do make them harder to ignore.
In the meantime, we want to model healthy use of social apps and also show kids that we can step away when they don’t serve us.
Learn more about Devorah Heitner, PhD’s work, including how to buy her books and bring her in as a speaker, at devorahheitner.com.
Recent Teen Health Today Highlights
What to do after watching “Louis Theroux: Inside The Manosphere”
How To Talk About Clavicular, Looksmaxxing, And Loneliness
Helping young people make sense of what they are seeing online
“This Is Very, Very, Very Worrying,”
An alarming new study on teen cannabis use and later psychosis. PLUS: Teens want more male vulnerability onscreen, and experts question tech in classrooms.
If you enjoyed this post, please forward it to someone else who might like it, and click the ❤️ or 🔁 button on this post so more people can discover it on Substack. 🙏🏼






