In a landmark case, a jury found this week that Meta and YouTube negligently designed their platforms and harmed the plaintiff, a 20-year-old woman known as Kaley G.M. The jury agreed with the plaintiff that social media is addictive and dangerous and was intentionally designed to be that way. This finding aligns with my view as a clinical psychologist: social media addiction is not a failure of users but a feature of the platforms themselves. I believe that accountability should extend beyond individuals to the systems and incentives that shape their behavior.
In my clinical practice, I regularly see patients struggling with compulsive social media use. Many describe a pattern of "doomscrolling," often using social media to numb themselves after a long day. Afterward, they feel guilty and stressed about the time lost, yet they have had limited success changing this pattern on their own.
It's easy to understand why scrolling can be so addictive. Social media interfaces are built around a powerful behavioral mechanism known as intermittent reinforcement, which is the strongest and most effective type of reinforcement learning, says Judson Brewer, an addiction researcher at Brown University. This is the same mechanism that slot machines rely on: users never know when the next reward (a shower of quarters, or a slew of likes and comments) will appear. Not all of the videos in our feeds captivate us, but if we scroll long enough, we're bound to arrive at one that does. The continued search for rewards ensnares us and reinforces itself.
Why Social Media Feels Addictive
People often struggle on their own to curb compulsive social media use. This should be no surprise, as habits are not usually broken through sheer willpower but rather by changing the reinforcement loops that sustain them. Brewer argues that "there's really no neuroscientific evidence for the presence of willpower." Placing the burden to self-regulate solely on users misses the deeper issue: these platforms are engineered to override individual control.
A growing body of research identifies social media use and constant digital connectivity as significant influences on the rising incidence of adolescent mental health problems. Brewer notes that adolescents are particularly vulnerable, as they are in a "developmental phase" in which reinforcement learning processes are especially strong. This vulnerability can be exploited by the design features of large social media platforms.
How Platforms Are Designed to Maximize Engagement
NPR uncovered records from a recent lawsuit filed by Kentucky's attorney general against TikTok. According to those documents, TikTok implemented interface mechanisms such as autoplay, infinite scrolling, and a highly personalized recommendation algorithm that were systematically optimized to maximize user engagement.
TikTok's algorithmically tailored "For You" content continuously tracks user behaviors, such as how long a video is watched and whether it is replayed or quickly skipped. The feed then curates short videos, or reels, for the user based on past scrolling habits and what is most likely to hold attention.
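The ranking logic described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not TikTok's actual system: the signal names (`watch_time`, `replays`, `skipped`) and the weights are illustrative assumptions only.

```python
# Hypothetical sketch of engagement-based feed ranking.
# Signals and weights are illustrative assumptions, not any platform's real system.

def engagement_score(video_stats):
    """Score a video by how strongly a user engaged with it."""
    # Fraction of the video actually watched is the core signal.
    watch_fraction = video_stats["watch_time"] / video_stats["duration"]
    score = watch_fraction
    score += 0.5 * video_stats["replays"]            # replays signal strong interest
    score -= 0.3 if video_stats["skipped"] else 0.0  # a quick skip signals disinterest
    return score

def rank_feed(candidates):
    """Order candidate videos so the most engaging appear first."""
    return sorted(candidates, key=engagement_score, reverse=True)
```

A feed optimized this way keeps serving whatever held attention in the past, which is precisely the self-reinforcing loop the lawsuit documents describe.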
These documents provide one example of a tech company knowingly designing products to maximize attention. I believe social media companies also have the capacity to reduce addictiveness through intentional design choices.
How Governments Are Regulating Social Media
The good news is that we aren't helpless. There are several levers for change: how we collectively talk about social media, how our governments regulate its design and access, and how we hold companies accountable for practices that shape user behavior.
Some countries are moving quickly to set policy around social media use. Australia has imposed a minimum age of 16 for social media accounts, with similar bans pending in Denmark, France, and Malaysia.
These bans generally rely on age verification. Users without verified accounts can still passively watch videos on platforms like YouTube, but this approach removes many of the most addictive features, including infinite scroll, personalized feeds, notifications, and systems for followers and likes. At the same time, age verification can cause other problems in the online ecosystem.
Other countries are targeting social media use in specific contexts. South Korea, for example, banned smartphone use in classrooms. And the United Kingdom is taking a different approach; its Age Appropriate Design Code instructs platforms to prioritize children's safety when designing products. The code includes strong privacy defaults, limits on data collection, and constraints on features that nudge users toward greater engagement.
How Social Media Platforms May Be Redesigned
A report called Breaking the Algorithm, from Mental Health America, argues that social media platforms should shift from maximizing engagement to supporting well-being. It calls for revamping recommendation systems to identify patterns of unhealthy use and adjust feeds accordingly, for example by limiting extreme or distressing content.
The report also argues that users shouldn't have to deliberately opt out of harmful design features. Instead, the safest settings should be the default. The report supports regulatory measures aimed at limiting features such as autoplay and infinite scroll while enforcing privacy and safety settings.
Platforms could also give users more control by adding natural speed bumps, such as stopping points or break reminders during scrolling. Research shows that interrupting infinite scroll with prompts such as "Do you want to keep going?" significantly reduces mindless scrolling and improves memory of content.
Some social media platforms are already experimenting with more ethical engagement. Mastodon, an open-source, decentralized platform, displays posts chronologically rather than ranking them for engagement, and doesn't offer algorithmically generated feeds like "For You." Bluesky gives users control by letting them customize their own algorithms and toggle between different feed types, such as chronological or topic-based filters.
In light of the recent verdict, it's time for a national conversation about accountability for social media companies. Individual responsibility will always be important, but so are the mechanisms employed by big tech to shape user behavior. If social media platforms are currently designed to capture attention, they can also be designed to give some of it back.