By Yuki Klotz-Burwell

From Likes to Dislikes: Are social media platforms doing enough to protect our mental health?


The virtual environments we spend so much of our lives in might seem to offer experiences limited to our online world, but platforms like Instagram, TikTok, and LinkedIn can have tangible effects on our real-world lives—especially regarding mental health.


Earlier this month, the C.D.C. released data showing nearly three in five teenage girls felt persistent sadness in 2021, the highest level reported over the past decade. This sharp rise in reported mental health challenges is directly related to teen social media and technology use, according to psychiatrist Dr. Victor Fornari in The New York Times.


With overwhelming evidence highlighting the problems that come with social media, platforms have started acknowledging that their products might have adverse mental health effects. In an interview with The Wall Street Journal, Instagram head Adam Mosseri noted that some of the site's features could be harmful, especially to younger users. But as social media apps gain more users, are platforms really doing enough to address these issues?


Recent product launches


Earlier this year, Instagram launched Quiet Mode, a new feature aimed at helping users manage their time spent on the app. In Quiet Mode, all notifications are paused, and anyone who sends a direct message is notified that the user has "Quiet Mode" activated.


While these tools can help manage the negative impact of social media use, they require users to be proactive and intentional. Sophie Janicke-Bowles, Ph.D., a positive media psychologist and assistant professor at Chapman University, says that these self-regulation tools place the onus on the user to regulate their own well-being and effectively allow companies to bear less responsibility for the impact their products have on users' mental health.


Instead, Janicke-Bowles believes companies should design their platforms around human values rather than around capturing attention to drive advertising.


"Companies should actually design technology not with a drip-feed of continuous dopamine hits, but instead as tools that the consumer can shape in the way that suits them best," she says. "The technology should be human-centered, rather than money and attention-sucking-centered."


On a micro-level, platforms could adjust the attention-driving tools they already have to reduce screen time.


"Companies can include natural stopping cues so that after 10 minutes of browsing, no new content appears," says Janicke-Bowles. "Instead of pushing for immediate notifications, send them out once or twice a day to limit unnecessarily interrupting people's days."


Platforms fight back


According to Janicke-Bowles, social media is deliberately designed to impact our psychology and inherently encourages comparison and competition.


"Social comparison is something we do naturally, within and outside of social media, but the problem is that now consumers spend considerable time on social media," she says. "Social media is designed to keep us hooked on the platforms, and now consumers are spending a considerable amount on social media and artificially increasing the time spent on social comparison."


It's a problem that's especially prevalent among younger users, particularly teenage girls, and platforms like Instagram and TikTok recognize this. Alongside the Quiet Mode announcement, Instagram also rolled out parental supervision tools that let parents monitor their teens' Instagram settings.


While these features may seem like helpful tools to combat mental health concerns, social media companies aren't being entirely altruistic. In fact, these platforms are constantly fighting research-backed accusations and lawsuits about their products' impact on youth mental health.


In early 2023, Seattle's public school system filed a lawsuit against the companies operating TikTok, Instagram, Facebook, Snapchat, and YouTube for the negative impact on student mental health.


The lawsuit alleges that the social media platforms' products have "been a substantial factor in causing a youth mental health crisis, which has been marked by higher and higher proportions of youth struggling with anxiety, depression, thoughts of self-harm, and suicidal ideation."


Other social media companies didn't publicly respond to the lawsuit, but Meta's Head of Global Safety said the company has "more than 30 tools to support teens and families." Both Quiet Mode and the extra parental supervision features were announced less than two weeks after Seattle Public Schools filed the lawsuit.


Looking ahead


Existing settings like screen time management and parental supervision are helpful, according to Janicke-Bowles, but explicit stopping cues and similar features are necessary for users to regain self-control.


"We know that tools that help with self-regulation, such as time monitoring features, are effective in reducing anxiety by giving consumers back a level of self-control that is beneficial for their mental health," she says.


While platforms are slowly introducing more features for users to restrict their own social media usage, it's not enough to simply place the burden of responsibility solely on the individual user. Companies need to do more to take accountability for their platforms' impact on mental health and invest in developing and implementing more powerful tools.
