What Happens When Teens Privately Ask for Help on Instagram?
Revelations and research over the last few years have shed light on how Instagram may negatively affect its youngest users. The most popular social media platform among 13- to 21-year-olds in America, Instagram was designed to connect people with shared interests. However, recent research has pointed to social media use as a possible contributor to a rise in mental health problems and eating disorders among teenage girls. Researchers at Drexel University and Vanderbilt University are trying to figure out exactly what young users are experiencing on Instagram, in hopes of curtailing the negative trend and getting them the support they need.
In a paper recently published in the Association for Computing Machinery’s Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, the team of researchers from Drexel’s College of Computing & Informatics and Vanderbilt’s School of Engineering laid out the findings from their analysis of Instagram direct messages in which users asked for help.
Although this age group is known to use Instagram more than any other, the paper, entitled “’Help Me:’ Examining Youth’s Private Pleas for Support and the Responses Received from Peers via Instagram Direct Messages,” is the first study to examine user-contributed private messages among teens to better understand how they ask for support from their peers on Instagram.
“Due to logistical challenges and privacy concerns, there has been very little research on how this age group interacts in private online conversations — particularly to exchange support,” said Afsaneh Razi, PhD, an assistant professor in Drexel’s College of Computing & Informatics, who was a co-author of the paper. “This study is the first of its kind to provide insight about these exchanges and lay out a framework for how the platform could offer support for users who are suffering.”
The team looked specifically at how youth initiated peer support conversations in private messages, the topics for which they sought support, and the types of support they received.
Their findings suggest that young people on Instagram are more likely to share negative experiences — ranging from everyday stress to severe mental health issues — in private messages with friends and with acquaintances they met online. Most disclosures were met with positive peer support, though the team also found specific sets of circumstances that led to support being denied.
The dataset analyzed was donated by 189 volunteer participants, ages 13-21, who each shared at least three months of their Instagram data from when they were between 13 and 17 years old. Each submission included direct message exchanges with at least 15 people, with at least two messages that made them or someone else feel uncomfortable.
In total, the researchers collected 7 million messages. They sifted these down to 82 relevant conversations that included the words “help me,” comprising more than 336,000 individual messages. They found that most disclosures asking for help fell into four broad categories: mental health concerns, relationship issues, daily life issues, and abuse.
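To give a sense of what this kind of filtering step involves, the sketch below shows a minimal keyword pass over donated conversations. It is purely illustrative: the data structures, field names, and function are assumptions for explanation, not the team’s actual pipeline.

```python
# Hypothetical sketch of a keyword-based filtering pass, loosely modeled on
# the study's approach of narrowing millions of messages down to
# conversations containing the phrase "help me". All names are illustrative.

def filter_help_seeking(conversations, phrase="help me"):
    """Keep only conversations in which any message contains the phrase."""
    relevant = []
    for convo in conversations:
        # Each conversation is assumed to be a list of message dicts:
        # {"sender": ..., "text": ...}
        if any(phrase in msg["text"].lower() for msg in convo):
            relevant.append(convo)
    return relevant

# Toy usage:
sample = [
    [{"sender": "A", "text": "Can you help me with something?"}],
    [{"sender": "B", "text": "See you tomorrow!"}],
]
print(len(filter_help_seeking(sample)))  # prints 1
```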
“Our study is one of the first to unpack how youth use social media platforms to privately seek support about serious issues. As researchers and designers, that means we need to account for this behavior in the design of social media platforms as well as improve education and awareness on equipping youth how to handle these types of hard conversations in a safe way,” said Pamela Wisniewski, PhD, an associate professor in Vanderbilt University’s Department of Computer Science.
Mental health concerns came up in 61% of the conversations they analyzed. These ranged from emotional challenges and social anxieties to riskier conversations about eating disorders and thoughts of suicide. Relationship issues — among family, friends and romantic partners — were discussed in 41% of the conversations; daily life issues — work, school, health and financial problems — in 27%; and abuse, including harassment, bullying and violence, was disclosed in 11%.
Few conversations started out with someone seeking support: 94% began as casual conversations and 74% as one-on-one chats, later evolving into a disclosure and a request for help. The researchers found that about half of the conversations were between friends and about a quarter between acquaintances who had met only online; none included family members.
But in almost all cases, support was offered in some form — from emotional support and kind words to build up the individual’s self-esteem, to helpful information, tangible assistance, or a connection to support networks.
“We saw that in a large majority of conversations, when someone made a difficult disclosure, others in the conversation offered support by attempting to relate or empathize with their situation by sharing a similar experience, and providing information that could be helpful,” said Jina Huh-Yoo, PhD, an assistant professor in Drexel’s College of Computing & Informatics, who helped to lead the research. “These mutual disclosures seemed to establish trust, even among people who had just met on the platform.”
The team also made a unique observation about the few times someone did not offer support in a help-seeking conversation. These denials occurred in only a handful of instances in each conversational category, but they followed a pattern that could be important, the researchers contend.
“We noticed that support was denied in instances when a participant in the conversation did not feel they were in a mental or emotional position to help or when there was a perceived imbalance in mutuality of support,” Razi said. “While this is a small subset of conversations, it represents a set of behaviors that has not been identified in previous research, so we believe it is a significant area for future research.”
Continuing to study teens’ interactions on social media platforms, such as Instagram, could provide relevant insight into the increase in depression and self-harm that has emerged in recent years. Instagram is a particularly useful portal for this work, they suggest, because it invites informal and wide-ranging discussions that are not primarily focused on seeking help, which helps to avoid any perceived stigma associated with formal help-seeking activities such as counseling or therapy.
In addition, the public-facing side of Instagram serves as a conversation starter based on mutual interests. This allows teens to grow a support network that includes more than just their offline friends — and one that is not primarily focused on sharing problems or seeking support. In many cases, the team observed, conversations between two people who do not have an offline relationship allowed for a more authentic interaction, free from performative behavior and the entanglements of shared offline relationships.
What this all means, the team suggests, is that Instagram interactions present a valuable opportunity to integrate automated technology for providing support. They note that protecting the privacy of users and the integrity of conversations on the platform is paramount to preserving Instagram as a channel where teens feel safe sharing their problems and asking for help, but tools could be created to automatically and subtly provide resources and guidance within a conversation.
“Based on this finding, automatic support from conversational agents powered by Artificial Intelligence could help conversation members to know how balanced they are in exchanging support and when they would need extra support based on the frequency of exchanges and direction of the changes,” they write. “The automatic agents could help conversation members with guidance to additional resources when the other person refuses to support or provides unsupportive comments.”
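As a rough illustration of the balance-tracking idea the authors describe, the sketch below tallies how much support each participant gives and flags a one-sided exchange. The scoring rule, the 80% threshold, and the upstream is_supportive classifier are all assumptions for illustration, not a design from the paper.

```python
# Hypothetical sketch of a support-balance heuristic for a conversational
# agent: count supportive messages per sender and flag a one-sided exchange.
# The is_supportive labeling step is assumed to come from some upstream
# classifier; it is not something the paper specifies.

from collections import Counter

def support_balance(messages, is_supportive):
    """Tally supportive messages per sender and flag imbalance.

    messages: list of (sender, text) tuples.
    is_supportive: callable(text) -> bool, an assumed upstream classifier.
    """
    given = Counter(sender for sender, text in messages if is_supportive(text))
    total = sum(given.values())
    if total == 0:
        return None  # no supportive exchanges detected yet
    # Flag if one participant provides more than 80% of the support
    # (threshold chosen arbitrarily for illustration).
    top_sender, top_count = given.most_common(1)[0]
    if top_count / total > 0.8:
        return f"{top_sender} is giving most of the support; consider suggesting resources."
    return "Support appears balanced."

# Toy usage with a naive keyword stand-in for the classifier:
naive = lambda text: any(w in text.lower() for w in ("here for you", "that sounds hard"))
msgs = [("A", "I'm here for you."), ("A", "That sounds hard."), ("B", "thanks")]
print(support_balance(msgs, naive))
```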
While these findings are drawn from a small subset of Instagram users who voluntarily participated in the research, the researchers suggest that their model for studying teen interactions on Instagram, and the tool they created to securely collect the data, offer an entryway for expanding this important line of inquiry. The team notes that future research could look at how users’ demographic background — age, sex, race, or gender, for example — affects how they seek support on social media. It could also examine how help-seeking behavior differs between social media platforms.
This research was supported by the U.S. National Science Foundation. In addition to Wisniewski, Huh-Yoo and Razi, Diep N. Nguyen and Sampada Regmi, both from Drexel, contributed to the research. Read the full paper here: https://doi.org/10.1145/3544548.3581233