Psychology experts are urging social media giants to increase transparency around algorithms to protect users’ mental health

In a new article published in the journal Body Image, a team of psychology researchers outlines a mountain of evidence linking social media use to body image issues. The researchers describe how algorithms can intensify this association and prod social media companies to take action.

Appearance-based social media platforms like TikTok appear to be particularly harmful to users’ body image. On these platforms, teens are constantly exposed to filtered and edited content that presents unrealistic body standards. According to recent evidence, this distorted environment increases users’ risk of body dissatisfaction and adverse conditions such as body dysmorphia and eating disorders.

“I am interested in risk and protective factors for body image, and some of my recent research has focused on the role of social media,” explained lead author Jennifer A. Harriger, professor of psychology at Pepperdine University. “I became interested in the use of algorithms by social media companies after whistleblowers exposed evidence showing that companies were aware of the harm their platforms were causing to young users. This article was written as a call to social media companies, researchers, influencers, parents, educators, and clinicians. We need to do better to protect our youth.”

In their report, Harriger and her team explained that these effects may be exacerbated by social media algorithms that personalize the content displayed to users. These algorithms “rabbit-hole” users into increasingly extreme and less moderated content and are designed to keep them on the platform.

Importantly, the harm caused by these algorithms is not unknown to social media companies, as evidenced by recent whistleblower testimony. Frances Haugen, a former Facebook employee, leaked documents revealing that the social media giant was aware of research linking its products to mental health and body image problems among teens. Later, a whistleblower from TikTok leaked evidence of an algorithm that carefully manipulates the content displayed to users, prioritizing emotional content in order to keep them engaged.

“Social media platforms offer valuable opportunities to connect with others, and users have the ability to customize their own experiences (choosing which content to follow or interact with). But social media platforms also have drawbacks. One such drawback is companies’ use of algorithms designed to keep the user engaged for longer periods of time,” Harriger told PsyPost.

“Social media companies are aware of the harm caused by their platforms and their use of algorithms but have not made efforts to protect users. Until these companies become more transparent about the use of their algorithms and provide opportunities for users to opt out of content they do not wish to view, users are at risk. One way to reduce risk is to only follow accounts that have positive effects on mental and physical health and to block provocative or negative content.”

In their article, Harriger and colleagues outline recommendations to combat these algorithms and protect the mental health of social media users. First, they assert that the main responsibility lies with the social media companies themselves. The authors echo suggestions made by the Academy for Eating Disorders (AED), noting that social media companies should increase the transparency of their algorithms, take steps to remove accounts that share harmful content, and make their research data more accessible to the public.

The researchers add that social media platforms should disclose to users why they choose the content they see in their feeds. They should also limit micro-targeting, which is a marketing strategy that targets specific users based on their personal data. Furthermore, these companies are socially responsible for the well-being of their users and should take steps to raise awareness of weight stigma. This can be done by consulting body image experts and eating disorder experts on ways to encourage positive body image among users, possibly by promoting body positive content on the platform.

Influencers, too, can play a role in shaping their followers’ body image and well-being. Harriger and colleagues suggest that influencers should also consult body image experts for guidance on body-positive messaging. Positive actions may include informing the public about social media algorithms and encouraging followers to combat the negative effects of the algorithms by following and interacting with body-positive content.

Researchers, educators, and clinicians can study ways to prevent the negative impact of social media on body image. “It is difficult to conduct empirical research on the impact of algorithms, because each user’s experience is personally tailored to their interests (e.g., what they have clicked or viewed in the past),” Harriger noted. “However, research could examine the use of media literacy programs that address the role of algorithms and provide young users with tools to protect their well-being while they use social media.”

This research can help inform social media literacy programs that teach teens about advertising on social media, encourage them to use critical thinking when engaging in social media, and teach them strategies to increase the positive content displayed in their feeds.

Parents can teach their children positive social media habits by modeling healthy behavior with their electronic devices and setting rules and limits around their children’s social media use. They can also host discussions with their children on issues such as photo editing on social media and algorithms.

Overall, the researchers concluded that social media companies have the ultimate responsibility to protect the well-being of their users. “We emphasize that system-wide change must occur so that individual users can effectively fulfill their role in maintaining their body image and well-being,” the researchers report. “Social media companies need to be transparent about how they deliver content if algorithms continue to be used, and need to provide users with clear ways to easily opt out of content they don’t want to see.”

The study, “The dangers of the rabbit hole: Reflections on social media as a gateway to a distorted world of modified bodies, the dangers of eating disorders, and the role of algorithms,” was written by Jennifer A. Harriger, Joshua A. Evans, J. Kevin Thompson, and Tracy L. Tylka.

