Social Media’s Approach to Cyberbullying


Although social media companies express concern about the growing problem of cyberbullying, many offer little in the way of concrete protections for kids. Mark Zuckerberg, CEO of Meta, has been quoted as saying that it is difficult for his social media giant, Facebook, to police cyberbullying content. The same can be said of many other platforms when it comes to content moderation, which is disconcerting for parents, educators, and lawmakers alike. Below is an overview of a few popular social media platforms and their policies covering cyberbullying and other forms of online harassment:

YouTube

YouTube’s “Harassment & Cyberbullying Policies” page lays out the platform’s anti-bullying rules. In general, the platform does not permit content that threatens an individual or targets someone with prolonged or malicious insults based on intrinsic attributes, such as protected group status or physical traits. The page also outlines a reporting process for anyone who believes content appearing on YouTube violates these policies.

TikTok

TikTok’s extreme popularity has brought it to the forefront of parental attention over the past several years. The platform has an information page on bullying, which defines bullying behavior as involving targeted, repeated efforts to cause “physical, social and/or psychological harm.” The platform states that it does not tolerate any bullying behavior and explains that it wants users to feel safe expressing themselves without fear of being bullied or harassed. The site’s anti-bullying page offers guidance on identifying and preventing bullying, as well as on filtering comments to hide offensive ones.

Instagram

Instagram says that it is committed to protecting its users from bullying. The platform’s anti-bullying page lists a variety of features designed to prevent cyberbullying and protect those targeted by it, such as issuing “Comment Warnings,” providing “Tag and Mention Controls,” and letting users report content that violates Instagram’s “Community Guidelines.”

Facebook/Meta

Facebook states that it has “zero tolerance for any behavior that puts people in danger, whether someone is organizing or advocating real-world violence or bullying other people.” The platform directs users to its “Community Standards” page to understand what types of sharing are allowed and what content may be reported and potentially removed.

Snapchat

Snapchat has a “Safety Center,” which advises any users who feel they are being bullied or harassed to report the incident to the platform through an in-app reporting feature. In addition to reporting an incident, Snapchat recommends that children who feel they are being targeted by a cyberbully tell a parent or another trusted adult. The platform also provides a way to block those who exhibit bullying behavior. Snapchat states that it has partnered with the Crisis Text Line to provide support and resources to U.S. users seeking counseling after a cyberbullying incident.

Twitch

Having started as a streaming app for gamers, Twitch has become enormously popular with young people and, as a result, presents another potential venue for cyberbullying. The platform’s “Community Guidelines” page includes advice on how to stay safe while using the app, as well as what is and is not acceptable behavior by its users. Twitch states that it wants its users to be able to express themselves in a “welcoming and entertaining environment…free of illegal, harmful, and negative interactions.” The platform says it takes a “layered approach” to safety, and in terms of bullying, it appeals to users to “mitigate harassing content that appears on their stream or in their chat.”

Discord

As a free communications app, Discord is attractive to kids looking to chat by voice, video, and text with friends, gaming communities, and game developers. As a social media platform, Discord faces the same issues with cyberbullying and online harassment as many other platforms. The app’s “Safety Center” explains that Discord is committed to being a “welcoming place for all.”

On the topic of cyberbullying, Discord’s “Community Guidelines” spell out how users should behave and make clear that bullying and harassment, among other behaviors, are not allowed. Parents are advised to file a detailed report if their child encounters a violation of any of Discord’s guidelines. Reports of prohibited behavior are reviewed by Discord’s “Trust & Safety” team, which “strives to ensure bad users don’t disrupt your teen’s experience” on Discord. The app also provides a variety of features and tools that give users control over their experience.