Facebook, the social media giant that also owns Instagram and WhatsApp, recently announced a slew of policy changes to make its platform safer for children, minimize harassment of public figures and thwart online bullying.

The tech company said it will remove harmful content that attacks public figures and provide more protections for those who have become famous involuntarily — like human rights defenders and journalists.

“We do not allow bullying and harassment on our platform, but when it does happen, we act,” Facebook said in a statement. “We also regularly pressure test these policies with our safety experts, making changes as needed.”

Eliminating mass coordinated harassment is another of Facebook’s new policies, which the company said will protect individuals at heightened risk of offline harm, such as victims of violent tragedies and government dissidents.

But this policy also applies to everyday individuals, who can become victims of targeted harassment via direct messages in their inbox or comments on personal profiles or posts.

Facebook said it will require additional information or context to enforce the new policy.

While Facebook doesn’t want to limit discourse about public figures, the company said it is trying to strike a balance between protecting them from abuse and preserving open dialogue.

“Public figures shouldn’t be subjected to degrading or sexualized attacks,” Facebook said.

The company added that it already removes attacks on public figures spanning a wide range of harms, and it will now also remove derogatory, sexualized photoshopped images and drawings; profiles, pages, groups or events dedicated to sexualizing a public figure; and degrading content depicting individuals in the process of bodily functions.

These changes come as Facebook faces a firestorm of criticism, including charges that its platforms threaten users’ mental health, particularly that of minors.

That criticism surfaced recently when former employee turned whistleblower Frances Haugen told a Senate subcommittee on Oct. 5 about internal research showing that the platform and its other products, such as Instagram, psychologically harm teens.

In late September, the company announced plans to pause ‘Instagram Kids,’ a version of the photo-sharing app that would be exclusively for users under the age of 13, adding that it will use this time to work with parents, experts and policymakers to “demonstrate the value and need for this product.”

“We started this project to address an important problem seen across our industry: kids are getting phones younger and younger, misrepresenting their age, and downloading apps that are meant for those 13 or older,” Facebook said. “Critics of ‘Instagram Kids’ will see this as an acknowledgment that the project is a bad idea. That’s not the case,” the company said. “The reality is that kids are already online, and we believe that developing age-appropriate experiences designed specifically for them is far better for parents than where we are today.”

Facebook is also addressing the negative self-image that teens can develop from using its products.

In September, the company announced two ideas it is exploring: encouraging people to look at other topics if they’re dwelling on content that might contribute to negative social comparison, and a feature tentatively called “Take a Break,” where people could put their account on pause and take a moment to consider whether the time they’re spending is meaningful.

“I hear the concerns with this project, and we’re announcing these steps today so we can get it right,” said Adam Mosseri, who leads Instagram.

Sarafina Wright – Washington Informer Staff Writer
