As lawmakers and advocacy groups call for more child and teen safety online, tech giants are under pressure to provide more protection for minors.
In its latest effort to address the issue, Google last week unveiled a policy that gives minors more control over access to their images through the platform’s search tool.
Kids and teens, along with their parents and guardians, can request removal, in a few online steps, of their images from search results.
As a result, these images won’t appear in the images tab or as thumbnails in any feature in a Google search, said the company in a blog post.
The company added that removing an image from Google results doesn’t remove it from the internet, and that people should contact a site’s webmaster to ask that the content be taken down there as well.
“We believe this change will help give young people more control over their digital footprint and where their images can be found on Search,” Google said.
In October, Facebook, another tech giant, announced it would do more to protect kids on its platforms, including Instagram and WhatsApp, from harm and bullying.
The added safety measures also extended to politicians and public figures, protecting them from unwanted harassment, particularly harassment of a sexual nature.
“We do not allow bullying and harassment on our platform, but when it does happen, we act,” Facebook said in a statement. “We also regularly pressure test these policies with our safety experts, making changes as needed.”

Eliminating mass-coordinated harassment is another of Facebook’s new policies. The company added this would protect individuals at heightened risk of offline harm, such as victims of violent tragedies or government dissidents.
But this policy also applies to everyday individuals, who can become victims of targeted harassment via direct messages in their inbox or comments on personal profiles or posts.
These changes come as Facebook faces a firestorm, including charges that its platforms threaten its users’ mental health, particularly minors.
That criticism surfaced recently when former employee turned whistleblower Frances Haugen told a Senate subcommittee on Oct. 5 about internal research showing that the platform and its apps, such as Instagram, psychologically harm teens.
The same day, following Haugen’s testimony, the National Center on Sexual Exploitation (NCOSE) called on Congress to find solutions that hold Facebook and other tech companies accountable for protecting children.
“Time and again, Big Tech has proven that it cannot regulate itself, and therefore Congress must step in,” NCOSE said. “Otherwise, predators and pimps will continue to take advantage of the dangerous tech ecosystems to identify and exploit teens and children.”