Facebook commits to tackling child exploitation and protecting vulnerable communities on its platform


Speaking at the Facebook Asia-Pacific (APAC) Safety Press Briefing webinar, Amber Hawkes, Head of Safety Policy for Facebook in APAC, said that Facebook will continue to crack down on child exploitation content and protect vulnerable communities on the platform.

Hawkes said that Facebook designed the platform to give people control over their own experience, including control over what they share, who they share it with, the content they see, and who can contact them.

The team has also built in a number of additional protections to keep minors safe, including:

· Advertising categories for teens are more limited than for people over 18 years of age.

· New accounts belonging to minors on Facebook are automatically set to share with ‘friends’ only, and their default audience options for posts do not include ‘public.’ If a minor wants to share publicly, they must go to their settings to enable the option, and they are reminded about what posting publicly means.

·  The team keeps face recognition off for minors.

·  Facebook limits who can see or search specific information teens have shared, such as contact information, school, hometown, or birthday.

·  Messages sent to minors from adults who are not friends (or friends of the minor’s friends) are filtered out of the minor’s inbox.

·  The team takes steps to remind minors that they should only accept friend requests from people they know.

·  Location sharing is off by default for minors. When either an adult or a minor turns on location sharing, the team includes a consistent indicator as a reminder that they are sharing their location.

Facebook’s Community Standards provide additional protections for minors in areas such as bullying, harassment and image privacy.

Under its privacy policy, the platform removes images or videos of minors under 13 years old when the content is reported by the minor, a parent, or a legal guardian, and the team has dedicated reporting channels for this content.

Facebook also removes images or videos of a minor between 13 and 18 years old when the content is reported by the minor themselves.

The team also complies with requests from legal guardians to remove attacks on minors who have become unintentionally famous.

“Facebook’s effort to keep people safe is especially important in places where social media can be used to spread disinformation and hate at scale and amplify existing social tensions. This is something the team is very aware of and focused on as a company.

“In early 2018, the team established a dedicated, multi-disciplinary team to better understand and address the way social media is used in countries experiencing conflict. The people on this team have spent their careers studying issues like misinformation, hate speech, and polarisation,” said Hawkes.

Facebook has also worked alongside local groups for input on its products and programmes.
