Speaking at the Facebook Asia-Pacific (APAC) Safety Press Briefing webinar, Amber Hawkes, Head of Safety Policy for Facebook in APAC, said that Facebook continues to approach social media safety through four major areas:
The social media app regularly engages with over 500 safety partners around the world. These partners inform Facebook's work and collaborate with the platform to deliver programmes.
“In Malaysia, Facebook has partnered with the Mental Illness Awareness and Support Association (MIASA) Malaysia, a non-governmental organisation (NGO) that provides counselling, assessment and various forms of support to people experiencing mental well-being issues.
A safety advisory board has also been established by Facebook in the Asia-Pacific (APAC) region to provide input on Facebook’s security and safety efforts,” Hawkes explained.
Facebook’s community standards outline what is and is not allowed on the social media app. The community standards cover areas such as:
· Violent and Criminal Behaviour.
· Suicide and Self Injury.
· Child Sexual Exploitation.
· Sexual Exploitation of Adults (including the non-consensual sharing of intimate images, and sextortion).
· Adult Nudity and Sexual Activity.
· Bullying and Harassment.
· Human Exploitation (including human trafficking).
· Privacy Violations.
· Hate Speech, Violent and Graphic Content.
· Cruel and Insensitive Content.
· Misrepresentation (using an inauthentic identity).
Facebook’s content reviewers typically review more than two million pieces of content every day in dozens of languages, with the ability to review content in regional dialects in many countries.
“A network of local civil society partners around the world can reach out to Facebook via dedicated channels to alert the social media app to emerging issues and provide essential context,” she added.
Security check-up helps users get alerts when someone tries logging into their account from an unrecognised computer or mobile device, learn how to protect their password, and enable two-factor authentication.
Privacy check-up helps users review who can see what they share, how to keep their account secure, how other people can find them on Facebook, and their data settings on the social media app.
Wellbeing tools on Facebook allow users to see how much time they have spent on the app, and the “You’re All Caught Up” feature lets users know when they have seen every post from the last two days.
The remove follower feature allows public or private accounts to remove people from their followers list at any time.
Restrict is a feature that, once enabled, makes comments on a user’s posts from a restricted person visible only to that person.
Comment filters let users manage the comments on Instagram. Users can turn off comments entirely, delete and report abusive comments, or block certain people from commenting.
With Safety Notifications and Reporting, when a group is reported to the WhatsApp security team, the last five messages are included as part of the report.
If the user confirms “report and block”, the content of the message is transmitted in plaintext to WhatsApp so the platform can analyse it.
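The reporting flow described above can be sketched as follows. The names (`Message`, `build_report`) and the payload shape are assumptions for illustration, not WhatsApp's actual API; the point is that only the last five messages, decrypted on the reporter's own device, travel with the report.

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    ciphertext: str   # messages are end-to-end encrypted in transit
    plaintext: str    # only the reporting user's device holds the decrypted text

def build_report(chat_history: list[Message], reporter: str) -> dict:
    """Attach the last five messages to the report, as described above."""
    recent = chat_history[-5:]
    return {
        "reporter": reporter,
        # On "report and block", the decrypted content is sent to WhatsApp
        # so the safety team can analyse it.
        "messages": [{"sender": m.sender, "text": m.plaintext} for m in recent],
    }
```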
The team has a “Contact Us” channel found in the Settings menu, and a dedicated global team that reviews and acts on these reports.
WhatsApp is also researching ways to encourage people to report. For instance, when someone exits a group or deletes a contact, a pop-up could ask if they would like to make a report.
Labelling and forward limits were introduced to give people essential context when they receive a message that has been shared multiple times. WhatsApp also reduced the number of chats a message can be forwarded to at once to just five.
Highly forwarded messages are limited further and can only be forwarded to one chat. This has resulted in a 70 percent reduction in the number of highly forwarded messages on WhatsApp.
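A minimal sketch of the forwarding rules just described, assuming a simple cut-off for what counts as "highly forwarded"; the constants and function name are illustrative, not WhatsApp's internal values.

```python
FORWARD_LIMIT = 5               # ordinary messages: at most five chats at once
HIGHLY_FORWARDED_LIMIT = 1      # highly forwarded messages: one chat at a time
HIGHLY_FORWARDED_THRESHOLD = 5  # assumed cut-off for "highly forwarded"

def max_forward_targets(times_forwarded: int) -> int:
    """How many chats a message may be forwarded to in a single action."""
    if times_forwarded >= HIGHLY_FORWARDED_THRESHOLD:
        return HIGHLY_FORWARDED_LIMIT
    return FORWARD_LIMIT
```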
Addressing abuse, WhatsApp identifies and bans accounts with abnormal behaviour patterns and is now banning two million accounts per month. WhatsApp can proactively detect 75 percent of these without relying on user reports.
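As a toy illustration of the kind of abnormal-behaviour signal such proactive detection might use, the check below flags accounts by messaging rate. The threshold and logic are assumptions for illustration only, far simpler than whatever WhatsApp's real systems do.

```python
def should_flag(messages_sent: int, window_minutes: int,
                rate_threshold: float = 100.0) -> bool:
    """Flag an account whose messages-per-minute rate is abnormally high
    (hypothetical threshold; real systems combine many signals)."""
    rate = messages_sent / max(window_minutes, 1)
    return rate > rate_threshold
```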
· Facebook Safety Centre
The Facebook Safety Centre has sections dedicated to parents, youth, online wellbeing and bullying. The team is continuing to add other specialised sections, such as one on the non-consensual sharing of intimate images.
“Also, in the Facebook Safety Centre, users can download a variety of guides in different languages on key safety topics including women’s safety, LGBTQI safety, seniors’ safety, school safety and more,”
“All these guides are produced in partnership with third-party experts, and have been designed for use by individuals, organisations, educators, and caregivers,” said APAC’s Head of Safety Policy for Facebook.
· Instagram Safety Centre
The Instagram Safety Centre has dedicated sections on online bullying, a guide for parents and details on the various safety programmes Instagram leads and supports.
Facebook spent US$3.7 billion on safety and security in 2019, more than the company’s entire revenue at the time of its Initial Public Offering (IPO).