Social media giants Facebook and Instagram have announced that they are ramping up efforts to make their platforms safer and more inclusive.
The social media platforms said this week that they have put in place more preventive measures and plan to take action on all content that goes against their community standards, which cover harassment and bullying, organised hate, hate speech, and suicide and self-injury content.
Instagram was forced to revisit its community standards policy recently as bullying and harassment continued to spike on the app.
The social media giant announced this week that violators would be permanently banned, with the ban applying to the account used for the harassment and any related linked accounts. The firm has come under heavy pressure in recent weeks after a welter of cases in which Premier League footballers received racist abuse.
From now on, anyone found to have sent abusive messages will have their accounts axed rather than suspended.
While Instagram already has sanctions and rules in place for abusive comments, it seems the rules fell into a “grey area” when applied to the direct messages of its users.
Facebook, too, says it will be clamping down on users who violate its Community Standards policies.
Last week, Facebook released its Community Standards Enforcement Report. During the fourth quarter of 2020, Facebook took action on 6.3 million items of bullying and harassment content, 6.4 million pieces of organised hate content, and 2.5 million pieces of suicide and self-injury content.
During the same period, Instagram took action on 5 million pieces of bullying and harassment content, 308,000 pieces of organised hate content, 6.6 million pieces of hate speech content, and 3.4 million pieces of suicide and self-injury content.
A Facebook company spokesperson, who did not want to be named, told the Saturday Star that the social media giants would be coming down hard on those found guilty of violating their community standards policy.
“We have clear rules against hate speech and bullying and harassment on our platforms and take action whenever we find it,” said the spokesperson.
“Hate speech has no place on our platforms. Period. We use a combination of technology and people to find and remove this content, with 15 000 dedicated content reviewers based around the world and sophisticated technology that we’re constantly working to improve.
“Our Community Standards Enforcement Report, which we publish every quarter, shares global data on how we’re doing at enforcing our policies – where we’ve made progress and where we still have to improve, and this is important to hold us accountable.”
While the social media giants were unable to provide exact figures on how prevalent bullying, harassment, racism and hate speech are on their platforms in South Africa, they said there has been significant improvement compared with past years.
“While we do not provide country-by-country breakdowns or examples, our hate speech policy prohibits attacks against people based on their protected characteristics, including race, religion, nationality, sex, gender identity, sexual orientation, severe disability and disease.
“We also define ‘attack’ as violent or dehumanising speech, statements of inferiority, expressions of contempt or disgust, cursing, and calls for exclusion or segregation. We also strengthened these policies last year to also prohibit harmful stereotypes.”
Those found guilty of violating the community standards policy could lose their accounts altogether.
“When we become aware of content that breaks our rules, we remove the content and let the person who posted it know that they’ve broken our rules.
“We also send them information about our policies to help them avoid sharing more violating content in future. If people continue to break our rules, we take stronger measures, which can include limiting people from using certain features, and ultimately losing their account altogether.”
Posting hate speech on social media platforms like Facebook and Instagram can also land an individual in prison, depending on the country's laws, said the Facebook company spokesperson.
“Each country has their own local laws which citizens must abide by. We have invested in technology, processes and people to help us act quickly, so that violations of policies affect as few people as possible.”
Facebook and Instagram say they are working continuously to become more efficient at enforcing their community standards.
“Our goal is to get better and more efficient at enforcing our Community Standards. We do this by increasing our use of Artificial Intelligence (AI), by prioritising the content that could cause the most immediate, widespread, and real-world harm, and by coordinating and collaborating with outside experts,” said Kojo Boakye, Director of Public Policy, Africa.