
"One of the questions we're asked most often is how we decide what's allowed on Facebook", said Monika Bickert, Facebook's vice president of global product management.

"First, the guidelines will help people understand where we draw the line on nuanced issues", she said.

But Facebook has also been accused of failing to block hate speech in Myanmar against the Muslim minority - something that United Nations investigators say fuelled ethnic conflict.

Facebook said its standards evolve over time, based on feedback as well as changes in social norms and language.

However, the underlying principles of safety, voice and equity on which these standards are based have not changed, Facebook emphasised. "We also give people the option to block, unfollow, or hide people and posts, so that they can control their own experience on Facebook".

Facebook announced the publication of the internal guidelines its moderators use, as well as an expanded process for appealing decisions on removed posts. There's no oversight of how Facebook manages its communities, and the content policy that dictates when a post has crossed the line seems to be applied inconsistently, at best. "We also recognize that this is a challenging and sensitive issue", reads the "Integrity and Authenticity" section of the community standards.

She also noted that Facebook's enforcement of its policies is not ideal. "As a first step, today we are launching appeals for posts that were removed for nudity / sexual activity, hate speech or graphic violence". In a small percentage of cases involving terrorist content, when users report a profile, Page or Group, the entire profile, Page or Group is not removed because it does not violate Facebook's policies as a whole; instead, the specific content breaching its standards is removed promptly. Bickert said Facebook's recent privacy travails, which forced CEO Mark Zuckerberg to testify for 10 hours before Congress, did not prompt the guidelines' release now. If a decision is overturned on appeal, the post will be restored and the user notified.

Being more open about its content moderation strategy and similar processes may earn Facebook more support, yet new details about why and how it removes certain content will also likely open up more opportunities for user scrutiny. Come May, Facebook will debut Facebook Forums: Community Standards, a series of public events in the US, UK, Germany, France, India, Singapore and other locations. "And then in the community standards, have that be more dynamic in different places".

What are the consequences for someone found to have posted objectionable content on Facebook?

Facebook now has a counter-terrorism team of 200 people, up from 150 in June 2017. The company hopes the published standards will give people clarity when posts or videos they report aren't taken down.

"That's why we have developed a set of Community Standards that outline what is and is not allowed on Facebook", the company said.

According to the Facebook Newsroom blog, the company defines a terrorist organization as "Any non-governmental organization that engages in premeditated acts of violence against persons or property to intimidate a civilian population, government, or worldwide organization to achieve a political, religious, or ideological aim".

"If a bike shop comes to Facebook wanting to reach female cyclists in Atlanta, we can show their ad to women in Atlanta who liked a Page about bikes", he writes. "You can click on that icon and get information about who that publisher is or who is behind that speech so that you can make a more informed choice", she said.

This broad definition covers everyone from religious extremists and violent separatists to white supremacists and militant environmental groups, among other groups with violent aims.

All interesting and no doubt useful for the average Facebook user to know.

Since the Cambridge Analytica data scandal, Facebook has faced the most tumultuous period in its history.

