WITNESS and the Facebook Trump Suspension
https://www.witness.org/witness-facebook-trump-suspension/
Fri, 12 Feb 2021

WITNESS submitted the following comment to the Facebook Oversight Board on their consideration of the suspension of former President Trump from Facebook and Instagram.

For further discussion on these issues see our recent post: Truth, Lies and Social Media Accountability in 2021: A WITNESS Perspective on Key Priorities

Summary

WITNESS, an international human rights organization helping people use video and technology to defend human rights, notes:

*All of our comments are made in light of the fact that the powers to push Facebook on policy change, on product/technical infrastructure change, on global resourcing, and on Facebook’s response to extralegal political pressures globally have not been granted to the Oversight Board.

*Public figures need greater scrutiny, not less. Account suspension was correct.

*Public interest exceptions should apply to vulnerable speakers, not to those in power who have other options for speech.

*Preservation of critical speech and content can be achieved via evidence lockers.

*Off-platform context and dangerous speech principles are critical to making decisions, not optional.

*Facebook’s rules are not clear to ordinary people: they suffer from inconsistency in application, bias and lack of appeal.

*Global enforcement requires far greater contextual understanding, including beyond majority elites, as well as resourcing for moderation and for civil society globally, and support for content moderation workers. It requires insulation from domestic extralegal pressures that compromise Facebook in countries around the world.

Our Submission

WITNESS (witness.org) is an international human rights organization that helps people use video and technology to promote and defend human rights. We work with human rights organizations, social movements and individual witnesses in over 100 countries who engage in human rights-based activity on Facebook’s platforms, and who face threats from abuse of Facebook’s platforms.

Below we address questions raised by the Oversight Board.

However, we first emphasize that creating an equitable, transparent and human rights-centered approach to content moderation requires power that has not been granted to the Oversight Board. Fully confronting these questions requires from Facebook: a) a commitment to changes in overall policy; b) direct input from this decision-making into both product development and underlying technical infrastructure, including algorithms; c) far more significant human and technical resourcing of, and attention to, countries outside the US and Europe, and to the needs, demands and harms to vulnerable populations both in those countries and in the US and Europe; and d) a concerted effort to insulate country-level Facebook staff and country-level decision-making from political influence and illegitimate government pressure.

The OB asks: If Facebook’s decision to suspend President Trump’s accounts for an indefinite period complied with the company’s responsibilities to respect freedom of expression and human rights, if alternative measures should have been taken, and what measures should be taken for these accounts going forward: More often than not, world leaders who incite violence and hatred online (and share harmful misinformation and disinformation) get away with it for too long. Human rights activists have consistently documented this in a range of global contexts, noting situations involving leaders in the USA, Brazil, India, and the Philippines. A decision to suspend former President Trump’s account is too late, not too early, as it was with other world leaders, e.g. Senior General Min Aung Hlaing in Myanmar. However, a clear, consistent, transparent process for providing warnings, for appropriately applying earlier temporary account suspensions or content removals, and ultimately for permanently suspending accounts — all with right of appeal — is important.

The OB asks: How Facebook should treat the expression of political candidates, office holders, and former office holders, considering their varying positions of power, the importance of political opposition, and the public’s right to information: Facebook’s explicit provision of a newsworthiness exception for all politicians’ speech has provided cover for leaders to share false information or incite hate, and for Facebook to act inconsistently. When it comes to incitement to hate, or the sharing of harmful misinformation (for example on COVID-19), leaders should be subject to greater scrutiny when they push boundaries on platforms, not less. Newsworthiness exceptions and related public interest protections for posts or speakers do have a place: in protecting critical evidence of rights violations and vulnerable speakers within the public sphere, rather than leaders who have other options for public speech, and who have generally been given ‘the benefit of the doubt’. Considerations of protecting important information from an archival perspective can be fulfilled by preserving content that has been shared on the platform but not making it public, via evidence lockers.

The OB asks: How Facebook should assess off-Facebook context in enforcing its Community Standards, particularly where Facebook seeks to determine whether content may incite violence: Facebook should assess off-platform context if the genuine purpose of intervention is to prevent violence rather than provide policy loopholes for politicians to jump through, and if the company is legitimately trying to enforce its rules in accordance with human rights standards. This off-platform context provides information to help ascertain and be clear on the real-world impact of online speech, and whether this impact justifies curtailing that speech. This must be complemented with real-world resourcing and responsiveness to civil society globally, particularly to groups vulnerable to dangerous speech from a politician. The Dangerous Speech Project provides excellent guidance on this approach.

The OB asks: The accessibility of Facebook’s rules for account-level enforcement (e.g. disabling accounts or account functions) and appeals against that enforcement: Facebook’s rules are not clear for ordinary people. For a decade, WITNESS’s partners and human rights defenders around the world have complained about take-downs of accounts and content without clarity or with apparent bias. Facebook should be transparent about how decisions are made, both for leaders and for ordinary users, and hew to human rights principles of proportionality, legitimacy and specificity rather than over-broad, inconsistent deplatforming. The Santa Clara Principles for content moderation and the 2018 recommendations of Professor David Kaye, the former UN Special Rapporteur on Freedom of Opinion and Expression, provide clear roadmaps, generally accepted within the human rights community, for how to do this.

The OB asks: Considerations for the consistent global enforcement of Facebook’s content policies against political leaders, whether at the content-level (e.g. content removal) or account-level (e.g. disabling account functions), including the relevance of Facebook’s “newsworthiness” exemption and Facebook’s human rights responsibilities: Consistent global enforcement is essential. It must be adequately resourced, with worker protections for vulnerable content moderation workers subject to trauma (see the work of Professor Sarah Roberts). It must be done with a clear understanding of language and cultural context that is informed not only by majority elites in countries, but also by the diversity and representation of historically marginalized communities in particular countries. Facebook must act quickly when policy decisions in particular countries are impacted by domestic political pressures outside of law or platform rules. Facebook must invest more money in content moderation and more resources in supporting global civil society advocates and entities who act as watchdogs. Otherwise rules will be applied inconsistently and reinforce trends toward US and European exceptionalism in content policy.

A newsworthiness exception should apply far more to protecting critical evidence of rights violations and vulnerable speakers within the public sphere than to leaders who have other options for public speech, and who have generally been given ‘the benefit of the doubt’. Considerations of protecting important information from an archival perspective can be fulfilled by preserving content that has been shared on the platform but not making it public; these “evidence lockers” provide access to critical information for accountability purposes under privacy-preserving conditions.

Press release: Major Human Rights and Internet Watchdog Organizations Sign On to Demands for #AuditFBIndia As Pressure Builds
https://www.witness.org/press-release-major-human-rights-and-internet-watchdog-organizations-sign-on-to-demands-for-auditfbindia-as-pressure-builds/

9 September 2020

Contact:

Dia Kayyali, dia@witness.org,

Heidi Beirich, heidi@globalextremism.org

Global and national groups call on Mark Zuckerberg to hold Facebook India accountable

In early August, the Wall Street Journal published an exposé of Facebook India, with evidence that Ankhi Das, the head of Public Policy for Facebook India, South and Central Asia, exhibited political bias by suspending the community guidelines when it came to genocidal hate speech. The article has been followed by myriad press reports in the Wall Street Journal, Reuters, TIME Magazine, and more, detailing bias and a failure to address dangerous content at the Facebook India office. This week, a wide range of civil society organizations from around the world, including WITNESS, Free Press, the Global Project Against Hate and Extremism, MediaJustice, the Southern Poverty Law Center, and the Islamic Women’s Council of New Zealand, have signed an open letter calling on Mark Zuckerberg to work with civil society to address dangerous content on Facebook, ensure that a thorough audit of Facebook India takes place, and place Ankhi Das on administrative leave while the audit is being conducted.

The timeline of Facebook’s complicity in genocide goes back to 2013, when hate speech on the platform fuelled the Muzaffarnagar riots. It has continued unabated. In 2020, content like #Coronajihad has spread as quickly as COVID itself, and has led to real-world violence against many. Facebook itself admitted that it was used to incite genocide against Rohingya Muslims in Myanmar. An Indian parliamentary panel on information technology questioned Facebook on September 2 and will do so again, following demands made by ex-civil servants and Facebook India employees. But Facebook shouldn’t wait to be forced to take action. It should publicly state what steps it is taking to address its tragic failures in India.

Here’s what civil society is saying:

Anjum Rahman, Project Lead for Inclusive Aotearoa Collective (New Zealand) and member of the Global Internet Forum to Counter Terrorism Independent Advisory Committee, said:

The activities and actions of Facebook staff in India show the danger of political parties and tech companies colluding to undermine the democratic process.  It shows the result of not having independent, transparent regulatory systems in place to oversee the activities of companies which have significant impact on the wellbeing and safety of millions of people.  We have already seen evidence of that harm across the world, whether in Myanmar, India, or New Zealand.  The international community needs to come together to ensure urgent action in regulating the behaviour and activities of these companies.

Dia Kayyali, Program Manager, tech + advocacy at WITNESS, said:

Facebook has been warned about the offline violence enabled by its platform in places like India. From the United Nations Special Adviser on the Prevention of Genocide to civil society organizations, the alarm has been raised. So why is dangerous Islamophobic content that is directly linked to real-world harm being allowed to persist? Even before the Wall Street Journal article, it seemed to many of us in the global community that it’s about profit in an important market, but Facebook’s business model can’t be reliant on ignoring warning signs of genocide. Facebook must take real action now, not just apologize or even get rid of one or two executives. This is your chance, Facebook. Conduct the most thorough investigation you’ve done yet. Make the results as public as possible. And work with civil society to stem the flow of the bloody, harmful content on your platform.

Heidi Beirich, Ph.D., EVP at the Global Project Against Hate and Extremism, said:

It’s high time Mark Zuckerberg and Facebook take anti-Muslim hatred seriously and change how its policies are applied in Asia and across the world. The scandal in the Indian office, where anti-Muslim and other forms of hatred were allowed to stay online due to religious and political bias, is appalling and the leadership in that office complicit. Hatred fomented on Facebook has led to violence and the most terrifying crime, genocide, against Muslims and other marginalized populations across the region, most notably in Myanmar. Anti-Muslim content is metastasizing across the platform as Facebook’s own civil rights audit proved. Facebook must put an end to this now.

Sana Din from Indian American Muslim Council said:

Facebook allowed incendiary Islamophobic content even after they were informed that it was leading to genocidal violence. From Muzaffarnagar to Delhi, Indian Muslims and millions of other caste-oppressed minorities cannot wait for change. Facebook needs to act now. They cannot evade their direct role in supporting genocidal hate speech at Facebook India, and the only remedy to this harm is an audit of caste and religious bias.

Steven Renderos, executive director of MediaJustice, said:

As a global company with over 3 billion followers, Facebook has the unprecedented power of affecting users both on and off their platform. Counter to Mark Zuckerberg’s public aspirations of creating an inclusive platform, Facebook has become the tool of choice around the world to escalate violence around race, caste, and religion. As recent history in Myanmar has taught us, the consequences of not preventing hate speech from going viral on their platform translate into actual violence and genocide for some. This is not merely about a company struggling to address hateful activities at scale, this is a result of people Facebook has entrusted to represent its interest around the world. Nowhere is this more true than with Ankhi Das and Facebook India.

You can read the letter here.

Program Manager Dia Kayyali’s advice for Mark Zuckerberg featured in The Guardian
https://www.witness.org/program-manager-dia-kayyalis-advice-for-mark-zuckerberg-featured-in-the-guardian/
Mon, 14 Jan 2019

Every January since 2009, Mark Zuckerberg, CEO of Facebook, publicly shares his goals for the year. Over the past decade, as Facebook has grown in influence and notoriety, his “personal challenges” have mirrored the weight and responsibility of the tech giant. A far cry from earlier declarations like promising to dress more like an adult, Zuckerberg’s resolutions have become far more consequential, not just for himself and his company, but for all of us. In 2018, in the wake of security issues, misinformation, election scandals, and more, Zuckerberg pledged “to focus on fixing these important issues.” Many believe 2018 to be the first year he failed to accomplish his personal challenge.

However, ahead of this year’s formal declaration of his commitments, The Guardian asked technology experts, policymakers, and activists two questions:

  • What do you predict Mark Zuckerberg’s 2019 personal challenge will be?
  • What do you think Mark Zuckerberg’s 2019 personal challenge should be?

WITNESS’ Tech Advocacy Program Manager, Dia Kayyali, was one of the experts asked to predict and advise. Here’s what they had to say:

Will be: Some other, similarly broad, challenge that relates to making Facebook a force for good in the world.

Should be: Take personal responsibility for turning Facebook around as a company. That means publicly committing to creating an ethical and principled company that respects civil society, and ensuring that at every level Facebook makes decisions based on human rights instead of market forces. It means personally committing to a Facebook that doesn’t accidentally make decisions that aid violent regimes, white supremacists and other bad actors. Above all, it means simply being honest about Facebook’s largely detrimental role in global society. That would be the biggest challenge of all.

Shortly after The Guardian ran this piece, Zuckerberg shared his 2019 personal challenge. Following another terrible year for Facebook, Zuckerberg pledged “…to host a series of public discussions about the future of technology in society — the opportunities, the challenges, the hopes, and the anxieties.”

Unfortunately, Dia’s prediction was pretty spot on.

Dia leads WITNESS’ Tech Advocacy program which engages technology companies and supports digital policies that help human rights advocates safely, effectively, and ethically use technology for good. The program includes direct, sustained advocacy to those in leadership positions at companies to ensure that anyone, anywhere can use the power of technology to protect and defend human rights.

Delete Facebook? Not just yet.
https://www.witness.org/delete-facebook-not-just-yet/
Tue, 03 Apr 2018

By Dia Kayyali, WITNESS Program Manager, Tech+Advocacy

In light of the scandal surrounding Facebook over the last few weeks, we at WITNESS think it’s important to make something clear: Facebook has responsibilities to its users, and ultimately to all those affected by its product. Facebook has responsibilities around how it handles user data and user content. It’s time for Facebook to be more transparent, to provide more due process, and to give users more control. It’s time for Facebook to uphold human rights. It’s not time for WITNESS to delete Facebook—yet. But it’s clear something needs to change.

In the wake of the Cambridge Analytica scandal, and as the changes to European Union privacy law known as the General Data Protection Regulation (GDPR) approach, it’s good to see Facebook’s irresponsible data practices finally getting the public scrutiny they deserve. But it’s equally essential that the role of Facebook as a forum for expression not be forgotten. For many people around the world, Facebook isn’t just the primary way that they connect and share information; it is THE INTERNET, and Facebook seems intent on expanding its territory as quickly as possible, constantly acquiring other companies like WhatsApp and launching new products. Facebook content, from propaganda videos to posts from citizen journalists, is increasingly shaping the world. Telling users to delete Facebook ignores the fact that doing so is a privilege that not all people can afford.

That’s why WITNESS continues to push for a better Facebook through our Tech + Advocacy program. As always when we see an issue that affects the people we work with, we’ve got some recommendations for Facebook: 

What needs to change? More transparency and more (due) process

At the most fundamental level, Facebook needs to be more transparent about data practices and content regulation. When it comes to sharing user data, Facebook needs to start by implementing the “affirmative express consent” requirements of a 2011 settlement with the Federal Trade Commission—the GDPR also contains clear requirements on consent. Facebook must ensure that users can understand how their data, from location information to sensitive information about medical conditions determined through their likes, could be shared. This includes informing users of how other users could expose or share their data, as was the case in the Cambridge Analytica leak (in that case, users who took a quiz shared information about their friends as well.)

In the wake of the Cambridge Analytica story, blogs and newspapers published dozens of articles with step-by-step instructions on how to control user data. But this shouldn’t be necessary. Instead, Facebook should take steps like creating plain-language explanations of its practices and all potential avenues of data sharing and use, making them easy to find and read on the site, and putting them in every user’s feed for review.

When it comes to content, the site needs to start being clear about what exactly it is taking down and why. At the macro level, the company should include information on takedowns in transparency reports. At the micro level, it needs to provide users with notices and communications that are specific and easy to understand. WITNESS knows this from the many hours we have spent working with users who do not understand why their content was taken down. They receive notices that give them little to no information. For example, Palestinian activists sharing hundreds of videos of human rights abuses on their accounts could have their entire profiles suspended because of only 2-3 videos. Exactly how many violations would lead to account suspension isn’t clear, and activists won’t get a specific warning that their account is about to get shut down. Instead of explaining exactly how videos have violated the standards, Facebook will simply send a generic statement saying content violated community standards because of violent and graphic content. Sometimes, it feels like someone at Facebook is just flipping a coin.

But unfortunately, what is often happening is that a powerful government is working to silence activists on the platform through its contacts at Facebook. That’s why a major part of transparency for Facebook is being clear about its relationships with governments, especially those governments that manipulate the company’s rules to silence activists on the platform. Instead of defensive denials about its meetings with government officials, the company should publish statements about whom it meets with and what it discusses.

Facebook also needs better processes around both content regulation and use of data. For content regulation, the platform should provide clearer processes for users who want to appeal a decision like an account termination or removal of a piece of content. These processes should provide some basic due process for Facebook users facing Facebook decisions, including proper notice of violations and evidence for violations, the chance for users to provide their own evidence, and a clear written explanation for the results of such a process.

Similarly, users deserve more process when it comes to their data. Under the GDPR, Facebook will have to hire a “Data Protection Officer”, and users “may contact the data protection officer with regard to all issues related to processing of their personal data and to the exercise of their rights under [the GDPR].” Facebook should provide DPOs for everyone, not just EU citizens. Facebook should also provide better notifications about breaches and warnings about actions it plans on taking—such as the change to WhatsApp terms of service that allowed data sharing between WhatsApp and Facebook. Lastly, Facebook should provide all users with the ability to easily access the information the platform has about them, including information from other services like WhatsApp. Currently, users can go into Facebook settings and see some of this information, such as likes the company is using for advertising purposes. But users wouldn’t be able to see and delete, for example, a list of all the locations at which Facebook has ever logged them.

“If you don’t like it, just leave”

WITNESS partners with and educates human rights defenders all over the world. And all over the world, those human rights defenders are using Facebook and its products to organize, communicate, and educate. For example, as state violence and brutal political repression surge in Brazil, activists and residents of favelas targeted by this violence continue to use Facebook as a news service, a method of sharing files, an alert system for information about violence and other urgent news, and a place to share “know your rights” information. And as we know from our regional work in Meso and South America, Africa, the Middle East/North Africa, and Southeast Asia, when activists aren’t using Facebook, they’re likely to use WhatsApp to share incredibly sensitive media, oftentimes captured in places like Syria where there are no ordinary journalists.

It’s easy for people in positions of privilege—especially those with easy access to many forms of communication, those with economic privilege, and those who aren’t facing serious human rights abuses on a daily basis—to excuse Facebook’s behavior by saying something along the lines of “Facebook is a business providing a free service. Its business model depends on what might seem like the unethical use of user data to you—but that use is legal. If people don’t like it, they shouldn’t use the platform.”

The economic and Internet-access barriers to deleting Facebook are clear. Telling people to get off of Facebook doesn’t take into account mobile service packages with special Facebook and WhatsApp data plans that can save people money. It also doesn’t take into account that Facebook often works where other services and products continue to fail, especially in places with poor Internet connections.

But WITNESS knows better. It would be easy to write thousands of words about Facebook’s problems, and many of our fellow civil society organizations have. We have also warned about the privacy, security, and expression problems presented by Facebook and associated tools. But the fact is that human rights defenders are using Facebook, and will continue to do so even if we advise them not to (which would be irresponsible, for the reasons listed above).

We support efforts to develop and make truly usable alternatives to major platforms. But in the meantime, we will continue to fight for a Facebook that respects, protects, and supports human rights defenders, as we have been doing through our Tech + Advocacy program. We will keep a close eye on policy and legal developments, and we will join the fight where appropriate.

We know that the decisions Facebook and other platforms make have a huge impact on people’s lives, online and offline, for better or for worse. That is why our Tech + Advocacy work is more important now than ever. It isn’t just people’s privacy at stake, it’s also their physical safety, their right to assemble, and their freedom of expression.
