August 6, 2018

Mark Zuckerberg
Chief Executive Officer
Facebook
1 Hacker Way
Menlo Park, CA 94025

Dear Mr. Zuckerberg,

We write to propose that Facebook amend its terms of service to create a safe harbor for certain journalism and research on its platform. As you know, Facebook influences public discourse in ways that are not fully understood by the public or even by Facebook itself. Journalists and researchers play a crucial role in illuminating this influence. Facebook’s terms of service, however, severely limit their ability to do that work, by prohibiting them from using basic tools of digital investigation on Facebook’s platform.

We have attached a proposed amendment to the terms of service that would establish a safe harbor for certain kinds of journalism and research while appropriately protecting the privacy of Facebook’s users and the integrity of Facebook’s platform. The safe harbor is limited by design, and adoption of the proposed amendment would not substitute for disclosure of information to journalists, researchers, and the general public through other channels. We believe, however, that Facebook’s establishment of the safe harbor would meaningfully expand the space for digital journalism and research that is especially urgent.

Because we are sending this letter on behalf of journalists and researchers who would like to pursue projects that are manifestly in the public interest but are barred by Facebook’s terms of service, there is some urgency to our proposal. We would appreciate a response to this letter by September 7, 2018.1

1 We are sending this letter on behalf of Kate Conger, a general assignment technology reporter for the New York Times; Cameron Hickey, an Emmy Award–winning journalist and documentarian who has produced science and technology stories for the PBS NewsHour; Kashmir Hill, a senior reporter for the Special Projects Desk at Gizmodo Media Group; Arvind Narayanan, Associate Professor of Computer Science at Princeton University; and Aviv Ovadya, former Chief Technologist at the Center for Social Media Responsibility, University of Michigan School of Information (institutional affiliations are provided for identification purposes only).

* * *

Facebook has a singular influence on public discourse in the United States and globally. Facebook has more than two billion users, including two hundred million in the United States. These users rely on Facebook to connect with each other as well as with businesses, advocacy organizations, and governments. They depend on Facebook for news, including news about politics, political candidates, and elections. Facebook shapes how individuals engage with one another and with the communities around them, with profound implications for society. For this reason, there is extraordinary public interest in how Facebook works—for example, what relationships it encourages or discourages, what ways of interacting it permits or prohibits, what information it highlights or suppresses, and what communities it facilitates or forecloses.

Digital journalism and research are crucial to the public’s understanding of Facebook’s platform and its influence on our society. Many of the most important stories written about Facebook and other social media platforms in recent months have relied on basic tools of digital investigation.
For example, research published by an analyst with the Tow Center for Digital Journalism, and reported in The Washington Post, uncovered the true reach of the Russian disinformation campaign on Facebook. An investigation by Gizmodo showed how Facebook’s “People You May Know” feature problematically exploits “shadow” profile data in order to recommend friends to users. A story published by ProPublica revealed that Facebook’s self-service ad platform had enabled advertisers of rental housing to discriminate against tenants based on race, disability, gender, and other protected characteristics. And a story published by the New York Times exposed a vast trade in fake Twitter followers, some of which impersonated real users.2

2 See Nicholas Confessore, Gabriel J.X. Dance, Richard Harris & Mark Hansen, The Follower Factory, N.Y. Times, Jan. 27, 2018; Kashmir Hill & Surya Mattu, Keep Track of Who Facebook Thinks You Know with This Nifty Tool, Gizmodo, Jan. 10, 2018; Craig Timberg, Russian Propaganda May Have Been Shared Hundreds of Millions of Times, New Research Says, Wash. Post, Oct. 5, 2017 (citing research by Jonathan Albright); Julia Angwin & Terry Parris, Jr., Facebook Lets Advertisers Exclude Users by Race, ProPublica, Oct. 28, 2016.

Facebook’s terms of service limit this kind of journalism and research because they ban tools that are often necessary to it—specifically, the automated collection of public information and the creation of temporary research accounts. Automated collection allows journalists and researchers to generate statistical insights into patterns, trends, and information flows on Facebook’s platform. Temporary research accounts allow journalists and researchers to assess how the platform responds to different profiles and prompts.

Journalists and researchers who use these tools in violation of Facebook’s terms of service risk serious consequences. Their accounts may be suspended or disabled. They risk legal liability for breach of contract. The Department of Justice and Facebook have both at times interpreted the Computer Fraud and Abuse Act to prohibit violations of a website’s terms of service.3

3 See, e.g., Sandvig v. Sessions, No. CV 16-1368-JDB, 2018 WL 1568881, at *11 (D.D.C. Mar. 30, 2018) (“Here, the government has implicitly—and in past prosecutions, explicitly—read the [CFAA] to include ToS violations.”); Pl.’s Opp’n to Mot. to Dismiss at 6–7, Facebook, Inc. v. Maxbounty, Inc., No. 10-CV-4712-JF (N.D. Cal. Sept. 14, 2011); Pl.’s Opp’n to Mot. to Dismiss at 6–7, Facebook, Inc. v. Power Ventures, Inc., No. 08-CV-05780 (N.D. Cal. Apr. 17, 2009).

We are unaware of any case in which Facebook has brought legal action against a journalist or researcher for a violation of its terms of service. In multiple instances, however, Facebook has instructed journalists or researchers to discontinue important investigative projects, claiming that the projects violate Facebook’s terms of service. As you undoubtedly appreciate, the mere possibility of legal action has a significant chilling effect.4 We have spoken to a number of journalists and researchers who have modified their investigations to avoid violating Facebook’s terms of service, even though doing so made their work less valuable to the public. In some cases, the fear of liability led them to abandon projects altogether.5

4 See, e.g., Sandvig v. Sessions, 2018 WL 1568881, at *3 (noting that researchers have been chilled from testing whether platforms’ algorithms violate anti-discrimination laws).

5 To the extent that the CFAA bars journalists and researchers from undertaking the kinds of investigations described here, the statute violates the First Amendment. See, e.g., Victoria Baranetsky, Data Journalism and the Law, Tow Center for Digital Journalism, August 2018 (forthcoming); Esha Bhandari & Rachel Goodman, ACLU Challenges Computer Crimes Law That Is Thwarting Research on Discrimination Online, ACLU, June 29, 2016. Enforcing Facebook’s terms of service against journalists and researchers engaged in this work would also raise serious public policy concerns.

It is against this background that we propose that Facebook amend its terms of service to establish a safe harbor for certain public-interest journalism and research. We understand that, in the wake of revelations concerning the Cambridge Analytica scandal, Facebook is facing new pressure to protect the data that users entrust to it. This pressure is warranted and indeed overdue. Addressing legitimate privacy concerns, however, need not entail the obstruction of public-interest journalism and research.
Moreover, it is important to remember that journalism and research often directly serve the interests of Facebook’s users by exposing privacy abuses and security risks.

We appreciate that Facebook has recently announced new transparency initiatives, including a partnership with the Social Science Research Council (SSRC). These efforts are laudable, but they are also insufficient. The SSRC initiative, for example, is focused on peer-reviewed research rather than journalism, and limited, at least initially, to the impact of social media on democracy and elections. Facebook’s adoption of our proposed amendment would be an important supplement to the SSRC initiative and to other initiatives Facebook has already announced.

The safe harbor we envision would permit journalists and researchers to conduct public-interest investigations while protecting the privacy of Facebook’s users and the integrity of Facebook’s platform. Specifically, it would provide that an individual does not violate Facebook’s terms of service by collecting publicly available data by automated means, or by creating and using temporary research accounts, as part of a news-gathering or research project, so long as the project meets certain conditions.

First, the purpose of the project must be to inform the general public about matters of public concern. Projects designed to inform the public about issues like echo chambers, misinformation, and discrimination would satisfy this condition. Projects designed to facilitate commercial data aggregation and targeted advertising would not.

Second, the project must protect Facebook’s users. Those who wish to take advantage of the safe harbor must take reasonable measures to protect user privacy. They must store data obtained from the platform securely. They must not use it for any purpose other than to inform the general public about matters of public concern. They must not sell it, license it, or transfer it to, for example, a data aggregator. And they must not disclose any information that would readily identify a user without the user’s consent, unless the public interest in disclosure would clearly outweigh the user’s interest in privacy. Projects involving the use of temporary research accounts must incorporate reasonable measures to avoid misleading human Facebook users.
This means, for example, that temporary research accounts should generally be identified as such in the accounts’ public profiles.

Finally, the project must not interfere with the proper working or appearance of Facebook. The safe harbor is intended for investigations of Facebook’s platform, not efforts to disrupt it. Individuals can take advantage of the safe harbor only if their projects do not disable, overburden, or significantly impair Facebook’s proper functioning.

We have drafted and attached a proposed amendment to Facebook’s terms of service that would give effect to this design. We emphasize that the proposed amendment is intended as a proof of concept. Some of its terms may require further definition. We also expect that, if Facebook were to adopt the amendment, Facebook would refine the amendment’s language over time to clarify the scope of the safe harbor and limit any abuses of it. We have proposed specific language as a starting point in the hope that the specificity of the proposal will help clarify the concept behind it. If you are sympathetic to the concept, we would be willing to work with Facebook’s legal and policy teams to develop and refine the amendment’s specific language.

* * *

For the reasons alluded to above, there is some urgency to our proposal. We are sending this letter on behalf of journalists and researchers who would like to undertake investigations that are in the public interest but are barred by Facebook’s terms of service. Some of these investigations are time-sensitive, as they involve, for example, the upcoming elections in the United States. Facebook’s establishment of the safe harbor would allow these journalists and researchers to undertake their investigations without fear of legal sanction. As indicated above, we would appreciate your response to this letter by September 7, 2018.

Thank you for your consideration.

Sincerely,

Jameel Jaffer
Alex Abdo
Ramya Krishnan
Carrie DeCell

Knight First Amendment Institute at Columbia University
475 Riverside Drive, Suite 302
New York, NY 10115
jameel.jaffer@knightcolumbia.org
(646) 745-8500

cc: Sheryl Sandberg, Chief Operating Officer
    Colin Stretch, General Counsel
    Monika Bickert, Head of Global Policy Management
    Stephen Satterfield, Director of Privacy and Public Policy

Knight Institute proposed safe harbor

1. Safe harbor for certain news-gathering and research projects: Notwithstanding anything else in the Terms, you do not violate the Terms by collecting public information through automated means, or by creating or using temporary research accounts, as part of a news-gathering or research project, so long as you meet all of the following conditions.

   1. The purpose of your project is to inform the general public about matters of public concern.

   2. With respect to data that is collected through automated means, you take the following measures to protect the privacy of Facebook users:

      § You take reasonable measures to prevent the theft or accidental disclosure of the data.

      § You do not use the data for any purpose other than to inform the general public about matters of public concern. For example, you do not:

         • Sell or license the data.

         • Directly or indirectly transfer any data (including anonymous, aggregate, or derived data) to any ad network, data broker, or other advertising or monetization-related service.
      § You do not publish or otherwise disclose any data that readily identifies a Facebook user without that user’s consent unless the public interest in disclosure clearly outweighs the privacy interest of the user (for example, where the data concerns a public figure or shows that the user is engaged in serious unlawful activity).

   3. If your project entails the creation of a temporary research account, you take reasonable measures to avoid misleading human Facebook users through that account. A temporary research account should generally be identified as such in the account’s public profile.

   4. You do not do anything that could disable, overburden, or significantly impair the proper working or appearance of Facebook.