The media fervor that followed special counsel Robert Mueller’s indictment of a Russian internet organization for intentionally sowing discord in the US political system has missed one crucial implication: We still know far too little about the potential impact of social media on individuals and society.
Catherine Brooks (@catfbrooks) is an associate professor of information at the University of Arizona, where she is also the associate director of the School of Information and founding director of the Center for Digital Society and Data Studies. She is a Public Voices fellow with the Op Ed Project.
Ironically, this problem could be easily addressed. If Facebook truly embraces the vision, expressed in January by co-founder and CEO Mark Zuckerberg, to make products that are “good for people’s well being,” the company needs to provide academic researchers with far broader access to its data.
Zuckerberg has voiced his concern about recent findings that social media causes social harm. He has announced that Facebook aims to tweak its search algorithm again, this time so that users experience the most “meaningful interactions” with friends and family, instead of seeing the most “meaningful content”—that is, news (and sometimes fake news).
Facebook, however, continues to resist what may be the best way to learn more about meaningful interactions, digital propaganda, and aspects of social media that could be good for people: sharing its vast amount of anonymized data with a broad set of academic researchers.
Currently, the company shares data with a select few, which limits society’s ability to analyze and understand online behaviors related to elections, mass demonstrations, political attitudes, cyberbullying, identity theft, and much more. As it stands, scholars must depend on sometimes-awkward workarounds (user surveys and algorithm audits, for example) to study Facebook’s societal impact.
Social scientists want to know why stories go viral, who holds political influence, what shapes political and social attitudes, and whether social media can change those attitudes. We want to understand echo chambers and fake news, and why users join groups like ISIS or domestic white supremacy organizations online. Access to the company’s enormous data sets for research purposes would offer unprecedented opportunities to understand more about online human interaction and behavior.
Such broadened access for scholars would vastly expand public knowledge and understanding about our own behavior (e.g., the online mechanisms for political polarization, impediments to civil discourse, mass cyberbullying activities that target vulnerable youth, and propaganda types that sometimes lead to fake information going viral) in this increasingly digital society.
To be fair, Facebook does share data with a select few scholars. For instance, Facebook is providing data to Stanford economist Raj Chetty, a Silicon Valley insider and “a favorite among tech elites.” Though Chetty’s research on inequality is valuable, those of us working on public campuses across the United States and around the globe have scholarly questions just as Chetty does. Confining research to a very select few, and within the Silicon Valley circle, severely restricts the kinds of questions and analyses such data could inform.
It’s understandable that Facebook would hesitate to share individuals’ data. Privacy and security are serious and growing concerns for users and tech companies alike. Privacy advocates are right to remain focused on protecting user data across platforms like Facebook. But data can be anonymized to protect distinct users—those identities are not necessary for the study of trends and behaviors.
Some social media companies are already finding creative, safe ways to share data. LinkedIn, for example, launched the Economic Graph Research program in 2014. This ongoing program invites scholars to submit proposals for using LinkedIn data to generate new insights. Although programs like these may also concern privacy experts, the data shared are stripped of user identities before being handed to researchers, and they can illuminate hiring trends, skills gaps, gender differences in organizational advancement, and the impacts of professional networks on employees.
Society is changing rapidly, online and off. Facebook has the power to help us understand those changes and make better choices for the common good. Stockholders, policymakers, scholars, and Facebook users alike should lean on the company and ask for broader access to what is likely the largest social media data collection in the world.
WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints. Read more opinions here.