No simple fix for American democracy... but then, whoever said there would be, or could be?
Three new papers in Science and one in Nature are the first products of a rare, intense collaboration between Meta, the company behind Facebook and Instagram, and academic scientists.
Tweaking Facebook feeds is no easy fix for polarization, studies find
Facebook and Instagram users’ political views remain steady even after short-term changes to platforms’ algorithms during the 2020 US presidential election.
Supporters of President Trump rally in New York City during his re-election campaign, when researchers tested social-media policies aimed at lessening political polarization. Credit: Stephanie Keith/Getty
Landmark research suggests that tweaking how people access news and other content on social-media platforms — to reduce the echo-chamber effect — doesn’t necessarily change their political opinions, knowledge or behaviour. The findings are the work of dozens of scholars who were given unprecedented access to an extensive trove of user data from Facebook and Instagram; both platforms are part of Meta (formerly Facebook), based in Menlo Park, California. With the company’s cooperation, the researchers also conducted multiple experiments that altered how tens of thousands of people received and shared political news and other information. The first results were published today in four papers [1–4] in Science and Nature.
Responding to calls for regulations requiring access to data, Meta said in a statement to Nature that the company is committed to “further transparency”, but privacy obligations to its users prevent the company from making raw data available to external researchers. Meta said that it hopes that the results of the research will “help policymakers as they shape the rules of the road for the internet — for the benefit of our democracy, and society as a whole”.
doi: https://doi.org/10.1038/d41586-023-02420-z
So Maybe Facebook Didn’t Ruin Politics
Large-scale experiments on social media, run behind the scenes during the 2020 election, suggest there is no simple fix for American democracy.
“Democracy intercepted,” reads the headline of a new special package in the journal Science. “Did platform feeds sow the seeds of deep divisions during the 2020 US presidential election?” Big question. (Scary question!)
The surprising answer, according to a group of studies out today in Science and Nature, two of the world’s most prestigious research journals, turns out to be something like: “Probably not, or not in any short-term way, but one can never really know for sure.” There’s no question that the American political landscape is polarized, and that it has become much more so in the past few decades.
It seems both logical and obvious that the internet has played some role in this—conspiracy theories and bad information spread far more easily today than they did before social media, and we’re not yet three years out from an insurrection that was partly planned using Facebook-created tools.
The anecdotal evidence speaks volumes. But the best science that we have right now conveys a somewhat different message.
Research partnership to understand Facebook and Instagram’s role in the U.S. 2020 election
___________________________________________________________________________________
For more about this initiative, read Facebook’s Newsroom post.
A Proposal for Understanding Social Media’s Impact on Elections: Rigorous, Peer-Reviewed Scientific Research
By Talia Stroud (UT Austin), Joshua A. Tucker (New York University), Annie Franco (Facebook) and Chad P. Kiewiet de Jonge (Facebook)
In the aftermath of the 2016 US Presidential election, the country was rocked by one troubling revelation after another about social media and new threats to democracy: the prevalence of false news, the activity of Russian trolls, and the Cambridge Analytica scandal. At the center of the world of social media stood Facebook, and speculation abounded about Facebook’s impact on the election and the quality of democracy in general. Academic research on how social media generally, and Facebook specifically, affect democracy and elections has proliferated since 2016. Yet two critical problems have prevented as much progress as we would have liked. First, increasing public concern and legal obligations related to data privacy led social media companies to restrict access to data previously used by external researchers. Second, it is hard to conduct a rigorous scientific study of social media’s impact after the fact.
With the US 2020 election upon us, it is clear we need a new approach. For this election, we are integrating Facebook researchers’ intimate knowledge of, and access to, internal Facebook data with the expertise of independent scholars, so that these academics can ensure that the research asks the right questions and that the questions are answered with the right methods.
With these goals in mind, Facebook today announced a new project that is bringing together a team of approximately two dozen Facebook researchers and outside scholars to study Facebook’s and Instagram’s impact on four key outcomes that have dominated public and academic attention: political participation, political polarization, knowledge and misperceptions, and trust in US democratic institutions. We are four of the team leaders of this process — two Facebook researchers and two outside academics — and we are participating in this project because we believe it is absolutely crucial that there be rigorous scientific assessments of Facebook’s and Instagram’s roles that are available to the public, journalists, and policy makers alike. Such a large-scale research partnership with Facebook is a new undertaking, and one that we hope may serve as a model for others in the future. To guide the effort in a way that ensures ethical practices, transparency, and scientific rigor, we established the following ground rules:
Facebook will not have any right of pre-publication approval, and will only be entitled to check that papers do not violate legal or privacy obligations.
Facebook did not select the academic team members. The current academic team has been independently formed by those of us who chair the North America (Stroud) and Electoral Integrity (Tucker) advisory committees of Social Science One.
All papers proposed by the academic teams will have academics as lead authors, with final say over the text of the papers.
Although Facebook is covering the costs associated with running the study itself (e.g. paying the survey vendor), none of the academic team will be financially compensated for their participation in the project.
To address privacy obligations, only Facebook employees will be able to “touch” the raw data, but both teams will work together to devise appropriate monitoring systems for assuring the scientific integrity of the research.
Transparency in the research process is paramount, and includes:
- A rapporteur with access to all researchers who will document the research process
- Pre-registration of study plans before data collection, made public simultaneously with the publication of results
- Explicit informed consent of study participants whose individual-level data will be collected and analyzed
- Listing all researchers who have contributed to the project as co-authors on papers, with clear delineation of their contribution to that paper
- An attempt to publish all pre-registered study designs in peer-reviewed journals; those that cannot be published will be posted in public scholarly archives
- Processes to provide journals publishing the papers with the ability to replicate the analyses in the papers
- Publication of all papers accepted by peer-reviewed journals in an Open Access format, which means they will be freely available to the public
- As much of the data generated from the project as possible (given privacy limitations) made available to scholars for further study in the future
Given the crucial importance of understanding Facebook’s and Instagram’s influence on the democratic process in our country, we have committed ourselves to providing the public and policy makers with scientifically sound answers to critical questions about these platforms’ impact. We know everyone will be anxious to see the results. As we will need time to analyze the data, we anticipate that findings will be ready to be shared in the summer of 2021 at the earliest. We hope that the effort will be judged as worthwhile once the studies, designed to make sense of what has been a remarkable transformation of the political process in the digital information age, are complete.
A new research partnership between Facebook and a group of independent external researchers will study the impact of Facebook and Instagram on key political attitudes and behaviors during the U.S. 2020 elections.
Building on an initiative launched in 2018, this research aims to give us new insights and perspectives into some of the most important questions around the role of social media in democracy today:
Does social media make us more polarized as a society, or merely reflect the divisions that already exist?
Does it help people to become better informed about politics, or less?
And how does social media affect people’s attitudes towards government and democracy?
Externally, the project is led by Professors Talia Stroud at the University of Texas at Austin and Joshua Tucker at New York University.
They selected 15 additional academic researchers to collaborate on this effort, based on their expertise. The initial phase of the study begins in the summer of 2020 and ends in December. The research teams expect to publish their findings in mid-2021.
The research team is approaching this work with three central principles: independence, transparency, and consent.
Independence: The external researchers won’t be paid by Facebook and they won’t answer to Facebook either. Neither the questions they’ve asked nor the conclusions they draw will be restricted by Facebook. We’ve signed the same contracts with them that we do with other independent researchers who use our data (which is publicly posted on Social Science One’s website).
Transparency: The researchers have committed to publish their findings in academic journals in open access format, which means they will be freely available to the public. Facebook and the researchers will also document study plans and hypotheses in advance through a pre-registration process and release those initial commitments upon publication of the studies. This means that people will be able to check that we did what we said we would — and didn’t hide any of the results. In addition, we plan to deliver de-identified data from the studies we run so that others can perform their own analyses and further check our homework. We have also invited Michael Wagner, a professor at the University of Wisconsin, to document and publicly comment on our research process as an independent observer.
Consent: We are asking for explicit, informed consent from those who opt to be part of research that analyzes individual-level data. This means research participants will confirm both the use of their data and that they understand how and why their data will be used. Additionally, as part of our studies, we will also analyze aggregated user data on Facebook and Instagram to help us understand patterns. The studies — and our consent language — were also reviewed and approved by an Institutional Review Board (IRB) to ensure they adhere to high ethical standards.