Links for July 2023

Did Facebook fuel political polarization during the 2020 election? It’s complicated.

Over the last several years, there have been growing concerns about the influence of social media on fostering political polarization in the US, with critical implications for democracy. But it's unclear whether our online "echo chambers" are the driving factor behind that polarization or whether social media merely reflects (and arguably amplifies) divisions that already exist. Several intervention strategies have been proposed to reduce polarization and the spread of misinformation on social media, but it's equally unclear how effective they would be at addressing the problem.

The US 2020 Facebook and Instagram Election Study is a joint collaboration between a group of independent external academics from several institutions and Meta, the parent company of Facebook and Instagram. The project is designed to explore these and other relevant questions about the role of social media in democracy within the context of the 2020 US election. It's also a first in terms of the degree of transparency and independence that Meta has granted to academic researchers. Now we have the first results from this unusual collaboration, detailed in four separate papers—the first round of over a dozen studies stemming from the project.

Three of the papers were published in a special issue of the journal Science. The first paper investigated how exposure to political news content on Facebook was segregated ideologically. The second paper delved into the effects of a reverse chronological feed as opposed to an algorithmic one. The third paper examined the effects of exposure to reshared content on Facebook. And the fourth paper, published in Nature, explored the extent to which social media "echo chambers" contribute to increased polarization and hostility.

"We find that algorithms are extremely influential in people's on-platform experiences, and there is significant ideological segregation in political news exposure," Natalie Jomini Stroud of the University of Texas at Austin—co-academic research lead for the project, along with New York University's Joshua Tucker—said during a press briefing. "We also find that popular proposals to change social media algorithms did not sway political attitudes."

Ideological segregation

Let's start with the question of whether or not Facebook enables more ideological segregation in users' consumption of political news. Sandra González-Bailón of the University of Pennsylvania and her co-authors looked at the behavior of 208 million Facebook users between September 2020 and February 2021. For privacy reasons, they did not look at individual-level data, per González-Bailón, focusing only on aggregated measures of audience behavior and audience composition; for example, the URLs they analyzed had each been posted by users more than 100 times.

The results: conservatives and liberals do indeed see and engage with different sets of political news, showing strong ideological separation. That segregation is even more pronounced when political news is posted by pages or groups as opposed to individuals. "In other words, pages and groups contribute much more to segregation than users," said González-Bailón. Furthermore, politically conservative users are much more segregated and are exposed to much more of the political news on Facebook than liberal users; there were far more political news URLs seen exclusively by conservatives than URLs seen exclusively by liberals.

Finally, the vast majority of the political news that Meta's third-party fact-checking program rated as false was viewed by conservatives rather than liberals. That said, content rated false amounted to a mere 0.2 percent, on average, of the full volume of content on Facebook. And political news in general accounts for just 3 percent of all posts shared on Facebook, so it's not even remotely the most popular type of content. "This segregation is the result of a complex interaction between algorithmic forms of curation and social forms of curation, and these feedback loops are very difficult to disentangle with observational data," said González-Bailón of the study's findings.

Chronological vs. algorithmic feeds

To study the impact of replacing Meta's default feed ranking algorithms with chronological feeds, Andrew Guess of Princeton University and his co-authors recruited over 23,000 consenting Facebook users and more than 21,000 consenting Instagram users and divided them into two groups: treatment, where the feed was changed to a reverse chronological format for three months, and control, where the default algorithms remained in place. "This has been specifically proposed by policymakers and advocates as a way to reduce the amplification of harmful content, among other things," said Guess, and the study was designed to determine the effectiveness of such intervention strategies.

The team found that users who experienced a reverse chronological feed spent less time on the platforms, reducing their activity by about 20 percent on Facebook and 11 or 12 percent on Instagram. "In some cases, this led people to use other platforms instead," said Guess. "For instance, those on Instagram who got the chronological feed spent more time on TikTok or YouTube. Engagement on the platforms also generally went down relative to the default algorithm."

They also found that, with a chronological feed, Facebook users saw more posts from groups and pages relative to their friends. Those users were also exposed to significantly less content from like-minded sources on Facebook and to more political news and content than with the default algorithmic feed. Guess noted that the shares of Facebook feed content that came from untrustworthy sources (2.6 percent), were uncivil (3.2 percent), or contained slurs (0.2 percent) were already quite low—and even lower on Instagram.

But switching to a reverse chronological feed actually more than doubled Facebook users' exposure to untrustworthy sources, and that exposure increased by more than 20 percent on Instagram. Switching to a chronological feed also did not significantly alter users' levels of polarization, political knowledge, self-reported political behavior, or similar key attitudes, with one exception: on-platform political behavior, such as signing online petitions or mentioning political candidates in posts.

Share and reshare alike

Guess was also the lead author on the third study, which examined the visibility and effects of Facebook reshares and ran in parallel with the algorithm study. The researchers recruited 23,000 consenting participants on Facebook and removed all reshares from the feeds of a random sample of users for three months. They then compared outcomes for those users with outcomes for users in the control group.

The results: removing reshares substantially decreased the amount of political news and content from untrustworthy sources that users in the random sample saw in their feed. It reduced overall content from untrustworthy sources by about a third, and it decreased the number of clicks and reactions. In fact, removing reshares reduced the proportion of political content in general in people's feeds by nearly 20 percent and the proportion of political news by more than half, although, again, the overall percentage of political news was not that high to begin with (about 6.2 percent of all content for those in the control group).

Once again, the team found that the intervention—removing reshared content from feeds—did not significantly affect political polarization or individual political attitudes. And in what Guess described as a "novel and counterintuitive finding," the results showed that removing reshared content actually decreased users' news knowledge within that random sample. "In other words, people became worse at being able to distinguish between things that actually happened in the past week and things that didn't actually happen," said Guess. Why might this happen? "We think that most of the news people get about politics in their Facebook feeds comes from reshares, and when you take those posts out, they see less potentially misleading content, but also less content from trustworthy sources as well."

One can't necessarily conclude from this that all Facebook users are set in their political attitudes and resistant to change regardless of what kind of content shows up in their feeds, according to Guess. "The people coming into our experimental study were solidified in their political attitudes," he said, much like the American population as a whole over the past 20 years or so. "Our experimental sample was even more politically knowledgeable and politically engaged than average, and that would suggest potentially that these are people who would have somewhat more crystallized attitudes going in."

Slanted sources

The Nature paper was "rooted in concern about exposure to information from sources that are politically like-minded, what the public and pundits often refer to as echo chambers," said co-author Jaime Settle of William & Mary. "Some previous academic research had shown that echo chambers in web browsing were not as widespread or pernicious as many believe. And there were still several gaps in our knowledge about their existence on social media."

So the authors looked at the kinds of sources people are exposed to on Facebook to determine how slanted those sources are toward a given user's political leanings. They also conducted an intervention experiment with a subset of consenting users to determine what effect, if any, changing those sources would have on users' attitudes and opinions. They used a validated classifier to estimate individual users' political leanings, and other users, pages, and groups were then categorized as like-minded (sharing the user's political leaning), cross-cutting (on the other side), or neutral (in the middle).

The first part of the study looked at how often all adult users of Facebook were exposed to content from like-minded sources from June to September 2020. The second part involved 23,377 US adult Facebook users who consented to be part of the study. They were divided into two groups, treatment and control. The team then reduced content (of any type) from like-minded sources by one-third for the treatment group from September to December 2020 to see if this intervention had any impact.

The team found that the median Facebook user gets 50.4 percent of all content from like-minded sources, compared to 14.7 percent from cross-cutting sources; only 20.6 percent of users get more than three-quarters of their exposures from like-minded sources. As for the second component, the authors found that reducing exposure to like-minded sources by one-third also reduced exposure to uncivil and untrustworthy content. The intervention increased exposure to content from cross-cutting sources by 20.7 percent, but the increase was even greater for neutral sources (from 25.6 to 35.9 percent).

Furthermore, total engagement with content from like-minded sources decreased along with exposure in the experiment. However, the rate of engagement with content from like-minded sources increased when people saw relatively less of it.

"This is a reminder that it is difficult to override the psychological preferences that people have, in this case, for engaging with ideas with which they agree, just by algorithms alone," said Settle. "There's often this implicit assumption that what will replace that content is kind of the broccoli of the political information ecosystem, that somehow it's going to expose people to information that will help them better navigate the political world. But our study shows that's not really the case. I think it's a reflection of the active choices that people make about how to construct their networks: the people they choose to friend, the pages and groups they choose to engage with. So users are constrained by the connections they have made, to the extent that they are much more likely to establish connections with like-minded sources."

Perhaps the most relevant finding was that reducing exposure to like-minded sources by one-third for three months had no measurable effect on users' attitudes regarding politics, how polarized they felt with regard to the two major political parties, ideological extremity, or beliefs in false claims. Nor was there any evidence that some users were affected more than others based on such characteristics as age, gender, how long they'd had a Facebook account, political sophistication, and the like.

One might be tempted to conclude that this is a full exoneration of the most common criticisms of Facebook and other social media platforms. Brendan Nyhan of Dartmouth College, co-lead author of the Nature paper and a co-author on the other three papers, offered a more nuanced take.

“These findings do not mean that there is no reason to be concerned about social media in general or Facebook in particular,” Nyhan said. “There are many other concerns we could have about the ways social media platforms could contribute to extremism, even though exposure to content from like-minded sources did not seem to make people more polarized in the study we conducted. We need greater data transparency that enables further research into what’s happening on social media platforms and its impacts. We hope our evidence serves as the first piece of the puzzle and not the last.”

Not a viable future model?

Collaborating with a major corporation like Meta is bound to raise questions of scientific integrity. That's why the US 2020 Facebook and Instagram Election Study established several "integrity provisions" for this collaborative project with Meta, including giving the lead academic authors final control rights over the analyses and final text of the papers. Meta agreed not to block any of the results from the research—unless there were significant privacy concerns or other legal constraints, which wasn't the case with these four studies—and the independent scientists received no compensation from Meta.

The project also brought in an "independent rapporteur," Michael Wagner of the University of Wisconsin, to observe and document the entire process and publish a public comment giving his assessment of the project. While Wagner praised the overall success and trustworthiness of the project and its findings, he cautioned that it should not be a model for future collaborations between industry and academia.

There's a relevant existing history here, per Wagner: an earlier industry-academic partnership called Social Science One, which aimed to pioneer a model of collaboration between academia and industry by giving outside researchers access to Facebook data. Critics charged that there were major delays in Meta's data sharing for that project, ostensibly over privacy concerns, and researchers later found a major error in the Facebook data that Meta provided. Wagner reported that at least one funder described Social Science One as a "rope-a-dope," in the sense that "Meta strings researchers along with a promise for data that never comes, or comes with untenable compromises." And of course, the Cambridge Analytica scandal badly damaged Meta's reputation.

Meta's own researchers were keen to participate in a joint project with respected scholars involving significant scientific questions, approached with rigorous analysis, while maintaining users' privacy. Wagner was careful to emphasize that Facebook researchers are ethical professionals, but they are still corporate employees. He talked to a former Meta employee who cautioned that the internal researchers would answer academics' questions honestly but would not volunteer additional information—and without internal knowledge of how a social media platform operates, researchers might not ask the right questions.

Wagner emphasized that Meta "demonstrated a strong commitment to rigorous social scientific research," investing more than $20 million in the project and redirecting the time of several dozen employees to work on it, and there were no advance leaks of the study results. "But Meta set the agenda in ways that affected the overall independence of the researchers," Wagner wrote, namely in how the company framed the research questions and thereby influenced workflow choices.

And while the external researchers had final say over the papers, Meta co-authors asked, closer to the publication date, to be allowed to register their disagreement with some of the final interpretations. Stroud and Tucker denied the request, suggesting instead that those researchers could have their names removed as co-authors if they strongly disagreed with the findings, which is standard practice in academia.

"Scholarship is not wholly independent when the data are held by for-profit corporations, nor is it independent when the same corporations can limit the nature of what is studied," Wagner concluded. "For social science research about the effects of social media platforms and their use to be truly independent, outside academics must not be wholly reliant on the collaborating industry for study funding; must have access to the raw data that animates analyses; must be able to learn from internal platform sources about how the platforms operate; and must be able to guide the prioritization of workflow." He suggested that funding partnerships, or perhaps government regulation and data-sharing requirements—akin to the European Union's Digital Services Act—would be helpful for future scholarship.

DOI: S. González-Bailón et al., "Asymmetric ideological segregation in exposure to political news on Facebook," Science, 2023. 10.1126/science.ade7138

DOI: A.M. Guess et al., "How do social media feed algorithms affect attitudes and behavior in an election campaign?" Science, 2023. 10.1126/science.abp9364  (About DOIs).

DOI: A.M. Guess et al., "Reshares on social media amplify political news but do not detectably affect beliefs or opinions," Science, 2023. 10.1126/science.add8424  (About DOIs).

DOI: B. Nyhan et al., "Like-minded sources on Facebook are prevalent but not polarizing," Nature, 2023. 10.1038/s41586-023-06297-w

25 km/h — 7/10

Another German road movie, this time about fulfilling childhood dreams. For these two brothers it wasn't too late, even after a couple of decades, though some of the results were odd. On the other hand, some other, unexpected outcomes were quite good! The director nailed the feel-good mood but couldn't avoid a few rude, even macabre jokes. Still, I think the skill of German filmmakers keeps improving.

Dungeons & Dragons: Honor Among Thieves — 7/10

A lively and giddy action movie that takes a few ideas from D&D. Good humor, memorable characters, and a captivating plot go well with creative action scenes. And it doesn't take itself too seriously!

https://inoy-dmitriy.livejournal.com/61213.html

After the mutiny of June 24, I realized that I am an agnostic not only in religion but also in politics.

Don't fear the future; it is not the present.

← previous month