Asymmetric ideological segregation


IsntLifeFunny


I'm not sure whether this article has been discussed here (it's probably been mentioned), but it's a truly interesting read. It's about social media. I found it while using ChatGPT, thinking about how its algorithm is asymmetrical in its responses depending on the user, and how dangerous that is, which led me to this Science paper where researchers played it out in real time with Facebook and its algorithm.

The results shouldn't shock anyone, but it's fascinating to watch it laid out from a strictly scientific standpoint. This is evidence that the Right deliberately insulates and silos itself from the truth. Social media and AI firms fully understand the phenomenon at this point, and their entire business model is designed around it. The amplification of disinformation plays out asymmetrically, concentrated uniquely among Republican audiences.

 

Our analyses show that both algorithmic and social amplification play a part in increasing ideological segregation. Algorithmic amplification refers to data-driven automated processes that result in some content being more visible in users’ feeds; social amplification refers to choices made by users that also grant more visibility to specific content through sharing and reposting. We show that these processes operate asymmetrically across the US political “right” (conservatives or Republican Party) and the political “left” (liberals or Democratic Party), with the presence of much more homogeneous news consumption on the right—a pattern that has no parallel on the left...

 

Finally, our results uncover the clearly asymmetric nature of political news segregation on Facebook—the right side of the distributions for potential, actual, and engaged audiences looks robustly different from the left side. Thus, although there are homogeneously liberal and conservative domains and URLs, there are far more homogeneously conservative domains and URLs circulating on Facebook. This asymmetry is consistent with what has been found in other social media platforms (24–26). We also observe on the right a far larger share of the content labeled as false by Meta’s 3PFC. Overall, these patterns are part of a broader set of long-standing changes associated with the fracturing of the national news ecosystem, ranging from Fox News to talk radio, but they are also a manifestation of how Pages and Groups provide a very powerful curation and dissemination machine that is used especially effectively by sources with predominantly conservative audiences (14).

https://www.science.org/doi/10.1126/science.ade7138


 


  • 2 months later...
Here's the abstract of the study:

 

Feed algorithms are widely suspected to influence political attitudes. However, previous evidence from switching off the algorithm on Meta platforms found no political effects1. Here we present results from a 2023 field experiment on Elon Musk’s platform X shedding light on this puzzle. We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks, measuring political attitudes and online behaviour. Switching from a chronological to an algorithmic feed increased engagement and shifted political opinion towards more conservative positions, particularly regarding policy priorities, perceptions of criminal investigations into Donald Trump and views on the war in Ukraine. In contrast, switching from the algorithmic to the chronological feed had no comparable effects. Neither switching the algorithm on nor switching it off significantly affected affective polarization or self-reported partisanship. To investigate the mechanism, we analysed users’ feed content and behaviour. We found that the algorithm promotes conservative content and demotes posts by traditional media. Exposure to algorithmic content leads users to follow conservative political activist accounts, which they continue to follow even after switching off the algorithm, helping explain the asymmetry in effects. These results suggest that initial exposure to X’s algorithm has persistent effects on users’ current political attitudes and account-following behaviour, even in the absence of a detectable effect on partisanship.

 

https://www.nature.com/articles/s41586-026-10098-2

 

