Subject: Why Rightist Drivel & Lies Must Be Censored On Social Media
From: RichA
Newsgroups: alt.fan.rush-limbaugh, talk.politics.guns, alt.atheism
Organization: To protect and to server
Date: Sun, 20 Oct 2024 20:05 UTC

“Right-wing bias”: A macro study confirms that Facebook disinformation is
consumed by conservatives
According to an analysis of data from 208 million US citizens, which Meta
allowed a team of academics to access, 97% of fake news is seen by right-
wing users. The investigation did not find clear links between social
media and political polarization.
Facebook is a network dominated by conservative news, and its right-wing
users are the ones who overwhelmingly consume information labeled as
false. The data confirming these two hypotheses comes from academic
research that had unprecedented access to internal Facebook data provided
by Meta. These findings are based on the aggregated activity of 208
million U.S. users over several months around the 2020 US elections. The
study, led by Spanish researcher Sandra González-Bailón of the University
of Pennsylvania, is part of a series of four papers analyzing the impact
of Meta's social media platforms on political polarization, which were
published Thursday in the journals Science and Nature.

“I didn't expect to find some of the results we found, with such radical
patterns,” González-Bailón tells EL PAÍS via videoconference from
Philadelphia. “But this is what the data say,” she adds. The article
examines how the combination of user behavior and the social media
website's algorithm segregates information consumption between
progressives and conservatives. Although these two groups exist, they are
not symmetrical, as previously believed: “Audiences who consume political
news on Facebook are, in general, right-leaning,” the article says. But
the most striking figure is the difference in the reach of news labeled
as fake by Meta's fact-checkers (which accounts for only 3% of the total
number of links shared on Facebook): 97% of those links circulate among
conservative users.

“It is true that it is the most controversial article,” University of
Konstanz (Germany) Professor David García admits to EL PAÍS; he had
access to the embargoed pieces to write a brief commentary in Nature.
“But it is very important. The evidence we had [before] was not as solid.
There was a 2015 study, but it had problems. They got it right, as we all
would have wanted to do.”

The entire investigation's impact goes beyond that: “It's not that much
of a surprise. Facebook is more conservative. But what is impressive is
that someone was able to verify it from outside Facebook with access to
[the company's] internal data, although the results are not very
unflattering to Facebook,” he explains, referring primarily to the
other three investigations published at the same time. That research
analyzes the problems with algorithmic timelines (feeds) on Instagram and
Facebook, the risks of virality and sharing posts, and content received
from ideologically like-minded people. None of the three has found clear
results that point to easy solutions or culprits.
A complicated answer

For years, experts, technologists and academics have been trying to
understand how social media affects our societies. In little more than a
decade, the way we inform ourselves has changed dramatically: that must
have consequences, but what are they? Although these articles try to
answer that question, it is not easy to create a parallel world to
compare and see where we would be today without Facebook, Twitter (X as
of this Monday) and YouTube. “These findings can't tell us what the world
would have been like if we hadn't had social media for the last 10 to 15
years,” Joshua Tucker, a professor at New York University and one of the
academic leaders of the project, admitted at a virtual press conference.

“We can't decouple the algorithmic from the social.”
Sandra González-Bailón, University of Pennsylvania

“The question of whether social media is destroying democracy is very
complicated. It's a puzzle and each of these articles is a piece [of
it],” says González-Bailón. These four articles are just the first of a
total of 16, which will continue to be published in the coming months and
are additional pieces of that huge puzzle. The project stems from an
August 2020 agreement between Meta and two professors, who then selected
the rest of the researchers. “I have never been part of a project where
the standards of analytical rigor and fact- and code-checking have been
so robust, and, therefore, a project [that ensures] that quality control
and the results are genuine,” adds González-Bailón. The authors include
academics, who are completely independent of Meta, as well as Meta
employees. Meta's internal leader for this investigation is Spanish
researcher Pablo Barberá.
What if we removed the algorithm?

The other three articles looked at what would happen on Facebook and
Instagram if three features often blamed for political polarization and
the creation of filter bubbles were changed. The researchers received
permission from 20,000 participants to change their timeline content and
then compared them to a control group. The experiments were conducted
for three months, between September and December 2020, around the time
of the election in which Joe Biden won the presidency. While the numbers
may seem small, the researchers highlighted both the sample size and the
duration of the experiment as unusual and quite valuable.

The first of these papers measures the impact of replacing Facebook's and
Instagram's algorithms (the order of what we see on our screen) with a
simple chronological order: the last post published is the first we see,
and so on; the algorithm placing the most “interesting” items at the top
has been removed. That is an obvious way to measure whether the famous
“algorithm” is confusing us. For example, it looks at whether seeing the
most extreme political content most often (because it provokes more
interest than moderate content) polarizes us. The result is that it
hardly affects polarization or users' political knowledge.
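
As a rough sketch of the change being tested (the field names and
engagement score below are hypothetical, not Meta's actual ranking
system), the experiment amounts to swapping the feed's sort key from a
predicted-engagement score to simple recency:

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    posted_at: datetime
    predicted_engagement: float  # hypothetical score from a ranking model

def algorithmic_feed(posts):
    # Default condition: items a model predicts as most "interesting" first.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts):
    # Experimental condition: newest post first, no relevance model.
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)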

That doesn�t mean that the change doesn�t have other consequences. The
reduction of algorithmic content causes users to spend less time on the
two Meta networks, presumably because the content is more boring or
repetitive; that prompts them to go to TikTok or YouTube. In addition,
users with chronological content saw more items from unreliable and
political sources.
Avoiding shared stories

In another article, the authors removed some of the shared content on
Facebook. The intent was to reduce the importance of virality. Again,
there were no substantial changes, but there were “unexpected
consequences” that were difficult to foresee, according to Andrew Guess,
the Princeton University professor who led the study: “People became
less able to distinguish between things that happened last week and
things that didn't. Why? Most of the news people get about politics on
Facebook comes from sharing, and when you remove those posts, they see
less virality-prone and potentially misleading content, but they also
see less content from trusted sources,” Guess adds. The change decreased
users' awareness of current events without affecting other variables,
which does not seem to be a positive change.
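
A minimal sketch of this second manipulation, assuming a hypothetical
is_reshare flag on each feed item (the study's real pipeline is not
public):

def remove_reshares(feed):
    # Illustrative only: drop items that reached the user because someone
    # else reshared them, keeping only original posts.
    return [item for item in feed if not item.get("is_reshare", False)]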

“Users have their own initiative and their behavior is not completely
determined by algorithms.”
David García, University of Konstanz

The third paper, the only one published in Nature, tries to reduce the
presence of content from ideologically like-minded users. Again, the
results of the study do not reveal substantial changes, but users whose
like-minded content was cut off ended up interacting more with the
content they did see: “Users found other ways to read like-minded
content, for example, through groups and channels or by scrolling down in
the Facebook feed. This shows that users have their own initiative and
that their behavior is not completely determined by algorithms,” García
writes. While the researchers rule out what they call “extreme echo
chambers,” they did observe that 20% of Facebook users receive 75% of
their content from like-minded accounts. Reducing that like-minded
content, the authors write, does not cause substantial variations in
polarization or ideology.
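
That 20%/75% figure is just a per-user fraction; a toy illustration
(with hypothetical ideology labels, not the paper's actual
classification) might look like this:

def like_minded_share(feed, user_leaning):
    # Fraction of feed items whose source shares the user's leaning
    # (labels here are hypothetical, not the paper's classification).
    if not feed:
        return 0.0
    congruent = sum(1 for item in feed if item["source_leaning"] == user_leaning)
    return congruent / len(feed)

# Example: 3 of 4 items from like-minded sources -> 0.75
feed = [{"source_leaning": "right"}] * 3 + [{"source_leaning": "left"}]
print(like_minded_share(feed, "right"))  # 0.75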
How to separate life from social media?

There are several potential problems with these studies: the responses
were self-reported, and the period and country they examined may have
led to a result that cannot be duplicated as-is in other circumstances.
The most obvious conclusion? It is difficult to isolate and measure a
phenomenon with as many ramifications as political polarization,
although the research does show that tinkering with social media does
not change its impact, positive or negative, on democracy.

Even in the González-Bailón-led paper, which has the strongest evidence
and focuses on observing the entire network, scholars detect that
liberals and conservatives consume different information diets. But they
don't know whether that diet is caused by the algorithm or by the
individual's prior opinions. “Our paper's major contribution is that we
used everything that happened on the platform, focusing on political news
links, and that's a very strong point,” González-Bailón says. “But we
can't decouple the algorithmic from the social. Ultimately, algorithms
learn from user behavior, and that's the loop we can't quite break,” she
adds.

