
Facebook’s filter bubble. A new instrument of power and conditioning.

Giuseppe Sandro Mela.

2017-02-08.


If one recalls how astonishing it is to read the correspondence between Cardinal Cusanus and Pope Paul II, what follows will likely be easier to understand. Cusanus had brought from Germany Conrad Schweynheym and Arnold Pannartz, two collaborators of Gutenberg, who printed in a run of 275 copies the Donatus Minor, the De Oratione and the De Civitate Dei: he was enthusiastic about the result and reported on it at length to the Holy Father. The Pope, a holy but very practical man, saw in printing instead all its potential dangers, and he saw far ahead: Luther’s Reformation would hardly have been possible without the aid of that new medium of mass communication.

*

The Lutheran experience rooted in people’s minds, for centuries, the belief that control of the printing press amounted to control of the people. Hence control over print shops, over journalists, and over everything that got printed.

The first great blow to this belief came under the Soviet regime. Although every means of communication was subservient to the ruling power, ordinary people gave them no credit whatsoever.

In short, relentless propaganda, substantially false and contradictory, had produced the opposite of its intended effect. We owe to Suslov the first systematic treatments of disinformacija, but fortunately for humanity politicians did not consider studying his treatises on the subject time well spent.

*

In recent years we have witnessed, mutatis mutandis, a repetition of what happened in the Soviet Union.

The elections in Mecklenburg-Vorpommern, Sachsen-Anhalt, Rheinland-Pfalz, Baden-Württemberg and Berlin had foreshadowed it. The British referendum, the Austrian presidential election and the outcome of the Italian referendum had confirmed it. But the presidential and congressional elections in the United States made it evident even to the blind.

Even with total liberal control of the media, of television and of the newspapers, these had failed to sway voters’ intentions.

Certainly the tedious repetitiveness of the single orthodox line, expressed moreover in the terminology of the ‘politically correct’, had saturated ordinary people, and the heap of implausible lies had overflowed; but a new factor, entirely beyond control, had also intervened: the internet and social media.

* * * * * * * *

«a 41.6% drop in print circulation in the United States since 2005».

«Since 2006 advertising revenues have more than halved, to $22.3 billion»

Running a newspaper is expensive, and most major outlets close their books at a loss. Journalists are a high cost. Copies are not sold at prices that would allow wide circulation. Digital editions are almost invariably behind a paywall.

Nor is that all. The United States has about 324 million inhabitants. According to Poynter, the Wall Street Journal prints 2.1 million copies a day, USA Today 1.8 million, the New York Times 1.2 million, the New York Daily News 600,000, the Los Angeles Times 601,000. The Washington Post prints no more than 500,000 copies.

Roughly speaking, one American in a hundred reads a daily newspaper, and usually reads only part of it.

Nor is that all. Reading a daily newspaper is almost entirely the preserve of those with a university degree or higher.

In very few words: the print press is self-referential, and the average American cannot be influenced by it for the simple reason that he does not read it.

*

The data on television are even more disheartening. While soap operas are widely watched, even the evening news draws only a modest audience. Televised debates on social or political topics are the most frequent trigger for changing channels: over 90% of their viewers are people actively involved in politics. Ordinary people snub them.

Here too the television medium is self-referential and reaches the so-called Joe the Plumber only in minimal numbers.

*

Mr Donald Trump bet little or nothing on newspapers and television; Mrs Hillary Clinton, by contrast, spent a fortune on appearances in these media, with the results we have all been able to observe. Her messages never reached the final target: the average voter.

* * * * * * * *

Internet and social media.

Mr Trump bet instead on social media, and won the election. He spoke directly to the great majority of voters.

First, social media are inexpensive. They are free to use, whether one leaves comments or sets up pages and/or groups.

Second, social media posts typically run to a few lines, written almost invariably in everyday language, without circumlocutions: their content is accessible even to those with an elementary education. One concept at a time.

Third, reach.

«Facebook has over 1.5 billion active users worldwide».

In the United States three people out of four follow social media.

President Trump’s tweets reach on average 40 million readers, with peaks of 60 million. An enormity compared with the newspapers.

During the election campaign, readership had exceeded 120 million.

«Other opinions and related information get filtered out – a consequence of Facebook’s increasing function as the primary source of information on current events for many of its users. So they have little chance of forming well-rounded opinions»

* * * * * * * *

The problem of the liberal elites.

«Fake news, propaganda and “disinformatzya” are changing the media landscape – in the US, Russia and Turkey and across the world. The question is how to combat them»

Let us immediately clarify a lexical point.

For American liberal Democrats and for European ideological socialists, anything whatsoever that contradicts their Weltanschauung is ascribed to “fake news”, “propaganda” or “disinformatzya”.

They have therefore elaborated the concept of the “filter bubble”.

Comme d’habitude, they present it all as an enormous favor done to social media users and, because they are so very good and generous, to Google users as well.

Dedicated software analyzes and stores every choice made, that is, every site visited. In other words, a complete and perpetual record is kept of everything a person has read or watched.

The software then offers the user a set of choices calibrated on that history.

It might seem something worthy of Saint Teresa of Calcutta, but it is in fact a doubly poisoned apple.

– Everyone is tracked. This hardly seems to match the common notion of privacy.

– Gradually the reader becomes trapped within the narrow circuit of sites or pages he visits. It becomes difficult for him to weigh differing opinions. In the end, his mind is conditioned.
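The mechanism described in these two points can be sketched in a few lines of code: a ranker that scores each new item by how often the user has already engaged with its topic, so that past choices increasingly dominate what is shown next. This is a toy illustration of the general technique, not Facebook’s actual algorithm; all names here (FilterBubbleRanker, record_click, and so on) are invented for the example.

```python
from collections import Counter


class FilterBubbleRanker:
    """Toy preference-reinforcing ranker: items whose topic the user
    has clicked on before are pushed to the top of the feed, so
    dissenting topics gradually sink out of sight."""

    def __init__(self):
        self.history = Counter()  # topic -> number of past clicks

    def record_click(self, topic):
        # Perpetual record of every choice the user has made
        self.history[topic] += 1

    def rank(self, items):
        # items: list of (title, topic) pairs; highest past
        # engagement with a topic ranks its stories first
        return sorted(items, key=lambda item: self.history[item[1]],
                      reverse=True)


ranker = FilterBubbleRanker()
for _ in range(5):
    ranker.record_click("politics-right")   # heavy past engagement
ranker.record_click("politics-left")        # a single stray click

feed = ranker.rank([("Story A", "politics-left"),
                    ("Story B", "politics-right"),
                    ("Story C", "sports")])
# The feed now leads with the topic the user already favors.
```

Note the feedback loop: every click on a top-ranked story further raises that topic’s score, which is exactly why the circuit of visible pages keeps narrowing.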

* * * * * * * *

Frankly, it seems legitimate to ask whether technological progress really corresponds to human progress as well.


Deutsche Welle. 2017-02-07. Facebook’s filter bubble. [Video]

Facebook has over 1.5 billion active users worldwide. For many of them, the social network is their primary source of information – which can result in a limited selection that only reinforces pre-existing views.

*

Most Facebook users tend to network with like-minded people. Now experts are warning this could result in what they call a filter bubble – a limiting of content to only what reinforces the user’s own pre-existing views. Other opinions and related information get filtered out – a consequence of Facebook’s increasing function as the primary source of information on current events for many of its users. So they have little chance of forming well-rounded opinions.


Deutsche Welle. 2017-02-07. Fake news is a red herring.

Fake news, propaganda and “disinformatzya” are changing the media landscape – in the US, Russia and Turkey and across the world. The question is how to combat them.

*

Watching the 2016 US presidential election was already a surreal experience, as dozens of qualified candidates lost out to a failed businessman and reality television star. But the strangeness of the election was complicated by news stories that seemed just plausible enough to be true: a papal endorsement of Donald Trump, the fiery suicide of an FBI agent investigating Hillary Clinton’s emails, Black Lives Matter as an attempt to create a race war in the US.

As you likely know, these stories aren’t true, though they did circulate widely on Facebook and other social media sites. “Fake news” and its detrimental effects on democracy have become a major theme in contemporary politics. Faced with questioning from CNN reporter Jim Acosta during his first press conference in six months, President-elect Donald Trump refused to take Acosta’s question, declaring, “You are fake news.”

Trump’s evasion referenced his anger at CNN for reporting on an intelligence dossier that suggests Russian authorities have been compiling compromising information on Trump in the hope of blackmailing him. CNN did not reproduce the dossier (online news outlet Buzzfeed did), but the president-elect was incensed that CNN would call attention to the story based on unverified documents. 

Different types of fake news

It’s tempting to say that Trump is using “fake news” to mean “news I don’t like”, but the reality is more complicated. “Fake news,” in this usage, means “real issues that don’t deserve as much attention as they’re receiving.” This form of fake news was likely an important factor in the 2016 campaign. There’s a compelling argument that the release of Clinton and Podesta’s emails by Russian hackers – and the media firestorm that ensued – were key to the outcome of the US election. While media outlets overfocused on the non-scandal of the emails, this wasn’t “fake news” so much as it was “false balance,” with newspapers playing up a Clinton “scandal” to counterbalance an endless sequence of Trump scandals.

There’s another type of “fake news” that surfaces during virtually every political campaign: propaganda. Propaganda is weaponized speech that mixes truthful, deceptive and false speech, and is designed explicitly to strengthen one side and weaken the other. Propaganda has been around for a long time, preceding the era of mass media. (Some scholars argue that the inscriptions on ancient Roman coinage should be understood as propaganda, designed to strengthen an emperor’s rule over a massive territory.) Propaganda may be an inevitable feature of electoral contests, and vicious propaganda campaigns, such as the “swiftboating” of Senator John Kerry, proved effective even before the age of social media. But tools such as Twitter and Facebook may make propaganda harder to detect and debunk. Many citizens are skeptical of claims made by politicians and parties, but are less apt to question news shared by their friends. On a medium like Facebook which gives primacy to information shared by friends, political propaganda spreads rapidly, reaching a reader from all sides, and can be difficult to distinguish from fact-based news.

A third category of “fake news,” relatively new to the scene in most countries, is disinformatzya. This is news that’s not trying to persuade you that Trump is good and Hillary bad (or vice versa). Instead, it’s trying to pollute the news ecosystem, to make it difficult or impossible to trust anything. This is a fairly common tactic in Russian politics and it’s been raised to an art form in Turkey by President Tayyip Erdogan, who uses it to discredit the internet, and Twitter in particular. Disinformatzya helps reduce trust in institutions of all sorts, leading people either to disengage with politics as a whole or to put their trust in strong leaders who promise to rise above the sound and fury. The embrace of “fake news” by the right wing in America as a way of discrediting the “mainstream media” can be understood as disinformatzya designed to reduce credibility of these institutions – with all the errors news organizations have made, why believe anything they say?

One of the best known forms of disinformatzya is “shitposting,” the technique of flooding online fora with abusive content, not to persuade readers, but to frustrate anyone trying to have a reasonable discussion of politics on the internet. Disinformatzya may also explain some of the strangest phenomena of the election season, including Pizzagate, the bizarre conspiracy that led a man to “investigate” a pizza parlor with an assault rifle out of the belief – expounded and developed in thousands of online posts – that John Podesta and Hillary Clinton were trafficking children out of the basement.

No simple answers

What can we do about news so toxic that it moves people to take up arms to investigate conspiracies? Unfortunately, the simple answers are inadequate, and some are downright counterproductive. Instead, any successful approach to fake news demands that we treat these three different diseases with different techniques.

Unbalanced news is a pre-digital problem that’s become worse in the digital age. News organizations would overfocus election coverage on the horse race and underfocus on policy issues well before the internet. Add in an explosion of ad-driven news sites and the ability to choose what we pay attention to and you’ve got a recipe for echo chambers. Mix in algorithmic filtering, where social media platforms try to deliver us the information we most want to see, and you’ve got filter bubbles. Scholarship on echo chambers and filter bubbles suggests that people who are informationally isolated become more partisan and less able to compromise, suggesting a rough road ahead for deliberative democracy.

Solving the problem of sensationalistic, click-driven journalism likely requires a new business model for news that focuses on its civic importance above profitability. In many European nations, public broadcasters provide at least a partial solution to this problem – in the US, a strong cultural suspicion of government involvement with news complicates this solution. A more promising path may be to address issues of filtering and curation. Getting Facebook to acknowledge that it’s a publisher, not a neutral platform for sharing content, and that its algorithmic decisions have an impact would be a first step towards letting users choose how ideologically isolated or exposed they want to be. Building public interest news aggregators that show us multiple points of view is a promising direction as well. Unbalanced news is a problem that’s always been with us, dealt with historically by shaping and adhering to journalistic standards – it’s now an open question whether social media platforms will take on that responsibility.

Fighting propaganda and disinformatzya

Fighting propaganda, particularly fact-free propaganda, is a tougher challenge. Many people find it infuriating to see Trump repeatedly claim that he won a landslide victory in the Electoral College when his win was one of the narrowest in history. Unfortunately, conventional fact checking does not counter propaganda very well – counter a claim and people remember the original claim, not the debunking of it. Even with debunking, the original claim remains on the internet, where motivated reasoning helps us select the claims that are consonant with our values, not with truth. 

There are two answers most often proposed for this problem and both are bad. While it seems logical to ask platforms such as Facebook to filter out fake news, it’s dangerous to give them the power to decide what speech is and is not acceptable. Furthermore, Facebook is already trying to solve the problem by asking users to flag fake news, a technique unlikely to work well, as researcher Robyn Caplan points out, because users are really bad at determining what news is fake. So perhaps the solution is to teach media literacy, so that readers become savvier about identifying and debunking propaganda. Unless of course, as social media scholar danah boyd suggests, media literacy is part of what’s gotten us into this mess. By teaching students to read news critically and search for stories from multiple sources, we may have turned them away from largely credible resources and towards whatever Google search results best fit their preconceptions of the world. 

Surprisingly, our best bets for fighting propaganda may come from a return to the past. Stanford historian Fred Turner wrote a brilliant book, “The Democratic Surround,” on how US intellectuals had tried to fight fascist propaganda in the 1940s through reinforcing democratic and pluralistic values. Rather than emphasizing critical reading or debate, the thinkers Turner documents designed massive museum installations intended to force Americans to wrestle with the plurality and diversity of their nation and the world. While exhibits such as “The Family of Man” might be an impossibly dated way to combat fake news, the idea of forcing people to confront a wider world than the one they’re used to wrestling with goes precisely to the root of the problems that enable fake news.

Even scarier than unbalanced news and propaganda is disinformatzya, for the simple reason that no one is really sure how it works. In an essay called “Hacking the Attention Economy,” Boyd suggests that the masters of disinformatzya are the denizens of online communities like 8chan and reddit, where manufacturing viral content is a form of play that’s been recently harnessed to larger political agendas. Understanding whether a phenomenon like Pizzagate is simply a strange moment in a strange election, or a masterful piece of disinformatzya designed to reduce confidence in media and other institutions, is a topic that demands both aggressive reporting and scholarly study. At this point, the task of understanding this breed of fake news has barely registered on the radar of journalists or scholars.

Fake news is a satisfying bogeyman

Harvard scholar Judith Donath suggests that combating any sort of fake news requires an understanding of why it spreads. She sees these stories as a marker of group identity: “When a story that a community believes is proved fake by outsiders, belief in it becomes an article of faith, a litmus test of one’s adherence to that community’s idiosyncratic worldview.” Once we understand these stories less as claims of truth and more as badges of affiliation, attacking them head on no longer seems as savvy. If these stories are meant less to persuade outsiders, and more to allow insiders to show their allegiance to a point of view, combating their spread as if they were infections no longer seems like a valid strategy.

I suspect that both the left and the right are overfocusing on fake news. Preliminary analysis conducted by the Media Cloud team at MIT and Harvard suggests that while fake news stories spread during the 2016 US election, they were hardly the most influential media in the dialog. In tracking 1.4 million news stories shared on Facebook from over 10,000 news sites, the most influential fake news site we found ranked 163rd in our list of most shared sources. Yes, fake news happens, but its impact and visibility come mostly from mainstream news reporting about fake news.

Fake news is a satisfying bogeyman for people of all political persuasions, as it suggests that people disagree with us because they’ve been spoon-fed the wrong set of facts. If only we could get people to read the truth and see reality as we see it, we could have consensus and move forward! 

The truly disturbing truth is that fake news isn’t the cause of our contemporary political dysfunction. More troublingly, we live in a world where people disagree deeply and fundamentally about how to understand it, even when we share the same set of facts. Solving the problems of fake news make that world slightly easier to navigate, but they don’t scratch the surface of the deeper problems of finding common ground with people with whom we disagree.


Deutsche Welle. 2017-02-07. What goes on in a far-right Facebook filter bubble?

People tend to surround themselves with like-minded people – filter bubbles have taken that to a new level. Two German reporters were shocked when they entered the world of the far-right on Facebook via a fake account.

*

What goes on in far-right filter bubbles on Facebook?

To find out first-hand, two TV reporters for Germany’s ZDF broadcaster created a fake account – 33-year-old “Hans Mayer,” a proud German patriot with a clear penchant for right-wing topics. They encountered a world of closed groups, hatred, lies and agitation.

“Mayer,” the reporters quickly learned, was surrounded by many like-minded men and women in a filter bubble that had little to do with reality and where objections never stood a chance. A filter bubble results from a personalized search and a website’s algorithm selecting information a user might want to see, withholding information that disagrees with his or her viewpoints.

Virtual expedition

These filter bubbles are a “great threat to democracy,” ZDF reporter Florian Neuhann says. He and his colleague David Gebhard had an idea of what went on in far-right filter bubbles, Neuhann told DW, but were “totally taken aback by the speed at which their fake account accumulated Facebook friends and the utter absurdity of the stories being spread.”

People in filter bubbles focus their hatred on the same person or phenomenon – like Chancellor Angela Merkel or refugees – and they whip each other into a frenzy to outdo one another with abuse, explains Wolfgang Schweiger, a communication scientist at Hohenheim University.

On day three of the experiment, “Hans Mayer’s” timeline brimmed with fake news and lurid headlines: stories about government plans to take away the children of right-wing critics, a report stating that the city of Cologne canceled its carnival celebrations for fear of refugees, fake Merkel quotes – all shared thousands of times. The reports often followed a pattern, with an actual source hidden somewhere in the story that had dealt with the issue on hand, however remotely.

Worldwide, populists benefit from such activities; their supporters rarely challenge the “facts” they are presented.

Alarming radicalization

Humans, Schweiger says, tend to believe information passed on by those who hold the same or similar views they do.

A week into the experiment, “Mayer” had many friends on Facebook and was invited into closed groups where users openly urged resisting the system. Forget inhibitions: Interspersed between cute cat photos and pictures of weddings, posts would read “Shoot the refugees, stand them in front of a wall and take aim,” while others denied the Holocaust. No one objected.

Blind to other people’s views

By day 12, “Mayer” had 250 Facebook friends – real people who never met him in person but felt he shared their beliefs. Neuhann and Gebhard wondered what would happen if “Mayer” were to pipe up and disagree.

So they posted official statistics showing that crime rates have not risen despite the influx of hundreds of thousands of refugees into Germany. To no avail, Neuhann says: “We were either ignored or insulted.”

It’s a parallel world, Neuhann says. Part of the bubble is so far gone there is no way reasonable arguments can reach them, he adds, arguing that some people are only partially involved. They still have a life and maybe a job, so they might be approachable, though “perhaps not as much online as offline.”

Asked whether the reporters are afraid now that their story is out in the open, Neuhann says no, since “Hans Mayer” wasn’t the fake account’s real name.

It hasn’t been deactivated, but the journalists broke off their experiment after three weeks. The right-wing filter bubble continues to exist.
