About Polarization, News Dissemination and Social Media Perspectives

Let me start off with a backgrounder before I get down to some generalized observations.

The Facebook post where I made the generalized comments concerned the particular episode referred to and linked below. I too had come across reports of this incident. For Indian news in particular, I would typically check and skim 2-3 sites from a shortlist of what I consider reasonably credible sources, both Indian & international. I have been reading most of my news over the internet for almost two decades now, and news aggregator apps, which have got progressively better over the years, meet my needs beautifully once I customize & tweak them to show what I typically want to see and read.

Getting back to this item: after I skimmed a couple of sources carrying the story, I was none the wiser, since there were contradictions and inherent conflicts in the versions I perused. I saw no need to share it because a) I'm not a newscaster or a journalist tasked with propagating news on SM networks, and b) incidents of this kind are a dime a dozen in Indian news, and India has such a cavalier, almost indifferent attitude about loss of life through accidents, calamities or planned attacks that they fade fast (even the news about 60 men, women & children being mowed down by an onrushing train while they lounged on train tracks watching an annual Ram Lila event disappeared after about 2-3 news cycles). Such incidents usually do not form part of the collection of topics I am interested in sharing, along with my observations & insights for whatever they are worth.
8-year-old madrasa student killed during brawl in South Delhi

I noticed that in the highly polarized, trust-deficit & hate-filled environment that India's Modi administration has created so successfully over the 4.5 years of its existence, this unfortunate incident quickly took on communal, religious and minority-persecution overtones on SM. Could some of it be true? As I said, I really have no idea, because while news items like the one linked above don't say so expressly, some other reports contradict this version somewhat. On to the general observations below:

Some of the posts and comments that SM users may be seeing about this incident are typical of the ugly underbelly of the internet and social media networks. I have no knowledge of the episode alluded to, other than some media reports which seemed conflicting to me. However, instant expertise, punditry, judgements and (conspiracy) theories on most topics are rampant across the internet & SM networks, and they often act as the trigger for misinformation & fake news. In a polarized environment, fact checks, even if publicized widely, have limited effectiveness because most people subconsciously believe what they want to believe in the first place.

There is substantial research and analysis showing that instead of making people happy, SM often has the opposite effect of making them miserable. Also, those revelling in the ripple effect of 'likes', reshares & comments don't quite realize that the algorithms governing the news feed ensure that even a public post is visible to barely 10% of a user's followers, unless the user chooses to 'promote' the post.

In the recently published book 'Antisocial Media: How Facebook Disconnects Us and Undermines Democracy', Professor Siva Vaidhyanathan, Director of the Center for Media and Citizenship at the University of Virginia, critiques the growth and evolution of Facebook and the threat it poses to us all. Select excerpts below detail this further.

'Facebook was founded by an undergraduate with good intentions but little understanding of human nature. He thought that by creating a machine for “connecting” people he might do some good for the world while also making himself some money. He wound up creating a corporate monster that is failing spectacularly at the former but succeeding brilliantly at the latter. Facebook is undermining democracy at the same time as it is making Mark Zuckerberg richer than Croesus. And it is now clear that this monster, like Dr Frankenstein’s, is beyond its creator’s control.'

'There are, says Vaidhyanathan, “two things wrong with Facebook: how it works and how people use it”. It works by monitoring its users – hoovering up their data trails and personal information in order to paint virtual targets on their back at which advertisers (Facebook’s real customers) can take aim. People use it for all kinds of things, many of them innocuous, but some of them absolutely pernicious: disseminating hate speech that leads to ethnic cleansing in Myanmar, for example; spreading white supremacist propaganda in the US or Islamophobic or antisemitic messages in innumerable countries, and so on. People also use it to try to influence democratic elections, to threaten and harass others, to spread fake news, publish revenge porn and perform a host of other antisocial acts.'

'Facebook “farms” its users for data: the more they produce – the more “user engagement” there is, in other words – the better. Consequently, there is an overriding commercial imperative to increase levels of engagement. And it turns out that some types of pernicious content are good for keeping user-engagement high: fake news and hate speech are pretty good triggers, for example. So the central problem with Facebook is its business model: the societal downsides we are experiencing are, as programmers say, a feature, not a bug.'

'What to do about this corporate monster is one of the great public policy questions of our day. The company has 2.2 bn users worldwide. While it may be good (or at least enjoyable) for individuals, we now have clear evidence that it’s not that good for democracy. It has no effective competitors, so it’s a monopoly – and a global one at that. And, given its business model, it has no incentive to reform itself. So what can be done about it?'

The complete review of the book in 'The Guardian' can be read by taking the jump to the link just below:

A critique of the social media giant underlines the threat it poses to us all – and suggests how it can be tamed

Given below are select excerpts from an interview Prof. Vaidhyanathan gave to 'The Washington Post':
'Facebook is in the social engineering business. It constantly tries to manipulate our experience and, thus, our perspective on our friends, issues and the world. It does so haphazardly and incoherently, it seems at first. But, in fact, there is a coherent driving force. Facebook wants to maximize something close to “happiness.” It has fallen under the sway of those who believe one can measure affective states and make changes that can increase satisfaction or joy. It turns out that Bentham’s Panopticon was not his major influence on 21st-century digital culture. It was the idea of maximizing happiness by counting “hedons,” or units of pleasure. Well, you can only dial up something you can count. You can’t really count happiness. So you count a proxy.'

'For Facebook, that proxy is “engagement,” the number of clicks, shares, “likes” and comments. If a post or a person generates a lot of these measurable actions, that post or person will be more visible in others’ News Feeds. You can already see how this could go wrong. Unsurprisingly, items advocating hatred and bigotry, conspiracy theories or wacky health misinformation generate massive reactions — both positive and negative. A false post about the danger of vaccines would generate hundreds of comments, most of them arguing with the post. But the very fact of that “engagement” would drive the post to more News Feeds. That’s why you can’t argue against the crazy. You just amplify the crazy. Such are algorithms and feedback mechanisms.'
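The feedback mechanism described above can be illustrated with a toy ranking function. This is a hypothetical sketch, not Facebook's actual algorithm (the post names and numbers are invented): it simply scores each post by its total measurable engagement, blind to whether the reactions are approving or critical, and shows how arguing with a false post still pushes it up the feed.

```python
# Toy model of engagement-based feed ranking (hypothetical; not
# Facebook's real algorithm). Every measurable action counts toward a
# post's score, whether the reaction is positive or negative.

from dataclasses import dataclass, field

@dataclass
class Post:
    title: str
    likes: int = 0
    shares: int = 0
    comments: list = field(default_factory=list)

    def engagement(self) -> int:
        # The proxy metric: likes, shares and comments are all counted.
        # The *sentiment* of a comment is invisible to the score.
        return self.likes + self.shares + len(self.comments)

def rank_feed(posts):
    # Higher engagement -> shown earlier, and to more News Feeds.
    return sorted(posts, key=lambda p: p.engagement(), reverse=True)

vaccine_hoax = Post("Vaccines cause X (false)", likes=20, shares=15)
cat_photo = Post("My cat", likes=120)

# 300 users "argue against the crazy" -- every rebuttal is still engagement.
for i in range(300):
    vaccine_hoax.comments.append(f"This is wrong, here's why #{i}")

feed = rank_feed([vaccine_hoax, cat_photo])
for post in feed:
    print(post.title, post.engagement())
# The false post now outranks the benign one purely on volume of reactions.
```

The point of the sketch is the design flaw Vaidhyanathan identifies: because the ranking key counts reactions rather than interpreting them, rebuttals and outrage are indistinguishable from endorsement, and the feedback loop amplifies exactly the content people argue with most.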

The article can be read in full by taking the jump to the link below: