What is the most dangerous trend in Social Media

Hannah Murphy, Financial Times tech correspondent, writes that in May this year, Facebook casually invited her to join a conspiracy cult that believes the world is controlled by a Satan-worshipping, baby-eating, deep-state coterie and can only be saved by US president Donald Trump. The message was:

“Join groups to connect with people who share your interests,” the social media network implored in a recommendation email. Below was a suggestion that Hannah had become part of a 135,000-strong Facebook group called “QAnon News & Updates – Intel drops, breadcrumbs, & the war against the Cabal”.

QAnon is an outlandish far-right conspiracy theory; in essence, an anonymous individual "Q" is drip-feeding believers "classified" information about Trump's fight against a diabolical collective of Democrats and business elites. As QAnon has ballooned, it has taken on menacing undertones: followers, calling themselves "digital warriors", are encouraged to take an oath to "defend" the US constitution. Last year, the FBI labelled fringe political conspiracies, QAnon included, a domestic extremist terror threat.

QAnon was once a fringe group, but in recent months it has spread like a cancer from the margins of internet culture to become a mainstream phenomenon – President Trump himself has publicly praised the group for its support – and a topic of consternation for observers of the US presidential election. That is a problem not only for Facebook but for society as a whole.

What is particularly jarring is that this is history repeating itself: once again, short-sightedness from Silicon Valley has allowed extremist thinking to flourish.

In 2018, former YouTube staffer Guillaume Chaslot criticised the video site's recommendations algorithm for pushing some users down a conspiracy-theory rabbit hole. YouTube, owned by Google, recommends videos to its viewers, and these recommendations generate 70 per cent of views on the platform. They are crafted to keep you engaged for as long as possible, creating more opportunities to serve advertisers. In practice, this can mean repeatedly showing you similar content, deepening the biases you might already have. These are blind spots in the business model. The company promised in 2019 to do more to downrank the biggest conspiracy theories, though critics say it has yet to convincingly solve the problem.
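The tension between engagement-optimised ranking and "downranking" can be sketched in a few lines of code. This is a hypothetical toy illustration, not YouTube's or Facebook's actual algorithm: all names, scores, and the penalty factor are invented for the example. Candidate items are sorted by a predicted engagement score, and items flagged as questionable keep only a fraction of that score, so they surface less often without being removed outright.

```python
# Toy sketch of engagement-ranked recommendations with a downranking
# penalty for flagged content. All identifiers and numbers here are
# hypothetical; real platform ranking systems are far more complex.

FLAG_PENALTY = 0.2  # flagged items retain only 20% of their score


def rank(items, flagged):
    """Sort candidate items by predicted engagement, downranking flagged ones."""
    def score(item):
        s = item["engagement"]  # e.g. predicted watch time or click probability
        if item["id"] in flagged:
            s *= FLAG_PENALTY   # downrank rather than remove outright
        return s
    return sorted(items, key=score, reverse=True)


candidates = [
    {"id": "cooking-tips", "engagement": 0.40},
    {"id": "conspiracy-doc", "engagement": 0.90},  # highly engaging content
    {"id": "news-update", "engagement": 0.55},
]

# With no flags, the most engaging (conspiracy) item is recommended first.
print([i["id"] for i in rank(candidates, flagged=set())])
# → ['conspiracy-doc', 'news-update', 'cooking-tips']

# Once flagged, the same item drops below ordinary content.
print([i["id"] for i in rank(candidates, flagged={"conspiracy-doc"})])
# → ['news-update', 'cooking-tips', 'conspiracy-doc']
```

The sketch shows why pure engagement optimisation favours sensational content, and why a "lower bar" for flagging, as argued below, matters: the penalty only helps once an item has actually been identified.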

So what had prompted Facebook's QAnon advances towards its users? The email linked to Hannah's work Facebook page, which she uses to monitor posts and live streams from Mark Zuckerberg and other Facebook executives. According to her search history, she had looked up the phrase "QAnon" several days earlier, likely triggering the recommendations algorithm.

By design, Facebook's algorithms seem no less toxic and stubborn today than YouTube's back then. Permitting dangerous theories to circulate is one thing, but actively contributing to their proliferation is quite another. In my own experience, I too have received suggestions to join rightist and extreme religious groups; you may have faced a similar situation.

Facebook’s internal research in 2016 found that 64 per cent of new members of extremist groups had joined due to its recommendation tools. Its QAnon community grew to more than four million followers and members by August, up 34 per cent from about three million in June, according to The Guardian newspaper. 

It appears that, due to media activism, Facebook has since made moves to clamp down on QAnon. Last month, it announced plans to remove any pages discussing violence, but this week it said it would cull QAnon groups altogether.

Still, it is alarming that Facebook took action only three years after the theory was born, particularly since Zuckerberg has announced a shift from an open, friends-focused social network towards hosting more walled-off, private interest-based groups.

There is no denying such groups pose unique challenges. Flagging and taking down foreign terrorist groups such as Isis is a fairly unambiguous exercise. But how does one rank conspiracy theories? Can an algorithm assess where collective paranoia ends and a more violent conspiracy theory begins – and what is the appropriate response if it can?

Companies like Facebook claim to be innovating and delivering the future. But they do not seem able to escape their past, which dangerously affects the present generation.

With its deep pockets, Facebook should have the expertise for fiercer monitoring of its public and private groups and its recommendations algorithms, and a lower bar for downranking questionable conspiracy theory content – to catch, rather than help create, the next QAnon. Perhaps the time has come for the internet's business giants themselves to take the initiative in addressing the unintended consequences of business models that adversely affect society.


