Professor Emilio Ferrara, University of Southern California, investigates the dynamics of social influence within a group of ISIS supporters.
Recent terrorist attacks carried out on behalf of ISIS on American and European soil by lone-wolf attackers or sleeper cells remind us of the importance of understanding the dynamics of radicalization mediated by social media communication channels. In this paper, we shed light on the social media activity of a group of twenty-five thousand users whose association with ISIS online radical propaganda has been manually verified. Using a computational tool known as dynamic activity-connectivity maps, based on network and temporal activity patterns, we investigate the dynamics of social influence within ISIS supporters. Finally, we quantify the effectiveness of ISIS propaganda by determining the adoption of extremist content in the general population, and we draw a parallel between radical propaganda and the spread of epidemics, highlighting that information broadcasters and influential ISIS supporters generate highly infectious cascades of information contagion. Our findings will help generate effective countermeasures to combat the group and other forms of online extremism.
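The abstract describes placing users on a map defined by temporal activity and network connectivity. As a rough illustration of the idea, the sketch below assumes activity is measured as posts per active day and connectivity as the number of distinct interaction partners; these simplified definitions are assumptions for demonstration, not the paper's exact formulation.

```python
from collections import defaultdict

def activity_connectivity_map(events):
    """Place each user on a 2-D map:
    x = activity (posts per active day), y = connectivity (number of
    distinct interaction partners). Both measures are illustrative
    assumptions. `events` is a list of (user, day, partner) tuples,
    with partner set to None for posts with no interaction target."""
    tweets = defaultdict(int)     # post count per user
    days = defaultdict(set)      # days on which each user was active
    partners = defaultdict(set)  # distinct users interacted with
    for user, day, partner in events:
        tweets[user] += 1
        days[user].add(day)
        if partner is not None:
            partners[user].add(partner)
    return {u: (tweets[u] / len(days[u]), len(partners[u]))
            for u in tweets}

# Tiny synthetic example: a "broadcaster" posting often to many users
# versus a low-activity, low-connectivity account.
events = [
    ("broadcaster", 1, "a"), ("broadcaster", 1, "b"),
    ("broadcaster", 2, "c"), ("broadcaster", 2, "d"),
    ("lurker", 1, None),
]
amap = activity_connectivity_map(events)
print(amap["broadcaster"])  # (2.0, 4): high activity, high connectivity
print(amap["lurker"])       # (1.0, 0)
```

On such a map, information broadcasters would sit in the high-activity, high-connectivity region, which is where the paper locates the most infectious propaganda sources.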
Researchers in the computational social science community have recently demonstrated the importance of studying online social networks to understand our society. Powerful new technologies are usually harbingers of abuse, and online platforms are no exception: social media have been shown to be systematically abused for nefarious purposes. As online social environments yield plenty of incentives and opportunities for unprecedented, even “creative”, forms of misuse, single individuals as well as organizations and governments have systematically interfered with these platforms, oftentimes driven by a hidden agenda, in a variety of reported cases:
During crises, social media have been used effectively for emergency response; but fear-mongering has also triggered mass hysteria and panic [32,23].
Political conversation has been manipulated by means of orchestrated astroturf campaigns [50,45] even during election times [34,14].
Anti-vaccination movements [58,60], as well as conspiracy (and other anti-science) theorists [13,21], took social media by storm and became responsible for a major health crisis in the United States.
Social media bots (automated, non-human accounts) have been used to coordinate attacks that successfully manipulated the stock market, causing losses in the billions of dollars [35,23,24,62].
Some governments and non-state actors have been active on social media to spread their propaganda. In some cases, they have allegedly “polluted” these platforms with content to sway public opinion [23,22,37], or to hinder the ability of social collectives to communicate, coordinate, and mobilize.
Especially in relation to the last point, researchers have recently been devoting more attention to issues related to online propaganda campaigns [55,51,6]. Increasing evidence provided by numerous independent studies suggests that social media played a pivotal role in the rise in popularity of the Islamic State of Iraq and al-Sham (viz. ISIS) [28,56,66,25,8]. Determining whether ISIS benefited from using social media for propaganda and recruitment has been central to many research endeavors [11,9,44].
Analyses by Berger and Morgan suggested that a restricted number of highly active accounts (500-1000 users) is responsible for most of ISIS’ visibility on Twitter. However, Berger’s subsequent work suggested that ISIS’ reach (at least among English speakers) had stalled for months as of the beginning of 2016, due to more aggressive account-suspension policies enacted by Twitter. Other researchers have tried to unveil the roots of support for ISIS, suggesting that ISIS backers discussed the Arab Spring uprisings on Twitter significantly more than users who stood against ISIS.
These early investigations all share one common methodological limitation: to collect social media data, they start from keywords known to be associated with ISIS [9,10,44]. This strategy has been widely adopted in previous research aimed at characterizing social movements [30,18,63]. However, we argue that focusing on keyword-based online chatter is not sufficient to pinpoint the relevant actors of radical conversation.
In fact, our recent results suggest that radical propaganda revolves around four independent types of messaging: (i) theological and religious topics; (ii) violence; (iii) sectarian discussion; and (iv) influential actors and events. The following examples illustrate the biases that a keyword-centric approach can introduce:
• Some studies focused on religion-based keyword lists, but most terms typically associated with religion are not necessarily used in the context of extremism.
• Other studies [9,10] focused on influential actors or events; this can introduce biases due to the focus on popular actors rather than on the overall conversation.
• Further noise can be introduced by tweets that simply link to news articles reporting on events; although these tweets may contain keywords in the predefined watchlist, they clearly do not represent extremist propaganda efforts.
• Finally, focusing on pre-determined keywords can yield incomplete data collection, missing topics of discussion that emerge dynamically and do not adopt any of the predefined key terms.
In this work, we will leverage an alternative data collection and curation approach: we will start from a large set of Twitter users known to be associated with ISIS or to be sympathizers of the group. We will then collect their activity over a span of more than one year to obtain a complete characterization of their extremist propaganda efforts.
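The contrast with keyword-based collection can be sketched as follows. This is a minimal illustration, not the paper's pipeline: `fetch_timeline` is a hypothetical callable standing in for a real platform API client, and the seed list, dates, and posts are invented for demonstration.

```python
from datetime import date

def collect_user_activity(seed_users, fetch_timeline, start, end):
    """User-centric collection: start from accounts already verified
    as associated with the radical group and gather *all* of their
    posts in the study window, rather than filtering a keyword stream.
    `fetch_timeline` is a hypothetical callable mapping a user to an
    iterable of (post_date, text) pairs."""
    corpus = {}
    for user in seed_users:
        corpus[user] = [(d, text) for d, text in fetch_timeline(user)
                        if start <= d <= end]
    return corpus

# Toy stand-in for an API client, for demonstration only.
def fake_timeline(user):
    return [(date(2015, 3, 1), "post inside study window"),
            (date(2016, 9, 1), "post outside study window")]

corpus = collect_user_activity(["user_a"], fake_timeline,
                               date(2015, 1, 1), date(2016, 1, 31))
print(len(corpus["user_a"]))  # 1: only the in-window post is kept
```

Because the seed set is fixed and manually verified, every post the users produce in the window is retained, including discussion that never mentions any predefined key term.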
Emilio Ferrara, USC Information Sciences Institute, 4676 Admiralty Way #1001, Marina del Rey, CA (USA) 90292; E-mail: email@example.com