This post addresses very specific questions about the black market of automated engagement on Instagram, giving special attention to applications created to support the practice of botting, and to the functioning of botted accounts. I have not spent two years botting on Instagram, but a couple of weeks were enough to discover a great deal.

My “bot” journey started with the study of hashtag engagement on Instagram. I was exploring the workers’ and the conservative protests in Brazil in March 2017. A list of hashtags was defined based on key terms adopted by Brazilians, the protests’ organisers and political parties [1]. Far-right accounts first popped up during visual network analysis, in the most active users’ list [2], and subsequently when scrutinising the tags correlated with the conservative protests [3]. As a result, I started asking myself: what role could these automated beings play in Instagram engagement? And, going further, how could botted accounts impact the Brazilian general elections in 2018? The questions do not end there. How do we know whether Instagram accounts, and consequently their activity, are generated by humans or bots? Is it even important to draw this distinction? How do social media methods work if we take into account the agency of political insta bots?

In an attempt to answer these questions, I will first map and briefly analyse the network of apps similar to “Instagram bots”. The second step is to report the practical functioning of these apps, understanding the logic of getting “coins” in exchange for engagement (e.g. boosting likes and comments or getting more followers). Following this reasoning, I will present some strategies to detect botted accounts and expose their particular characteristics.

Key Findings:

˚ ˚ ˚ “Instagram Bots” Similar Apps Network: seven clusters were detected, but only three fit the group of similar apps that might support botting: Instagram Followers Insights and Analytics; Boost Followers or Likes on Instagram; and Get Instagram Followers Turkish Apps. The latter shows the existence of a productive market of application programs in Turkey.

–> A short description of the clusters: on the one hand, three clusters are composed of 225 applications that might support botting on Instagram (under the logic of earning coins in exchange for engagement, and the adoption of pre-defined lists of hashtags). These apps work to boost followers and likes, or to track users’ activities in order to provide analytics and insights, e.g. who viewed your profile or who is not following you back. On the other hand, there are clusters of joke applications that play tricks on people, e.g. simulating a virus attack, as well as apps that fake chats, edit photos or videos, or clean up mobile devices.

–> Apps: Developers and Popularity: an interesting finding is the presence of Turkish developers (seven in total) – these are distinguished by high ratings and numbers of downloads, and they are responsible for the creation of the “get followers” app type.

˚ ˚ ˚ The black market of social media engagement is perceived through the following building blocks: [apps description] [the modes of engagement] [the logic of earning coins] [lists of hashtags] and [automated beings in action]. The general key findings: 1. there is an indication that apps generated to boost Instagram engagement abide by neither Instagram’s Terms of Use nor the Platform Policy (by using automation to act on behalf of the user and by charging for services based on data generated through Instagram APIs); 2. social media engagement is lumped together with automated actors; scripts, bots and automated devices also play a key role in participatory processes; 3. the problem with the logic of earning coins is the illusion of popularity, which also provokes what I call “disposable engagement”, affecting not only the public debate but also the apps ecosystem; 4. “Hashtags are the fuel behind every effective bot, and as a result, they’re the biggest beneficiary of bot activity” (Wilson, 2017, Apr 6); 5. bots are very fast in delivering engagement; bots create unbalanced distributions; and bots are capable of attracting real engagement, but they will certainly draw more attention from other bots.

˚ ˚ ˚ Tracing botted accounts: after analysing 225 possible botted accounts, the key findings were: 1. more than half of the botted accounts that interacted with my Instagram profile were created in 2016 and 2017; 2. botted accounts with private profiles exist; 3. in terms of language, English and Russian stand out, although almost half of the profiles do not indicate their language; 4. there are very few reoccurrences of botted accounts (only nine appeared twice on my Instagram profile and publications); 5. name variation is a peculiar characteristic of botted accounts – dashes or dots are added to a profile name in order to create more fake profiles (e.g. 89nur89, _89nur89_, _89_nur_89_, 89_nur_89).

Key words: Instagram, similar apps network, developers, bot activity, botted accounts, disposable engagement.


Mapping “Instagram Bots” Similar Apps on Google Play Store

A good way to understand botting activity on Instagram is to look at application programs, especially those that can operate on behalf of the user (e.g. by liking, commenting or following other accounts). With this intention, the first step was to query Android apps on the Google Play Store using the term “instagram bots”, resulting in a total of 49 apps. After that, and relying on the apps’ URLs, I extracted the apps’ unique identifiers (e.g., in the URL https://play.google.com/store/apps/details?id=com.instfollowers&hl=pt_PT the unique ID is com.instfollowers) [4]. The list of 49 unique IDs was fed into Google Play Similar Apps; the tool scraped details and ‘Similar’ apps on November 16, 2017. In Gephi, I applied two layout algorithms (ForceAtlas 2 and Fruchterman-Reingold) and the modularity community detection algorithm. The following analysis is a result of this process.
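The ID-extraction step can be sketched with Python’s standard library: the unique identifier is simply the `id` query parameter of each Play Store URL. The function name below is my own; the URL is the example from the text.

```python
# Sketch of the ID-extraction step: the app's unique identifier is the
# value of the 'id' query parameter in its Google Play Store URL.
from urllib.parse import urlparse, parse_qs

def play_store_app_id(url):
    """Return the 'id' query parameter of a Google Play URL, or None."""
    params = parse_qs(urlparse(url).query)
    return params.get("id", [None])[0]

url = "https://play.google.com/store/apps/details?id=com.instfollowers&hl=pt_PT"
print(play_store_app_id(url))  # com.instfollowers
```

Applied over the list of 49 URLs, this yields the list of unique IDs that was fed into the scraping tool.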

[Figure: “Instagram bots” co-related apps network, by rating]

The “Instagram Bots” similar apps network is composed of 453 applications and seven clusters. Among these, I considered the Pink, Green and Light Blue clusters the most important, because they are composed of apps that might support bot activity.

A short description of the CLUSTERS ———————————————————————–>>

PINK —> Instagram Followers Insights and Analytics [105 apps / rating: min 2.7 – max. 4.9 / The majority of apps were published in 2017; only seven apps were launched in 2016, and one in 2015*]. These apps’ main objectives are to find users who are not following you back; to view who unfollows you or track new followers; and to detect who has blocked your profile, fans (people you don’t follow back), or inactive users (in order to unfollow them). Some apps also allow one to unfollow all people who do not follow back, whereas others enlarge the profile picture – which is usually compressed and pixelated. In addition, a long list of applications track and reveal the usernames of those who viewed one’s profile.

*2015: Hacking Tutorials 2.0 /
2016: Spy Gadgets Kit, Password Finder recovery, Unfollower - Follower Cleaner, FollowMeter for Instagram, Qeek - Enlarge Profile Picture, Boomerang van Instagram, and Unfollow All for Instagram

GREEN —> Boost Followers or Likes on Instagram [92 apps / rating: min 3.2 – max 4.8 / All apps were published in 2017, except Hashtags For Likes.co – released on December 18, 2016, and TagsForLikes Pro – launched in August 2013]. The applications in this cluster have a common goal: to boost followers or likes. They work under the logic of earning coins, to be exchanged for likes, followers or comments, and the adoption of pre-defined lists of hashtags to get more likes or followers.

ORANGE—> Special Features for Messages and Video Chats [90 apps / rating: min 2.8 – max 4.8 / The majority of apps were published in 2017, except FunForMobile Ringtones & Chat, WhatsBubbles – Chat Bubbles, Yazzy (Nep Gesprekken), and Yazzy Simulator (Fake chat) – released in December 2016]. This cluster gathers features and special functions for messages and video chats. For messages, a set of apps promote the creation of GIFs, audio clips, emojis and wallpapers for chats. Other features cover mechanisms to fake conversations (e.g. InstaFake), to protect chats from screenshots and web platforms, or to automatically delete messages after they are read. Some of these applications allow users to read messages in incognito mode, or to read messages from different applications and reply to them without quitting the current app. As for video chat apps, the options vary from live video chat to group video talk, from syncing up users’ online activities so that they can watch videos together to a live Santa Claus video call. Apps can also offer live video chat with strangers – e.g. Holla.

LIGHT BLUE—> Get Instagram Followers Turkish Apps [55 apps / rating: min 2.6 – max. 4.9 / The majority of these apps were published in 2017]. This cluster is constituted by Turkish apps focused on “get followers”, for instance: instabayim takipçiler, Süper Takipçi Kazan, Takipçi ve Takip Etmeyenler – İYİTAKİP, and Begeni Takipçi Kazanma Analizi.

MOSS GREEN—> Photo Editing Apps [50 apps / rating: min 3.4 – max. 4.7 / All apps were published in 2017, except LG Pocket Photo in 2016]. This is a photo editing cluster, which is no surprise at all, since Instagram is a visual content platform. Some app examples within this cluster: Baby Costumes Photo Editor; InstaSize-Photo Editor; Funny Photo Editor Video; Men Fashion Photo Editor; Super Hero Photo Suit; Monster Land – Zombie Video, GIF, Photo Editor; Photo Editor Collage Maker; VSCO; Bikni Girls photo editor; Perfect Me – Body Shape Editor.

BLUE—> Pranks [31 apps / rating: min. 4.1 – max 4.6 / All apps were created in 2017]. All sorts of prank apps can be found within this cluster, for instance perfect followers prank, prank calls, virus prank, super anti mosquito prank, anti mosquito prank, blood pressure and sugar test prank, broken screen prank, cockroach run on screen prank, cat in phone prank, and ghost prank. [I found it very entertaining to discover such apps]

RED —> Optimizer apps: boost and clean up mobile devices [30 apps / rating: min. 2.9 – max. 4.8 / The overwhelming majority of apps were published in 2017]. This cluster is mainly about apps that boost and clean up mobile devices by offering services such as cleaning memory and junk files, or extending battery life.

Apps: Developers and Popularity ——————————————————————————->>

The short description of the clusters leads us to raise questions about app developers and popularity: who are the developers that support botting activity on Instagram? How popular are these applications in terms of rating and number of downloads? If we take into consideration the whole network, BlackBerry Limited, Viber Media, Skype, imo.im, Tango, Lyrebird Studio, Instagram, and Kik Interactive are the main developers, according to rating count and number of downloads. However, I am not particularly interested in these actors, as they represent free calls, messages or video chats and photo-editing apps. Since the focus is on tracing the developers who might support botting activity on Instagram, I decided to look carefully at those parts (instead of the whole). In this way, I paid special attention to the PINK —> Instagram Followers Insights and Analytics, GREEN —> Boost Followers or Likes on Instagram, and LIGHT BLUE—> Get Instagram Followers Turkish Apps clusters.

[Figure: treemap of “instabots” developers by cluster – rating count (size) and number of downloads (colour)]

The treemap (above) gathers the main developers who might support bot activity on Instagram. Here you can see a subdivision of developers, classified by apps rating count (size) and the number of downloads (colour). After verifying the types of applications created by each developer, three general types of apps were detected: i) get followers; ii) track secret admirers or what your followers are doing and seeing; and, iii) use hashtags to increase likes and followers. Not a surprising result.

Here, the notion of popularity can be measured by the apps’ rating count and number of downloads. That said, in terms of rating count (size), the most popular developers (BeakerApps, Prilaga.com, Thundred, and Takipci Uygulamasi) build applications that promise more followers or suggest lists of tags to increase engagement on Instagram. In terms of number of downloads (colour), the apps created by BeakerApps, Bendak LLC, Yapp!, and Social Edge also mirror features like enlarging the Instagram profile picture, showing who viewed your Instagram profile, and getting more likes.

In the context of high ratings combined with high numbers of downloads, an interesting finding is the presence of seven Turkish developers – Takipci Uygulamasi, insMobil – Takipasi Kazan, Karael Media, Lafyu Inc., Baha Gökce, Sosyal Medya Marketi, and SMARK – all represented by “get followers” applications. Altogether they have gathered at least 770,000 downloads. In the apps’ descriptions, you can find phrases such as “here is the right place to be popular”; “with an average of 500 credits per day, you can win both followers and their admiration”; and even “you can buy followers”. The latter is posed by the fictional character Cafer Abi, an imaginary hero created to help people in the world of social media.

I cannot affirm that these developers actually support botting activity or use Instagram data as currency to sell (fake) engagement; further investigation is needed. However, considering the general mechanisms of these applications, these developers are very much prone to fraud.

[Figure: “instabots” app categories by creation time and number of downloads]

Let’s leave this particular perspective behind and move to a broader scope. Above you can see the categories of the “Instagram Bots” similar apps network, according to creation time and number of downloads. The first thing that stands out is that most applications were created after October 2016. We all know that Instagram was launched in 2010 and acquired by Facebook in 2012. This rapid growth in app usage culture is a reflection of the platform society, in which apps and web platforms are becoming part of everyone’s daily routine: “a society in which social, economic and interpersonal traffic is largely channelled by an (overwhelmingly corporate) global online platform ecosystem that is driven by algorithms and fuelled by data” (van Dijck, 2016).

A second aspect is that the Communication, Entertainment, Photography, Social, and Tools categories accumulate the most downloads. The Entertainment and Photography categories can be easily associated with the Pranks and Photo Editing Apps clusters, while Social and Tools relate to the Pink, Green and Light Blue clusters (composed of apps practically exclusive to the Instagram platform).


The black market of social media engagement 

Based on the experiments and observations that compose this post, let me introduce you to what I understand as being the building blocks for the black market of social media engagement, drawing the activity of botting on Instagram as a starting point.

[apps description] There is a long list of affordances in the apps’ descriptions: from finding users who are not following you back to fan detection; from boosting engagement to easily accessing pre-defined lists of hashtags, and so on. The logic is to control and monitor your ‘followers’, ‘fans’ and bot activities, and in parallel to enable practices and mechanisms that improve performance and engagement. But the problem here is that these app descriptions typically do not indicate what they really do: acting on behalf of the user and charging for services based on data generated through Instagram APIs. See the example below [5].

[TagsPotion’s description versus its real functioning]

It is no secret that apps charge users small and large sums of money in exchange for a growing number of likes and followers. In this respect, can we assume that the applications previously detected are selling Instagram data? The fact is I could not verify them one by one, but I can speak of those I have tested, which are apps that abide by neither Instagram’s Terms of Use (“You must not create accounts with the Service through unauthorized means, including but not limited to, by using an automated device, script, bot, spider, crawler or scraper”) nor the Platform Policy (“Don’t sell, lease, or sublicense the Instagram APIs or any data derived through the APIs”). In this regard, complete information will probably never be displayed in the apps’ descriptions, at least not for those that cheat the system.

[the modes of engagement] We usually assume social media engagement to be the lynchpin of participatory democracy (or participatory processes), and we value “the act of engaging or being engaged”* as a representation of human activities that can be counted or measured. But the fact is that social media engagement is lumped together with automated actors: scripts, bots and automated devices also play a key role in participatory processes. This means that, when addressing engagement research, we must be aware that both people and automated beings are part of the whole experience; both “participate or become involved in”, both “establish a meaningful contact or connection with”**. That said, and with the intention of understanding the modes of engagement on Instagram from the perspective of botting, I tested some applications available in the App Store (see ‘automated beings in action’).

[the logic of earning coins] The currency of engagement on Instagram is usually called «coins», «stars» or «credits»: one can exchange coins for likes, comments, followers, views and even for Instagram accounts (with tons of followers). In practice, coins are exchanged for the engagement of both automated and human beings. My hypothesis is that automated beings created for political purposes are growing faster than we might expect. How can users get coins? By liking the photos of other users, by rating the app with five stars or sharing the app across platforms, by watching ad videos, by buying boosted accounts (probably ghost ones), by enabling automated likes (allowing the application to make loads of likes on behalf of the user), or by actually purchasing coins, which means paying with real money.
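The earn-and-spend mechanics described above amount to simple bookkeeping, which can be sketched as follows. The earn and spend rates below are illustrative assumptions for the sake of the example, not figures taken from any specific app.

```python
# A minimal sketch of the coin economy: users earn coins through actions
# (likes, ratings, ads) and spend them on engagement. Rates are invented.
EARN = {"like_a_photo": 1, "rate_app_5_stars": 20, "watch_ad": 5}
SPEND = {"like": 2, "follower": 10, "comment": 15}

class CoinWallet:
    def __init__(self, coins=0):
        self.coins = coins

    def earn(self, action, times=1):
        self.coins += EARN[action] * times

    def buy(self, item, quantity):
        cost = SPEND[item] * quantity
        if cost > self.coins:
            raise ValueError("not enough coins")
        self.coins -= cost
        return quantity

wallet = CoinWallet()
wallet.earn("like_a_photo", times=300)  # e.g. one auto-like session
wallet.buy("follower", 25)              # exchange coins for followers
print(wallet.coins)                     # 300 - 250 = 50
```

Note how quickly one auto-like session funds a batch of purchased followers: this asymmetry is what keeps the vicious circle described below turning.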

How to get coins? Apps: Get Followers & Likes Report for Instagram (left side) and Tags Potion (right side) – App Store, Apple. Screenshots: November 2017.

The easiest way to earn coins is to enable auto like (simple and super fast!): one can make around 300 likes in up to 3 minutes, the limit imposed by the platform – bots do not cross Instagram’s spam thresholds. “You have reached the hourly following or like limit. Please wait a few minutes to continue”: this is the message I received from the Likes Report app when reaching my hourly limit.

You boost whatever you wish, but you also have your account automatically following unknown people or liking hundreds of images that might not be part of your personal taste. In addition, a significant number of these actions are programmed to dislike and to unfollow, making part of what you get disappear. Another issue is the promotion of applications in exchange for coins, e.g. giving a five-star rating on Google Play or the App Store, or sharing the app across different platforms. This starts a vicious circle: first the app promotion, then the like reward, then one more lap, again and again.

Shopping automated engagement and boosted accounts on Instagram. App: Get Followers & Likes Report for Instagram – App Store, Apple. Screenshots: November 2017.

Everything works through similar functions and options; it is a sort of game in which it does not matter whether what you get is real or fake. The point is to foster more likes, more followers, more views. In this game, you cannot control your own activities on the Instagram platform, but you can monitor those connected with your account, and you can order fake interactions in the hope of becoming popular and, hopefully, reaching real people.

The problem with the logic of earning coins is the illusion of popularity; it also provokes what I call disposable engagement. On one side, the rapid growth of engagement on a particular Instagram account attracts public attention; on the other, paying money to boost engagement can have a serious impact on society, for instance on voting in the next Brazilian presidential elections. Besides, the practice of generating disposable engagement affects not only public debate and civic engagement but also the apps ecosystem.

[lists of hashtags] Applications not only suggest but also organise lists of “good” hashtags according to specific categories, as a promise of more likes and followers (see below). “Hashtags are the fuel behind every effective bot, and as a result, they’re the biggest beneficiary of bot activity. Even if you aren’t using bots, hashtagging your work will get you more passive engagement as bots around the world target your content” (Calder Wilson, 2017, Apr 6). But the adoption of hashtags is not exclusive to bot action, because hashtags are also a representation of human claims, ideals, opinions, and so on. Under this logic, we need to broaden our perspective when doing digital research in practice. Better said, we need to learn how to read these practices of hashtag adoption.

[The fuel of effective botting activity: lists of hashtags]

[automated beings in action] I tested some applications available in the App Store (Tags Potion, Profile Report, Get Followers & Likes Report for Instagram – Likes Report, and Followers Pro For Instagram), exploring all the possibilities these apps offer to buy engagement (both free and paid). I can either say that I boosted my Instagram account or that I bought fake followers, likes and comments, right? (pick whichever option you like!) After exchanging my coins for the action of automated beings, the results were immediately visible: bots are very fast in delivering engagement. In a matter of minutes (to be precise: two), I had more likes than I had requested – which was a total of 40 (see below). However, the precision of delivering likes and followers is not comparable to comments. See below some bots commenting, and making a comment on my behalf: “Love this look 👑🌸👑”. It seems bots are not as clever as expected (see more examples of bots commenting here: post 1, post 2, post 3).

The immediate results of Insta Bots – exchanging coins for likes, followers and comments.

Since I gave TagsPotion permission to act on my behalf, I had to monitor my own activities in order to see who “I” had started to follow or what sort of comments “I” had made. After buying 30 followers, TagsPotion started following 51 Instagram profiles on my behalf, and I gained 24 followers – fewer than I actually ordered. The very next day, the total number of followers displayed on my account had decreased, which was unlikely given the number of accounts I was following (see above). In this perspective, bots create an unbalanced distribution: a substantial number of accounts followed against a not-so-substantial number of new followers, plus the loss of likes from actions programmed to first like, then dislike. In addition, bots can attract “real” engagement: people who will be tempted to return the favour and like-comment-follow you back. However, bots will certainly draw more attention from other bots.

[Automated beings are capable of liking 8 posts in only two seconds]

Another very specific characteristic of automated beings on Instagram is their speed of action: eight posts can be liked in the twinkle of an eye (see above). One may try to do the same, but such speed can only be achieved by bots.

So far, and not excluding other possibilities, I understand that these building blocks are constitutive parts of the black market of Instagram engagement. What follows is an attempt to address the problem of detecting botted accounts.


Tracing botted accounts

Botting activity not only depends on applications; it also demands an army of fake profiles. These are usually new on Instagram but with outrageous numbers of following or followers, all with very few publications. Despite general assumptions, I would say that the first step to detect botted accounts is to allow your own account to be botted (or just create a new one). There are no established methods to detect bots on Instagram; however, some particular characteristics should be taken into account. For instance, verifying account names (e.g. flyhighandlookdown, xo_kasssi_xo, bkusklubnikaa, dlod5555) and their creation time, and crossing this information with the user profile information (e.g. private or public profile; number of posts, followers, following; language). In addition, comparing the number of publications with the total of followers or following, and checking the followers of the bot profiles (because bots like and follow other bots).
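These characteristics can be combined into a rough screening heuristic. The sketch below is my own illustration of that idea: the thresholds are assumptions for demonstration, not validated cutoffs, and the sample profile is invented.

```python
# A hedged sketch of the screening heuristics described above; thresholds
# are illustrative assumptions, not validated cutoffs.
def bot_signals(profile):
    """profile: dict with 'username', 'posts', 'followers',
    'following', and 'created_year'. Returns triggered signals."""
    signals = []
    # very few publications against heavy follow counts
    if profile["posts"] < 5 and max(profile["followers"],
                                    profile["following"]) > 500:
        signals.append("few posts, heavy follow counts")
    # most suspect accounts in this sample were created in 2016-2017
    if profile["created_year"] >= 2016:
        signals.append("recently created account")
    # separator-padded usernames (e.g. _89nur89_)
    if profile["username"].strip("._") != profile["username"]:
        signals.append("separator-padded username")
    return signals

suspect = {"username": "_89nur89_", "posts": 2, "followers": 40,
           "following": 900, "created_year": 2017}
print(bot_signals(suspect))  # all three signals fire
```

No single signal is conclusive on its own; the point is that a profile triggering several of them at once is worth a manual check.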

Another very good strategy to detect botted accounts on Instagram is to trace their network, for instance by using Instagram Network, a DMI tool that allows researchers to get the follow or follower network of a set of Instagram users. However, as a result of social media APIs limiting access, this tool is no longer an option for scholarly research. The unavailability of data points limits perceptions and analysis of particular grammars; in this case, the follow and follower network of botted accounts.

I opted for tracking the possible botted accounts that liked, commented on or started following my Instagram profile. 225 possible botted accounts were verified between December 14th and 16th, 2017. To build this dataset, I followed three steps: i) gathering screenshots of bot accounts liking my posts or following my profile, plus all accounts that commented on my posts; ii) individually checking these Instagram profiles to collect basic info (type of account, i.e. public or private; number of posts, followers, following; and language); and iii) in parallel, verifying the creation time of the accounts with an Instagram account age checker.

Starting with the verification of account age, profile mode, and language (see below): more than half of the botted accounts that interacted with my Instagram profile were created in 2016 and 2017. To my surprise, private profiles were also identified as possible botted accounts (38 in total), whereas some profiles no longer existed. Speaking of language, English and Russian stand out, followed by Portuguese, French and Italian, although almost half of the profiles do not indicate their language.


Below you see a quantitative distribution of the number of following (on the left) and posts (on the right) according to account type (see the followers distribution here). Note the lower and median quartiles on the right, and the upper quartile on the left. Very few posts in contrast with a high number of followers is a visual and statistical indicator of Instagram botted accounts. An interesting characteristic was perceived concerning the reoccurrence of the same botted account: in a universe of 225 possible botted accounts, only nine*** appeared twice on my Instagram profile and publications (either by liking and commenting or by following and liking, for instance).

[Tracing botted accounts on Instagram 2017: a quantitative distribution of the number of following (left) and posts (right) according to account type]
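The quartile comparison behind that figure can be reproduced numerically. The sample values below are invented for illustration, not my actual dataset; `statistics.quantiles` returns the three quartile cut points.

```python
# Illustrative quartile check of the posts-vs-following imbalance noted
# above; the sample values are invented, not the actual dataset.
from statistics import quantiles

following = [900, 1200, 750, 2000, 640, 1500, 980, 1100]
posts = [0, 2, 1, 3, 0, 5, 1, 2]

q_following = quantiles(following, n=4)  # [Q1, median, Q3]
q_posts = quantiles(posts, n=4)

# in a bot-like sample, even the upper quartile of posts sits far
# below the lower quartile of following
print(q_posts[2] < q_following[0])  # True
```

For a human-looking sample, the two distributions overlap; the wide gap between them is the statistical signature described in the text.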

Another peculiar characteristic of botted accounts is name variation (see below). Dashes or dots are added to a profile name in order to create more fake profiles. For example, the Instagram account 89nur89 was transformed into three other accounts: _89nur89_, _89_nur_89_, and 89_nur_89. I also observed the prevalence of female profiles over male ones, and that botted accounts have very few comments on their posts.

[bots name variations]
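Such variants can be grouped by stripping the separator characters back to a canonical base. This grouping rule is an assumption drawn from the examples above, sketched here in Python:

```python
# Sketch: collapsing dot/underscore name variants (as in the examples
# above) to a canonical base so likely clones can be grouped together.
from collections import defaultdict

def canonical(username):
    """Strip separators and lowercase, e.g. '_89_nur_89_' -> '89nur89'."""
    return username.replace("_", "").replace(".", "").lower()

def group_variants(usernames):
    groups = defaultdict(list)
    for name in usernames:
        groups[canonical(name)].append(name)
    return dict(groups)

names = ["89nur89", "_89nur89_", "_89_nur_89_", "89_nur_89"]
print(group_variants(names))  # all four map to the single base '89nur89'
```

Run over a list of observed accounts, clusters of size greater than one flag probable clone families of the same fake profile.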

See below some examples of possible private botted accounts and the screenshots of the typical profile of botted accounts.

[possible private botted accounts]


[The typical profile of botted accounts]


More questions and food for thought

To sum up, bot activity on Instagram basically depends on the existence of fake or real profiles and lists of hashtags, combined with the agency of third-party applications. And even though Instagram’s Terms of Use explicitly forbid botting, in practice the platform’s written terms do not prevent bots’ existence and action [6]. It is true that instabots may not be ‘smart enough’ to trick us when it comes to comments, but they are masters at amplifying and spreading engagement (e.g. in political debate, by inflating Bolsonaro’s far-right values and ideals).

Likewise, bots can jeopardise digital research (e.g. social media network analysis) through the overwhelming adoption of very specific tags. However, we can only take the action of insta bots as a considerable threat to digital research if we overlook the technicity of social media platforms. Better said, when we exclude the mode of existence of Instagram itself, we opt for a weak, fragile form of doing digital research; but when we (at least) consider the way of being of Instagram – acknowledging the limitations and affordances of the platform – we can be aware of the agency and influence of botting activity.

In this experiment, I found 255 apps that might support botting on Instagram, and 225 possible botted accounts. How large can this app ecosystem be? How many fake profiles can be found on Instagram today? How many fake accounts will be created during the Brazilian presidential race in 2018? Another question: in the same way my account was botted in order to boost likes on my own posts or get more followers on my own profile, can fake accounts be directed to engage with pre-defined posts or to follow specific accounts?

…………………………………….˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚…………………………………….˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚

Social media platforms were once assumed to be new sources of power and social mobilisation due to their capacity to rapidly and effectively gather, disseminate, and denounce particular or generic issues (Tufekci, 2014; Omena and Rosa, 2015). However, The Economist asserts that, after the 2016 US presidential elections, we can no longer expect social media to be a mirror of society and a tool for more enlightened politics; “far from bringing enlightenment, social media have been spreading poison”. Have they?

“It’s scary to think that organic, authentic voices in the public debate – more than 99% of which are in favor of keeping net neutrality – are being drowned out by a chorus of spambots” [Jeff Kao, 23 Nov 2017]

“When likes and comments are so easily acquired with a bot, the worth behind these actions is diminished”. [Calder Wilson, 6 April 2017]

If social media engagement is incredibly inflated, what is the value of such engagement? Should we acknowledge or ignore those automated actions? How can we value disposable engagement? Do we need to tell which fraction of engagement is generated by organic or automated actions?

There is an urgent need to separate organic engagement from that generated by bots or sponsored content. However, I argue that we should take into account the whole context, since we are exposed to both human and non-human actors via web platforms. Additionally, social media engagement research is generally perceived through a dual logic: “the sums of actions media items receive, and the recurrent use of natively digital objects or grammars of action from many people about a topic” (Omena et al., 2017). Again, under this rationale, human and automated voices converge.

On Instagram the logic is simple: you can buy likes generated by fake accounts; these profiles interact with your posts or even become your followers. It is a fast and easy way of buying engagement. But it is all an illusion: an illusion of popularity that actually intervenes in the functioning of the whole platform, in what we see or come to think. This is the black market of social media engagement.

˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ Notes ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚

* The definition of engagement according to the Oxford Dictionaries.

** The definition of engage according to the Oxford Dictionaries.

*** maddijaye, iin.sane, marianavgaspar, thereal_titi17, thediamondsbank, jordan_sicaire, dianambariniratupematu, callmeakil, andreaazekovic

[1] List of tags for the Workers’ Protests (15 March 2017): general strike (#grevegeral), get out temer (#foratemer), direct elections right now (#diretasjá), and I want to retire (#queromeaposentar). Lists of tags for the Conservative Protests (26 March 2017): come to the street (#vemprarua), operation car wash (#lavajato), end to legislative immunity (#fimdoforoprivilegiado), Lula in jail (#lulanacadeia), and Free Brazil Movement (#mbl)

[2] Read the post Instagram Data Analysis.

[3] See case study “2017 protests in Brazil” (slide 26) in why look at social media apis?

[4] The methodological protocol adopted to explore “Instagram Bots” Similar Apps Network was inspired by the work of Carolin Gerlitz, Fernando N. van der Vlist, Anne Helmond, and Esther Weltevrede: “App support ecologies: An empirical investigation of app-platform relations” [Conference: Infrastructures of Publics – Publics of Infrastructures, First Annual Conference 2016 of the DFG Collaborative Research Centre 1187 ‘Media of Cooperation’, DOI: 10.13140/RG.2.2.11533.74723].

[5] TagsPotion is an app available in the App Store, not in the Google Play Store. The exploratory study conducted to understand how botted accounts work also used apps from the App Store. The reason is simple: I have an iPhone.

[6] We know that in 2014 Instagram deleted 18 million fake accounts: not only did celebrities lose thousands, and occasionally even millions, of fake followers (producers of disposable engagement), but Instagram itself lost a great number of users (the web developer Zach Allia produced a visualization showing the number of followers Instagram’s top 100 accounts lost in a single day). In the same year, the botmaker Darius Kazemi told The Boston Globe that it is actually “very simple” to build a bot farm. The fact is, the black market of social media engagement goes beyond the ecosystem of applications and the creation of fake accounts ready to be botted. For instance, I did not go further with Instagram bot farms (some homework to be done), nor did I look carefully at websites such as Rantic, Social Media Daily or Buzzoid, which promise legit Instagram followers, likes, comments or video views at special prices.

˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚  ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚  References  ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚  ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚ ˚

Omena, J. J. and Rosa, J. M. (2015). 15 de Março: “o Brasil foi pra rua” (de novo!). Estudos dos protestos na web e redes sociais. In Carlos Camponez, Bruno Araújo et al. (editors), Comunicação e Transformações Sociais, Vol II: Comunicação Política, Comunicação Organizacional e Institucional e Cultura Visual (Atas do IX Congresso da SopCom), pp. 51-74, Coimbra: SopCom.

Omena et al. (2017). Visualising Hashtag Engagement: Imagery of Political Polarization on Instagram. Digital Methods Summer School 2017, Get the Picture. Digital Methods for Visual Research, 26 June – 7 July, University of Amsterdam.
Tufekci, Z. (2014). The Medium and the Movement: Digital Tools, Social Movement Politics, and the End of the Free Rider Problem. Policy & Internet, 6 (2): 202-208. doi:10.1002/1944-2866.POI362
van Dijck, J. 2016. “#AoIR2016: Opening Keynote ‘The Platform Society’ by José van Dijck.” Alexander von Humboldt Institut für Internet und Gesellschaft. YouTube. Nov. 2, 2016. https://www.youtube.com/watch?v=-ypiiSQTNqo.