Surveillance Capitalism


Surveillance capitalism is a term that may be unfamiliar to many. I interpret the phrase as a way to describe the business model of most social media sites. These sites are free in the sense that users do not pay a subscription fee. Instead, the sites cover the costs of infrastructure and employee salaries, and generate a profit, by collecting data about those who use the sites and then either targeting users with personalized ads assumed to be attractive to them or selling these data to other businesses. Social media sites explain their business models in agreements they ask users to approve before using the sites, but the length of these agreements limits the proportion of users who actually review the terms they are asked to accept, and the vagueness of the language as it relates to how users see themselves using the sites limits user understanding of how their data are being used. Recent events, such as discoveries related to the Presidential election of 2016, have brought much greater scrutiny to the business model of many "free" online services. The phrase "surveillance capitalism" has been advanced to encapsulate the true nature of this business model (McNamee, 2019; Zuboff, 2019).

Much of the negative connotation associated with surveillance capitalism is based on the techniques some online companies use to collect user information, both to improve the appeal of their ads and to increase the value of the personal information these companies can sell as a resource. These strategies involve manipulation through operant conditioning, prioritizing emotional content out of proportion to content taking a more carefully reasoned and factually based approach, and manipulating content flow in ways that change behavior. As this material is being developed mostly to inform educators, it can also be argued that these manipulation strategies are particularly effective in influencing adolescents. It is not clear that the application of these strategies stemmed from malicious intent; the experimental comparison of different strategies, often referred to as A/B testing, that online companies so commonly use to evaluate approaches often has generated a move toward these approaches. However, continued use of these financially "productive" approaches by companies who now certainly know the consequences can still be questioned (McNamee, 2019; Zuboff, 2019).
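To make the A/B testing logic concrete, consider a minimal sketch in Python. Everything here is hypothetical: the two variants, the numbers, and the ab_test helper are invented for illustration. The point is that the standard comparison, a two-proportion z-test on an engagement metric, measures only which variant holds attention better, not whether holding attention is good for users.

import math

def ab_test(conversions_a, n_a, conversions_b, n_b):
    # Two-proportion z-test: does variant B produce more engagement than A?
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return p_a, p_b, (p_b - p_a) / se

# Hypothetical experiment: variant B gives emotional posts more prominence.
p_a, p_b, z = ab_test(conversions_a=480, n_a=10_000,
                      conversions_b=560, n_b=10_000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
# A z-score above roughly 1.96 means B's higher engagement is unlikely to
# be chance, so B "wins" -- nothing in this metric asks whether B is good
# for users.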

The value of personal data increases as more descriptive characteristics can be combined. Think of it this way. Any information you have about someone else provides you some advantage in future interactions. However, if you had the opportunity to discreetly observe the daily behavior of others, you would have a much better feel for how to interact with and perhaps manipulate them. A complete picture is far more helpful than an isolated fact or two. The more activity, and the more different kinds of activity, we engage in online, the greater potential we offer others who have access to these data. The quest ends up being for the most complete collection of personal characteristics that can be assembled.
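A small simulation can show why combined characteristics are worth more than isolated ones. The population, the attributes, and the numbers below are all invented; the pattern, each added trait shrinking the crowd a person can hide in, is the point.

import random

random.seed(3)

# A toy population of 10,000 people; each attribute alone is unremarkable.
population = [
    {"age": random.randint(18, 70),
     "zip": random.choice(["58201", "58202", "58203"]),
     "likes_hiking": random.random() < 0.3,
     "night_owl": random.random() < 0.4}
    for _ in range(10_000)
]

def candidates(known):
    # How many people remain consistent with everything known so far?
    return sum(all(p[k] == v for k, v in known.items()) for p in population)

known = {}
for trait, value in [("zip", "58202"), ("age", 34),
                     ("likes_hiking", True), ("night_owl", True)]:
    known[trait] = value
    print(f"{len(known)} traits known -> {candidates(known)} matches")

Run it and the candidate pool collapses from thousands to a handful in four steps, which is the sense in which a complete profile is worth far more than any single fact.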

To collect as much and as varied data as possible, sites seek to encourage heavy and exclusive use. Put another way, to generate a business advantage, sites are motivated to use strategies that generate heavy use. The strategies used to generate heavy use may not be in the best interest of users. Concerns with the amount of time any of us spends online (e.g., screen time) obviously conflict with the collective goals of online companies. While providing a service users will find valuable is an obvious way to encourage heavy and exclusive use, other techniques are also frequently employed.

For example, behavioral principles using what behaviorists would describe as positive reinforcement applied on an intermittent (unpredictable) schedule are a great way to increase the frequency of a behavior. Those who design slot machines understand these principles. Those who design social media sites apply these principles whether intentionally or not.
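A toy simulation, emphatically not any platform's actual code, suggests why unpredictable rewards sustain checking behavior better than predictable ones. The payoff rate, the number of checks, and the "persist through a familiar dry spell" heuristic are all assumptions made for illustration.

import random

random.seed(42)

def longest_dry_spell(schedule):
    # Longest run of consecutive unrewarded checks a user sat through.
    gap = worst = 0
    for rewarded in schedule:
        gap = 0 if rewarded else gap + 1
        worst = max(worst, gap)
    return worst

N, p = 200, 0.2  # 200 feed checks, a payoff (like, reply) 1 time in 5
fixed = [i % 5 == 0 for i in range(1, N + 1)]       # every 5th check pays
variable = [random.random() < p for _ in range(N)]  # same rate, unpredictable

# A user plausibly keeps checking through a dry spell about as long as the
# worst one already experienced; variable schedules make that spell long.
print("fixed schedule, longest dry spell:   ", longest_dry_spell(fixed))
print("variable schedule, longest dry spell:", longest_dry_spell(variable))

Both schedules pay off at the same average rate, but the variable one produces much longer dry spells the user has learned to wait out, which is the slot-machine logic applied to a feed.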

As a second example, stimuli that encourage an emotional reaction are quite attractive. Content that offers these stimuli, whether prioritized by online services or highlighted by any of us as content creators, or at least as sharers of such content, attracts viewers and increases the collection of information about those viewers that is a byproduct of their online attention.

Hearing what you want to hear and seeing what you want to see

To the extent that an online service delivers what you want it to deliver, you are likely to continue and perhaps even increase use of that service. However, those who endorse the concept of surveillance capitalism would argue that under some circumstances giving users what they want can be an inappropriate strategy for increasing use. Hence, there is a possible danger in a service that becomes too personal. We all have personal biases, which means we are not completely neutral in the information we want. A phenomenon termed confirmation bias implies that we interpret the information we receive and the events we encounter based on our own personal values and beliefs. In our second chapter, we described how the cognitive system uses existing knowledge to attend to and to interpret new experiences. This is how we use existing knowledge to help understand new experiences, but it is also how we can ignore or misinterpret new information. I will explore this issue further in my discussion of psychological processes.

Online sites can feed these personal biases in two different ways. Some social media sites allow users to prioritize the content offered by certain individuals. A term such as "friending" implies that we have some control over whom we interact with using these services. If we place too much emphasis on friending individuals with similar attitudes and beliefs, our own belief systems are likely to be supported rather than challenged. Systems that learn about us and use this information to prioritize certain information over other information can also feed existing biases. Pariser (2011) used the phrase "filter bubble" to refer to systems that prioritize online experiences based on what those systems have learned about us. If we assume we are experiencing a neutral or balanced assessment of the world from these systems, we have allowed ourselves to be misled. If we assume others are experiencing the online world in the way we experience it, we are also misinformed. It should be obvious that the educational benefits of such personally biased inputs are not what educators would prefer.
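A deliberately simplified sketch can show the feedback loop behind Pariser's filter bubble. The topics, the click rate, and the weight update below are all invented; the mechanism, clicks teaching the system to show more of what was already clicked, is what matters.

import random

random.seed(7)

TOPICS = ["politics-left", "politics-right", "sports", "science", "gossip"]
weights = {t: 1.0 for t in TOPICS}  # what the system believes the user likes

def todays_feed(k=5):
    # Sample k items, favoring topics with higher learned weights.
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS], k=k)

# Assume a user who clicks "politics-left" 60% of the time it appears.
for day in range(30):
    for item in todays_feed():
        if item == "politics-left" and random.random() < 0.6:
            weights[item] += 1.0  # each click sharpens the filter

total = sum(weights.values())
print({t: round(w / total, 2) for t, w in weights.items()})
# After a month, one topic dominates the feed -- not because the system is
# malicious, but because it optimizes for what the user already clicked.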

It can be interesting to explore the filter bubble for yourself. A site developed by Duck Duck Go (a search engine that claims to be neutral) explains how bias due to the filter bubble can be investigated (Duck Duck Go).

The next section identifies and explains some of the strategies that encourage greater use of online services (and the related sharing of personal information) and considers the consequences of these strategies for individuals and for broader goals such as mental health and factual understanding. You likely encountered many of these principles in a course such as Introduction to Psychology, so the idea here is to identify them as they apply in the online world.

Contributing Factors


 