The Social Power of Algorithms
David Beer

156 pages
English
ePUB (mobile friendly)

About This Book

The vast circulations of mobile devices, sensors and data mean that the social world is now defined by a complex interweaving of human and machine agency. Key to this is the growing power of algorithms – the decision-making parts of code – in our software-dense and data-rich environments. Algorithms can shape how we are treated, what we know, who we connect with and what we encounter, and they present us with some important questions about how society operates and how we understand it.

This book offers a series of concepts, approaches and ideas for understanding the relations between algorithms and power. Each chapter provides a unique perspective on the integration of algorithms into the social world. As such, this book directly tackles some of the most important questions facing the social sciences today. This book was originally published as a special issue of Information, Communication & Society.


Information

Publisher
Routledge
Year
2019
ISBN
9781351200653

Thinking critically about and researching algorithms

Rob Kitchin

ABSTRACT
More and more aspects of our everyday lives are being mediated, augmented, produced and regulated by software-enabled technologies. Software is fundamentally composed of algorithms: sets of defined steps structured to process instructions/data to produce an output. This paper synthesises and extends emerging critical thinking about algorithms and considers how best to research them in practice. Four main arguments are developed. First, there is a pressing need to focus critical and empirical attention on algorithms and the work that they do given their increasing importance in shaping social and economic life. Second, algorithms can be conceived in a number of ways – technically, computationally, mathematically, politically, culturally, economically, contextually, materially, philosophically, ethically – but are best understood as being contingent, ontogenetic and performative in nature, and embedded in wider socio-technical assemblages. Third, there are three main challenges that hinder research about algorithms (gaining access to their formulation; they are heterogeneous and embedded in wider systems; their work unfolds contextually and contingently), which require practical and epistemological attention. Fourth, the constitution and work of algorithms can be empirically studied in a number of ways, each of which has strengths and weaknesses that need to be systematically evaluated. Six methodological approaches designed to produce insights into the nature and work of algorithms are critically appraised. It is contended that these methods are best used in combination in order to help overcome epistemological and practical challenges.

Introduction: why study algorithms?

The era of ubiquitous computing and big data is now firmly established, with more and more aspects of our everyday lives – play, consumption, work, travel, communication, domestic tasks, security, etc. – being mediated, augmented, produced and regulated by digital devices and networked systems powered by software (Greenfield, 2006; Kitchin & Dodge, 2011; Manovich, 2013; Steiner, 2012). Software is fundamentally composed of algorithms – sets of defined steps structured to process instructions/data to produce an output – with all digital technologies thus constituting ‘algorithm machines’ (Gillespie, 2014a). These ‘algorithm machines’ enable extensive and complex tasks to be tackled that would be all but impossible by hand or analogue machines. They can perform millions of operations per second; minimise human error and bias in how a task is performed; and can significantly reduce costs and increase turnover and profit through automation and creating new services/products (Kitchin & Dodge, 2011). As such, dozens of key sets of algorithms are shaping everyday practices and tasks, including those that perform search, secure encrypted exchange, recommendation, pattern recognition, data compression, auto-correction, routing, predicting, profiling, simulation and optimisation (MacCormick, 2013).
As Diakopoulos (2013, p. 2) argues: ‘We’re living in a world now where algorithms adjudicate more and more consequential decisions in our lives. … Algorithms, driven by vast troves of data, are the new power brokers in society.’ Steiner (2012, p. 214) thus contends:
algorithms already have control of your money market funds, your stocks, and your retirement accounts. They’ll soon decide who you talk to on phone calls; they will control the music that reaches your radio; they will decide your chances of getting a lifesaving organ transplant; and for millions of people, algorithms will make perhaps the largest decision of their life: choosing a spouse.
Similarly, Lenglet (2011), MacKenzie (2014), Arnoldi (2016) and Pasquale (2015) document how algorithms have deeply and pervasively restructured how all aspects of the finance sector operate, from how funds are traded to how credit agencies assess risk and sort customers. Amoore (2006, 2009) details how algorithms are used to assess security risks in the ‘war on terror’ through the profiling of passengers and citizens. With respect to the creation of Wikipedia, Geiger (2014, p. 345) notes how algorithms ‘help create new articles, edit existing articles, enforce rules and standards, patrol for spam and vandalism, and generally work to support encyclopaedic or administrative work.’ Likewise, Anderson (2011) details how algorithms are playing an increasingly important role in producing content and mediating the relationships between journalists, audiences, newsrooms and media products.
In whatever domain algorithms are deployed they appear to be having a disruptive and transformative effect, both on how that domain is organised and operates, and on the labour market associated with it. Steiner (2012) provides numerous examples of how algorithms and computation have led to widespread job losses in some industries through automation. He concludes:
programmers now scout new industries for soft spots where algorithms might render old paradigms extinct, and in the process make mountains of money … Determining the next field to be invaded by bots [automated algorithms] is the sum of two simple functions: the potential to disrupt plus the reward for disruption. (Steiner, 2012, pp. 6, 119)
Such conclusions have led a number of commentators to argue that we are now entering an era of widespread algorithmic governance, wherein algorithms will play an ever-increasing role in the exercise of power, a means through which to automate the disciplining and controlling of societies and to increase the efficiency of capital accumulation. However, Diakopoulos (2013, p. 2, original emphasis) warns that: ‘What we generally lack as a public is clarity about how algorithms exercise their power over us.’ Such clarity is absent because although algorithms are imbued with the power to act upon data and make consequential decisions (such as to issue fines or block travel or approve a loan) they are largely black boxed and beyond query or question. What is at stake then with the rise of ‘algorithm machines’ is new forms of algorithmic power that are reshaping how social and economic systems work.
In response, over the past decade or so, a growing number of scholars have started to focus critical attention on software code and algorithms, drawing on and contributing to science and technology studies, new media studies and software studies, in order to unpack the nature of algorithms and their power and work. Their analyses typically take one of three forms: a detailed case study of a single algorithm, or class of algorithms, to examine the nature of algorithms more generally (e.g., Bucher, 2012; Geiger, 2014; Mackenzie, 2007; Montfort et al., 2012); a detailed examination of the use of algorithms in one domain, such as journalism (Anderson, 2011), security (Amoore, 2006, 2009) or finance (Pasquale, 2014, 2015); or a more general, critical account of algorithms, their nature and how they perform work (e.g., Cox, 2013; Gillespie, 2014a, 2014b; Seaver, 2013).
This paper synthesises, critiques and extends these studies. Divided into two main sections – thinking critically about and researching algorithms – the paper makes four key arguments. First, as already noted, there is a pressing need to focus critical and empirical attention on algorithms and the work that they do in the world. Second, it is most productive to conceive of algorithms as being contingent, ontogenetic, performative in nature and embedded in wider socio-technical assemblages. Third, there are three main challenges that hinder research about algorithms (gaining access to their formulation; they are heterogeneous and embedded in wider systems; their work unfolds contextually and contingently), which require practical and epistemological attention. Fourth, the constitution and work of algorithms can be empirically studied in a number of ways, each of which has strengths and weaknesses that need to be systematically evaluated. With respect to the latter, the paper provides a critical appraisal of six methodological approaches that might profitably be used to produce insights into the nature and work of algorithms.

Thinking critically about algorithms

While an algorithm is commonly understood as a set of defined steps to produce particular outputs it is important to note that this is somewhat of a simplification. What constitutes an algorithm has changed over time and they can be thought about in a number of ways: technically, computationally, mathematically, politically, culturally, economically, contextually, materially, philosophically, ethically and so on.
Miyazaki (2012) traces the term ‘algorithm’ to twelfth-century Spain when the scripts of the Arabian mathematician Muḥammad ibn Mūsā al-Khwārizmī were translated into Latin. These scripts describe methods of addition, subtraction, multiplication and division using numbers. Thereafter, ‘algorism’ meant ‘the specific step-by-step method of performing written elementary arithmetic’ (Miyazaki, 2012, p. 2) and ‘came to describe any method of systematic or automatic calculation’ (Steiner, 2012, p. 55). By the mid-twentieth century and the development of scientific computation and early high level programming languages, such as Algol 58 and its derivatives (short for ALGOrithmic Language), an algorithm was understood to be a set of defined steps that if followed in the correct order will computationally process input (instructions and/or data) to produce a desired outcome (Miyazaki, 2012).
From a computational and programming perspective an ‘Algorithm = Logic + Control’; where the logic is the problem domain-specific component and specifies the abstract formulation and expression of a solution (what is to be done) and the control component is the problem-solving strategy and the instructions for processing the logic under different scenarios (how it should be done) (Kowalski, 1979). The efficiency of an algorithm can be enhanced by either refining the logic component or by improving the control over its use, including altering data structures (input) to improve efficiency (Kowalski, 1979). As reasoned logic, the formulation of an algorithm is, in theory at least, independent of programming languages and the machines that execute them; ‘it has an autonomous existence independent of “implementation details”’ (Goffey, 2008, p. 15).
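Kowalski's separation can be illustrated with a minimal sketch. The functions and data below are invented for illustration, not drawn from Kowalski: the logic component (find the position of a target value in a list) stays fixed, while two different control components realise it, the second improving efficiency by exploiting a sorted data structure, as the text describes.

```python
def linear_search(items, target):
    """Control strategy 1: scan every element in order.
    Works on any list, sorted or not."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1


def binary_search(items, target):
    """Control strategy 2: exploit a sorted data structure (the input)
    to halve the search space at each step."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1


data = [2, 5, 9, 14, 21, 34]
# Both controls realise the same logic and produce the same output;
# only the problem-solving strategy (and hence the efficiency) differs.
print(linear_search(data, 14))  # 3
print(binary_search(data, 14))  # 3
```

The same logic specified with a different control thus yields the same answer at a different cost, which is the sense in which efficiency can be enhanced without touching the problem-domain component.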
Some ideas explicitly take the form of an algorithm. Mathematical formulae, for example, are expressed as precise algorithms in the form of equations. In other cases problems have to be abstracted and structured into a set of instructions (pseudo-code) which can then be coded (Goffey, 2008). A computer programme structures lots of relatively simple algorithms together to form large, often complex, recursive decision trees (Neyland, 2015; Steiner, 2012). The methods of guiding and calculating decisions are largely based on Boolean logic (e.g., if this, then that) and the mathematical formulae and equations of calculus, graph theory and probability theory. Coding thus consists of two key translation challenges centred on producing algorithms. The first is translating a task or problem into a structured formula with an appropriate rule set (pseudo-code). The second is translating this pseudo-code into source code that when compiled will perform the task or solve the problem. Both translations can be challenging, requiring the precise definition of what a task/problem is (logic), then breaking that down into a precise set of instructions, factoring in any contingencies such as how the algorithm should perform under different conditions (control). The consequences of mistranslating the problem and/or solution are erroneous outcomes and random uncertainties (Drucker, 2013).
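The two translation challenges can be sketched with a hypothetical task (the scenario, thresholds and function name are invented for illustration): first the task is abstracted into pseudo-code built from Boolean logic, then that pseudo-code is rendered as executable source code, with the control component handling a contingency the naive rule set would miss.

```python
# Hypothetical task: decide what fine, if any, a speeding vehicle incurs.
#
# Translation 1 - task into pseudo-code (logic, as Boolean rules):
#   IF speed > limit + 20 THEN large fine
#   ELSE IF speed > limit THEN small fine
#   ELSE no fine
#
# Translation 2 - pseudo-code into source code, adding control for a
# contingency (a failed sensor reading) that the bare logic omits.

def speeding_fine(speed, limit):
    if speed is None:           # contingency handling (control)
        return "no decision"
    if speed > limit + 20:      # Boolean logic: if this, then that
        return "large fine"
    if speed > limit:
        return "small fine"
    return "no fine"


print(speeding_fine(75, 50))    # large fine
print(speeding_fine(55, 50))    # small fine
print(speeding_fine(None, 50))  # no decision
```

A mistranslation at either step, such as testing `speed > limit` before the `limit + 20` case, would silently produce the erroneous outcomes the text warns of, even though each individual rule looks sensible in isolation.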
The processes of translation are often portrayed as technical, benign and commonsensical. This is how algorithms are mostly presented by computer scientists and technology companies: that they are ‘purely formal beings of reason’ (Goffey, 2008, p. 16). Thus, as Seaver (2013) notes, in computer science texts the focus is centred on how to design an algorithm, determine its efficiency and prove its optimality from a purely technical perspective. If there is discussion of the work algorithms do in real-world contexts this concentrates on how algorithms function in practice to perform a specific task. In other words, algorithms are understood ‘to be strictly rational concerns, marrying the certainties of mathematics with the objectivity of technology’ (Seaver, 2013, p. 2). ‘Other knowledge about algorithms – such as their applications, effects, and circulation – is strictly out of frame’ (Seaver, 2013, pp. 1–2). So too are the complex sets of decision-making processes and practices, and the wider assemblage of systems of thought, finance, politics, legal codes and regulations, materialities and infrastructures, institutions and inter-personal relations, which shape their production (Kitchin, 2014).
Far from being objective, impartial, reliable and legitimate, critical scholars argue that algorithms possess none of these qualities except as carefully crafted fictions (Gillespie, 2014a). As Montfort et al. (2012, p. 3) note, ‘[c]ode is not purely abstract and mathematical; it has significant social, political, and aesthetic dimensions,’ inherently framed and shaped by all kinds of decisions, politics, ideology and the materialities of hardware and infrastructure that enact its instruction. Whilst programmers might seek to maintain a high degree of mechanical objectivity – being distant, detached and impartial in how they work and thus acting independent of local customs, culture, knowledge and context (Porter, 1995) – in the process of translating a task or process or calculation into an algorithm they can never fully escape these. Nor can they escape factors such as available resources and the choice and quality of training data; requirements relating to standards, protocols and the law; and choices and conditionalities relating to hardware, platforms, bandwidth and languages (Diakopoulos, 2013; Drucker, 2013; Kitchin & Dodge, 2011; Neyland, 2015). In reality then, a great deal of expertise, judgement, choice and constraints are exercised in producing algorithms (Gillespie, 2014a). Moreover, algorithms are created for purposes that are often far from neutral: to create value and capital; to nudge behaviour and structure preferences in a certain way; and to identify, sort and classify people.
At the same time, ‘programming is … a live process of engagement between thinking with and working on materials and the problem space that emerges’ (Fuller, 2008, p. 10) and it ‘is not a dry technical exercise but an exploration of aesthetic, material, and formal qualities’ (Montfort et al., 2012, p. 266). In other words, creating an algorithm unfolds in context through processes such as trial and error, play, collaboration, discussion and negotiation. Algorithms are ontogenetic in nature (always in a state of becoming), teased into being: edited, revised, deleted and restarted, shared with others, passing through multiple iterations stretched out over time and space (Kitchin & Dodge, 2011). As a result, they are always somewhat uncertain, provisional and messy, fragile accomplishments (Gillespie, 2014a; Neyland, 2015). And such practices are complemented by many others, such as researching the concept, selecting and cleaning data, tuning parameters, selling the idea and product, building coding teams, raising finance and so on. These practices are framed by systems of thought and forms of knowledge, modes of political economy, organisational and institutional cultures and politics, governmentalities and legalities, subjectivities and communities. As Seaver (2013, p. 10) notes, ‘algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them, tweaking and tuning, swapping out parts and experimenting with new arrangements.’
Creating algorithms thus sits at the ‘intersection of dozens of … social and material practices’ that are culturally, historically and institutionally situated (Montfort et al., 2012, p. 262; Napoli, 2013; Takhteyev, 2012). As such, as Mackenzie (2007, p. 93) argues, treating algorithms simply ‘as a general expression of mental effort, or, perhaps even more abstractly, as process of abstraction, is to lose track of proximities and relationalities that algorithms articulate.’ Algorithms cannot be divorced from the conditions under which they are developed and deployed (Geiger, 2014). What this means is that algorithms need to be understood as relational, contingent and contextual in nature, framed within the wider context of their socio-technical assemblage. From this perspective, ‘algorithm’ is one element in a broader apparatus, which means it can never be understood as a technical, objective, impartial form of knowledge or mode of operation.
Beyond thinking critically about the n...
