Privacy, Due Process and the Computational Turn

The Philosophy of Law Meets the Philosophy of Technology

About this book

Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology engages with the rapidly developing computational aspects of our world, including data mining, behavioural advertising, iGovernment, profiling for intelligence, customer relationship management, smart search engines and personalized news feeds, in order to consider their implications for the assumptions on which our legal framework has been built. The contributions to this volume focus on the issue of privacy, which is often equated with data privacy and data security, location privacy, anonymity, pseudonymity, unobservability and unlinkability. Here, however, the extent to which predictive and other types of data analytics operate in ways that may or may not violate privacy is rigorously taken up, both technologically and legally, in order to open up new possibilities for considering, and contesting, how we are increasingly being correlated and categorized in relation to due process – the right to contest how profiling systems categorize us and decide about us.

Privacy, Due Process and the Computational Turn, edited by Mireille Hildebrandt and Katja de Vries, is available in PDF and ePUB format and is filed under Computer Science & Computer Science General.


1 Privacy, due process and the computational turn

A parable and a first analysis
Katja de Vries*

The parable of the three robotic dogs

Once upon a time, in a land far, far away, there were three families who each owned a robotic dog. The robotic dogs were a great source of entertainment for their owners: the family members enjoyed playing with them the same way one would with a real dog. Besides their role as loyal canine companions, the dogs were also supposed to patrol around their families' houses and protect them from ‘suspect types’,1 such as potential burglars, child molesters and any other unwelcome intruders. The eyes of these robotic dogs registered all passers-by and stored their image and gait. Whenever a dog spotted a ‘suspect type’ around the family house it would bark, set off the alarm system, contact all family members and call the police. If the ‘suspect type’ continued to approach the house, the dog could set off tear gas. The instruction manual opened with:
Congratulations! You have made a great choice. No real dog could ever provide the same level of security as your new robotic pet. No burglar can distract your robotic dog with a simple piece of sausage. The robotic dog will never wag its tail at an unwelcome guest, because you instruct it exactly about who should be kept away from your family. Robo-dog has an extremely user-friendly interface: just talk to the dog as you would to a child.2 And which real dog would be able to call the police if it spotted a suspicious type? Precisely. But your robotic dog will.
The manual offered different ways of instructing the dog about who qualifies as a ‘suspect type’. Each family picked a different strategy. The first family had very strict ideas about who could be trusted and who not, so they decided to pre-program the dog with a set of fixed rules reflecting their ideas. The family compiled a long and complex set of rules such as: ‘Somebody is suspect if possessing the characteristics “adult” and “passing by more than three times within a time span of one hour”, or the characteristics “bearded” and “climbing” and is not a “family member”, or the characteristics […] or […] etc.’ They instructed the dog to proceed with the alarm routine if it detected the presence of a set of characteristics described by their rules. On the eve of the dog's first patrol the mother had looked sternly into the eyes of the dog and said: ‘You rule the street, but we rule you, understood?’ From that day onwards all the family members called the dog by the name ‘Ruler’.3
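In machine-learning terms, the first family's strategy is a hand-written, rule-based classifier: every decision follows deterministically from explicitly stated conditions. A minimal Python sketch of such a classifier is given below; the characteristic names are hypothetical stand-ins for those in the parable, which contains no code itself.

```python
# A minimal sketch of Ruler's rule-based strategy; the characteristic
# names are hypothetical stand-ins for the rules quoted in the parable.

def is_suspect(person: dict) -> bool:
    """Apply the family's hand-written rules to observed characteristics."""
    if person.get("adult") and person.get("passes_per_hour", 0) > 3:
        return True
    if (person.get("bearded") and person.get("climbing")
            and not person.get("family_member")):
        return True
    return False

# The roofer from later in the parable: bearded, adult and climbing.
roofer = {"adult": True, "bearded": True, "climbing": True, "family_member": False}
print(is_suspect(roofer))  # True - so Ruler calls the police
```

The brittleness the parable goes on to illustrate is visible here: the rules fire on any combination of matching characteristics, with no notion of context or exceptions.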
The second family said: ‘We personally think it is quite easy to distinguish suspect types from trustworthy ones, but we have difficulty articulating what the common denominators are that give them away.’ So instead of giving their dog a precise definition of what to look for they would point out passers-by who looked trustworthy to them and those they considered untrustworthy. Thus, in contrast to the first family they did not provide their metallic pet with any explicit rules, but merely with examples of passers-by they had labelled as either ‘trustworthy’ or ‘suspect’. The dog was told to compare every new passer-by to the examples it was given and to classify the passer-by in the same category as the example that was the most similar to it. Because ‘similarity’ is quite an equivocal concept (just think of how opinions can differ on which siblings in a family are most alike) the family gave a formalised definition of similarity. The family explained to the dog that ‘similarity’ should be understood as ‘closest in terms of Euclidean distance’. They also told the dog which characteristics to take into account, how to represent these characteristics as points in Euclidean space and how to calculate the distance between those points. Observing how the dog assigned each passer-by to the class of the closest labelled example, the family named the dog ‘Closer’.
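Closer's strategy is essentially 1-nearest-neighbour classification: represent each passer-by as a point in a feature space, measure the Euclidean distance to each labelled example, and copy the label of the closest one. A minimal sketch under those assumptions follows; the two features and all values are invented for illustration.

```python
import math

# Labelled examples the family pointed out; the two features (say, height
# in metres and walking speed in km/h) and their values are invented.
examples = [
    ((1.80, 5.0), "trustworthy"),
    ((1.70, 4.5), "trustworthy"),
    ((1.90, 9.0), "suspect"),
]

def classify(passer_by):
    """1-nearest-neighbour: copy the label of the example closest in Euclidean distance."""
    _, label = min(examples, key=lambda ex: math.dist(ex[0], passer_by))
    return label

print(classify((1.85, 8.5)))  # 'suspect': the third example is closest
```

Note how every design choice the family made (which features, how to encode them, Euclidean distance as the similarity measure) shapes the outcome before a single example has been labelled.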
The third family thought it would be silly to impose their own stereotypes onto the dog. ‘Is it not precisely the problem that intruders make use of our prejudiced expectations? The good thing about this robotic dog is that it is not burdened by such prejudices and maybe we can learn something from its naive perspective’, they pondered. Thus, the family told the dog, whom they had named ‘Cluster’: ‘Now go outside and after a month you can tell us what categories of people you think there are. Let us begin with four different categories, ok, Cluster?’ Although the third family had not provided the dog with any explicit definition of ‘suspect’ behaviour (as the first family had) or given it any examples of what it should look for (as the second family had), it had given the dog a very precise definition of similarity in terms of Euclidean distance and told it to come up with four classes.
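Cluster's instructions amount to unsupervised clustering: no labels at all, only a distance measure and a fixed number of groups. The parable does not name an algorithm, so the following is a minimal k-means-style sketch (Lloyd's algorithm) under those assumptions, with invented observations.

```python
import math
import random

# Unlabelled observations Cluster collected during its month outside; each
# tuple is an invented feature vector (eg sock-colour code, pace rhythm,
# number of rings).
observations = [(0.1, 1.2, 0), (0.2, 1.1, 1), (3.9, 0.2, 5), (4.1, 0.3, 4),
                (2.0, 2.0, 2), (2.1, 1.9, 2), (0.0, 1.0, 0), (4.0, 0.1, 5)]

def kmeans(points, k=4, iterations=20, seed=0):
    """Group points into k clusters by Euclidean distance (Lloyd's algorithm)."""
    random.seed(seed)
    centroids = random.sample(points, k)
    for _ in range(iterations):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: math.dist(p, centroids[i]))].append(p)
        # Update step: each centroid moves to the mean of its cluster.
        centroids = [tuple(sum(dim) / len(dim) for dim in zip(*c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

for i, cluster in enumerate(kmeans(observations), start=1):
    print(f"category {i}: {cluster}")
```

Even here nothing is prejudice-free: the choice of features, the Euclidean distance measure, the random initialisation and the demand for exactly four categories all bias what ‘categories of people’ the dog can possibly find.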
Later that week Ruler, the robotic dog of the first family, called the police after noticing the characteristics ‘beard’, ‘adult’ and ‘climbing’ when a roofer came to replace a few missing tiles on the roof. The family had to come over to the police station and explain the misunderstanding. The family realised that making a list of rules defining ‘suspect types’ was not that easy after all. The second family, who had tried to teach their dog Closer to recognise ‘suspect types’ by providing it with examples, also had some mixed feelings about the result. If Closer barked at people whom the owners did not consider suspect, they could correct the dog so that it could adjust its ‘model’.4 However, the mere fact that Closer barked at a passer-by made this particular person seem more suspect.5 A person who might have looked harmless to the owners before now gained an aura of potential danger. So, in practice the owners did not correct their dog very often. As time went by they began to rely increasingly on the judgment of Closer.
Cluster, the dog of the third family, strolled around for a month in the surroundings of the family home and then came up with a very elaborate categorisation of people involving, among other things, the colour of their socks, the rhythm of their walking pace and the presence of jewellery on their fingers. ‘Isn't that exciting?’, the father said. ‘Without any explicit instruction Cluster came up with this categorisation that eludes me but probably is a very truthful one, because we have not influenced our robotic pet with our prejudices and ingrained ideas.’ (The father did not give much thought to how he had biased the family dog by defining similarity in terms of proximity, by providing it with a long list of characteristics that possibly could be of interest and by telling it that it had to classify people in four categories.) The mother added: ‘It seems to me that what Cluster describes as “category three” is the type of person we should be suspicious of.’ Soon the whole family became convinced that Cluster had unearthed a very profound categorisation. When the dog barked and signalled that a ‘category three’ person was passing by, the whole family shivered. Sometimes they had difficulty recognising these ‘category three’ people themselves but they had great trust in the judgment of their robotic dog.
One day the three families decided to organise a garden party. They had invited all their neighbours but none had shown up. People were fed up with these metallic surveillance machines. The three families who owned the robotic dogs belonged to the ruling elite of the community and the robotic dogs had merely accentuated already existing social differences. People were annoyed with these devices that recorded everything they saw and moreover had the capacity to contact the police independently of their owners. The three families cared little about these feelings of their neighbours. Now that they were the only guests at their own party, they had plenty of opportunity to observe and discuss the behaviour of their robotic dogs. To their surprise, they noticed that there was hardly any overlap between the passers-by at which Ruler, Closer and Cluster would bark.
What is the moral of this parable? Normally, a parable is directly followed by the lessons that can be drawn from it. However, because the moral of this parable is based on the various ideas, analyses and recommendations presented in this volume, I will first present some more conventional introductory observations regarding the contents of this book. In the last section of this introductory chapter, I will finally return to the ‘parable of the three robotic dogs’ and suggest some lessons it could teach us about the computational turn, due process and privacy.

Family resemblance concepts and kaleidoscopic patterns

The ‘computational turn’, ‘due process’ and ‘privacy’: all the contributions presented in this volume engage with these three notions; notions that are of utmost importance for every contemporary information society intending to function in accordance with constitutional and fundamental human rights. However, as much as the three notions are important, they are also complex, multifaceted and sometimes even evanescent. Solove's observation that ‘[privacy] seems to be about everything, and therefore it appears to be nothing’ (2006: 479) could easily be extended to ‘due process’ and the ‘computational turn’. To make things worse, the notions consist of a multiplicity of concepts that share some traits but share only a minimal, if any, common core. For example, a privacy interference can refer to concepts of privacy as disparate as ‘surveillance’ (eg wiretapping by the police), ‘decisional privacy’ (eg a state does not allow a woman to decide for herself whether to have an abortion or not) and ‘exposure’ (eg a newspaper that publishes candid pictures of a naked celebrity) (Solove 2006).

Nevertheless, this does not make the notions ‘computational turn’, ‘due process’ and ‘privacy’ incoherent or useless. Like members of a family, the different uses of these notions form ‘a complicated network of similarities overlapping and criss-crossing’ (Wittgenstein 1997: §66, as quoted in Solove 2006: 485). The importance of these notions in contemporary debates might partly be explained precisely by their multifaceted and contested natures. An invocation such as ‘Let us not forget the requirements of due process!’ will hardly provide a straightforward instruction. More likely it will act as an invitation to discuss what those requirements are.

Instead of simplifying the meaning of the ‘computational turn’, ‘due process’ and ‘privacy’, the contributions in this volume bear witness to their multifaceted nature and their capacity to oblige us to think, debate and rethink our epistemology (how to compute valid, reliable and valuable knowledge?) and political constitution (which power balance should we preserve in our society, and how can we articulate those requirements in terms of due process and privacy?). Taking the provocative, obliging and multifaceted character of the notions ‘computational turn’, ‘privacy’ and ‘due process’ seriously has resulted in this volume, offering a polyphony of perspectives and analyses on the relations between these three notions: ‘A plurality of independent and […] fully valid voices […] with equal rights and each with its own world’ (Bakhtin 1984: 6). This polyphony is further enhanced by the diverging disciplinary backgrounds of the contributors: a specialist in machine learning, philosophers and sociologists of technology, and legal philosophers.

Nevertheless, there are some shared characteristics as well. One aspect that all the contributions have in common is their engaged outlook. Epistemological questions are never treated in a strictly analytical manner, as is sometimes done within the philosophy of science (Harizanov et al 2007; Harman and Kulkarni 2007; Thagard 1990), but are always related to the power relations between different actors, to the choices which policy-makers, engineers and citizens face in the present computational era and to the way these choices affect the constitutions of our societies.
Another shared characteristic is the appetite for unconventional and novel approaches: for example, data minimisation, a classical and venerated pillar of data protection, is (at least partly) rejected by all authors in favour of other solutions. To give an impression of some of the solutions6 proposed by the authors, see the following tentative listing:
  • ‘effective outcome transparency’ instead of the current focus on nominal procedural transparency (Koops)
  • obfuscation tactics, which make the process of profiling more time-consuming, costly, complex and difficult (Brunton and Nissenbaum)
  • a duty to acquire knowledge of the profiling artefacts (Magnani)
  • explicitation of the profiling process by using experiment databases (van Otterlo)
  • taking into account that profiling algorithms should be made not only on behalf of the state or companies, but also on behalf of the citizens who are subjected to them (Kerr)
  • increasing the amount of available data and profiles to make it more difficult to get univocal indications (Esposito)
  • re-introducing actual persons, objects and situations in profiling practices which operate mostly on the infra- and supra-individual level (Rouvroy)
  • developing profile transparency in the front end, the interface and the back end of the computational decision systems that have a significant impact on our lives (Hildebrandt).
A third trait shared by all contributions is that they are not simply concerned with data collection, storage, processing or retrieval as su...

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright Page
  5. Table of Contents
  6. Acknowledgments
  7. Notes on Contributors
  8. Preface
  9. Introduction Privacy, due process and the computational turn at a glance: pointers for the hurried reader
  10. 1 Privacy, due process and the computational turn: a parable and a first analysis
  11. Part 1 Data science
  12. Part 2 Anticipating machines
  13. Part 3 Resistance & solutions
  14. Index