Philosophy and Computing

An Introduction

eBook - ePub

  1. 256 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android
About this book

Philosophy and Computing explores each of the following areas of technology: the digital revolution; the computer; the Internet and the Web; CD-ROMs and Multimedia; databases, textbases, and hypertexts; Artificial Intelligence; the future of computing.
Luciano Floridi shows us how the relationship between philosophy and computing provokes a wide range of philosophical questions: is there a philosophy of information? What can be achieved by a classic computer? How can we define complexity? What are the limits of quantum computers? Is the Internet an intellectual space or a polluted environment? What is the paradox in the Strong Artificial Intelligence program?
Philosophy and Computing is essential reading for anyone wishing to fully understand both the development and history of information and communication technology and the philosophical issues it ultimately raises.



Chapter 1

Divide et computa: philosophy and the digital environment

The digital revolution

Information and communication technology (ICT) has shaped the second half of the twentieth century irreversibly and more profoundly than atomic energy or space exploration. We may well do without nuclear power stations or the space shuttle, but nobody can reasonably conceive of a future society in which there are no more computers, no matter what dangers may be inherent in such an ineluctable evolution of our habitat. Evidence of a digital destiny is everywhere. In 1971, Intel launched the world’s first commercial microprocessor, the 4004 (Intel is to hardware as Microsoft is to software: its chips account for 80 per cent of world-wide sales of general-purpose microprocessors), and some twenty-five years later microprocessors are hidden in virtually any technological device supporting our social activities, financial operations, administrative tasks or scientific research. At the beginning of 1998, experts estimated that there were more than 15 billion chips operating all over the world. In advanced societies, computer chips have become as widely diffused as the engines they control, and ICT is today a highly pervasive infra-technology that will eventually cease to be noticed because it has become thoroughly trivial.1 Of course, many past predictions were simplistic exaggerations: tele-democracy, the paperless office, the workerless factory, the digital classroom, the cashless society, the electronic cottage, tele-jobs, or a computer generation were either chimeras dreamed up by overenthusiastic high-tech prophets, or just smart marketing slogans, for technological changes require longer periods of time and have a more complex nature. Nevertheless, it is hard to deny that our reliance on computer systems is a macroscopic phenomenon, and constantly growing.
Even in science fiction the only consistent scenarios we can imagine are those in which highly sophisticated and powerful computers have become imperceptible because completely transparent to their users, overfamiliar objects behaving as ordinary components of the environment. Cash and capital were the lifeblood of our economies, but now they are being replaced by information, and computers are the best tools we have to manage it. Unless we find something better, they are here to stay and will play an ever more important role in our lives.
The knowledge economy apart, the extraordinary pervasiveness of ICT has been a direct effect of the increasing user-friendliness, power, flexibility, reliability, and affordability (inexpensiveness in terms of price/performance ratio) of computing devices. The virtuous circle created by the mutual interaction between these five factors has caused the merging of the three Cs (computers, communications and consumer electronics). Contemporary microelectronics, the discovery and manufacture of new materials, the elaboration of better algorithms and software, and the design of more efficient architectures are some of the factors that have made possible the extraordinary development of computing devices with enormous and yet steadily increasing computational power, which can easily be adapted to a very wide variety of humble tasks, at a cost that is constantly decreasing, sometimes in absolute terms. In the 1960s, Gordon Moore, the co-founder of Intel, predicted that the number of transistors that could be placed on a single chip would double every eighteen months. If Moore’s famous law is still true fifty years after he proposed it – and there is currently no reason to think that it will not be – then at the beginning of the 2020s microprocessors may well be as much as 1000 times computationally more powerful than the Pentium III chip we were so proud of only yesterday, yet a child will be able to use them efficiently.
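The arithmetic behind such projections is easy to check. The following sketch is purely illustrative (the function name and the figures chosen are mine, not the book's): it computes the growth factor implied by a doubling every eighteen months.

```python
# Moore's law as stated here: transistor counts double every 18 months.
def moore_factor(years, doubling_period_years=1.5):
    """Growth factor accumulated after `years` under periodic doubling."""
    return 2 ** (years / doubling_period_years)

# Ten doublings take fifteen years at this rate and give a factor of 1024,
# the order of magnitude behind "1000 times" claims of this kind.
print(moore_factor(15))  # 1024.0
```

Note how quickly the exponent dominates: a mere three years already yields a fourfold increase, and each extra decade multiplies the factor by roughly a hundred.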
What we call “the information society” has been brought about by the fastest growing technology in history. No previous generation has ever been exposed to such an extraordinary acceleration of technological power and corresponding social changes. No wonder that the computer has become a symbol of the second half of the twentieth century and even of the new millennium, playing a cultural role comparable to that of mills in the Middle Ages, mechanical clocks in the seventeenth century and the loom or the steam engine in the age of the Industrial Revolution. Total pervasiveness and high power have raised ICT to the status of the characteristic technology of our time, both rhetorically and iconographically (Pagels 1984; Bolter 1984 provides a cultural perspective; see Lyon 1988 for a sociological analysis). The computer is a defining technology.
Such profound transformations have been already interpreted in terms of a digital revolution. It is a rather extreme view, and its epochal overtones may perhaps be unjustified, but those who reject it should consider that computer science and ICT applications are nowadays the most strategic of all the factors governing the life of our society and its future. The most developed post-industrial societies now literally live by information, and digital ICT is what keeps them constantly oxygenated. I am not referring just to the world of entertainment – perhaps we can do without it, though its economy is becoming a dominating variable in developed societies – but to fields such as economics and international politics.
In advanced economies, computing is undoubtedly a crucial element, if not by far the leading factor determining the possibility of success (Dertouzos 1997; Tapscott 1996; Tapscott and Caston 1993). It is not that a fully computerised company will necessarily be successful, but rather that a non-computerised company will stand no chance of survival in a competitive market. If digital technology is causing unemployment among blue- and white-collar workers, eroding job opportunities in some manufacturing and service sectors, it is also true that it has created completely new fields of investment for human and financial resources. This is so much the case that it is still unclear whether, in the long run, the phenomenon may not be more accurately interpreted as a displacement and reallocation of the workforce. Like other technological innovations, computers can certainly be used to make workforces redundant or de-skilled, reducing people to the condition of low-waged machine-minders, but at the same time they can be used to enhance the quality of working life, improve workers’ job satisfaction and responsibility, and re-skill the workforce. The real point is that technological determinism is unjustified and that work organisation and the quality of working life depend much more on overall managerial philosophies and strategic decisions than on the simple introduction of a particular type of technology. It is obviously premature to talk of a cashless society, but although it is not yet a digital market, the market in the digital is a flourishing economic area, which can partially balance the negative effect that computerisation has on job opportunities. In 1997, 25 per cent of America’s real economic growth was generated by the PC industry, while international sales of ICT amounted to $610 billion, making ICT the largest industrial sector in the world.
As for the convergence and globalisation of regional economies, this is only a macroscopic effect of the more fundamental phenomenon represented by the development of a world-wide system of digital communication, through which information can be exchanged anywhere, in real time, constantly, cheaply and reliably.
In international politics, digital technology showed its muscle in the Gulf War. In less than two months (16 January to 28 February 1991) guided missiles, “smart” munitions, night vision equipment, infra-red sensors, Global Positioning Systems, cruise missiles, free-flight rockets with multiple warheads, anti-missile missiles, and modern data communications systems helped to destroy a 500,000-strong Iraqi army. It was the defeat of an “object-oriented” army by an information-based superpower. The digital revolution has brought about transformations in modern technological warfare comparable only to the industrial production of weapons, or the invention of tanks and military aircraft. No power will ever be an international superpower without an army of computer engineers, not least because of the unreliability of our machines, as the tragic killing of 290 innocent civilians in an Iranian Airbus, mistakenly brought down by a US guided missile during the conflict, will remind us for ever.

The four areas of the digital revolution

Information has matured into an asset of growing value, with marketable quantities and prices. It is the new digital gold and is one of the most valuable resources at our disposal. We all know this, and this is why we are ready to describe our society as the information society. In the information society, changes in social standards are ever more deeply and extensively ICT-driven or induced. Such modifications in the growth, the use and the management of information resources and services concern four main sectors: computation, automatic control, modelling, and information management. This sequence follows a conceptual order and only partially overlaps through time.

Computation

“Computation” seems to be a sufficiently intuitive concept, but as soon as we try to provide a clear and fully satisfactory definition of it we immediately realise how difficult the task is (Cutland 1980). According to different perspectives, computation may be described as a logical or physical process of generation of final states (outputs) from initial states (inputs), based on:

1. rule-governed state-transitions, or
2. discrete or digital rule-governed state-transitions, or
3. a series of rule-governed state-transitions for which the rule can be altered, or
4. rule-governed state-transitions between interpretable states.

There are some difficulties with these definitions. (1) is too loose, for it also applies to devices such as printers and washing machines, physical systems that we do not include in the class of computational systems; (2) is perhaps too strict, for it excludes forms of analogue computation (I shall discuss this point at greater length when talking about Super Turing machines); (3) is either vague or too strict, for there are computational systems, like pocket calculators, with embedded rules (non-programmable algorithms); (4) seems to be the most satisfactory. It makes the interpretable representation of a state a necessary condition for computation. However, even (4) is not uncontroversial because it is not altogether clear whether state-transitions of quantum computers (see Chapter 5) – which are computational systems that belong to the same class as Turing machines – are all interpretable representations in the same sense in which all states of a so-called Turing machine are, given the fact that it is impossible to gain complete knowledge of the state of a quantum register through measurement, as we shall see later (measuring or describing by direct inspection the states of a quantum register alters them, contrary to what happens in a “Newtonian” device). The same problem arises if we consider the brain a computational system. We shall see in the next chapter that it may be easier to explain the concept of “computability” by referring precisely to a Turing machine and its properties. For the time being, let us rely on our intuitive understanding of the concept.
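Definition (4) can be made concrete with a toy machine. The following sketch is a hypothetical illustration, not drawn from the text: every state and every tape symbol has a fixed interpretation, and a small set of rules drives the transitions that increment a binary number, Turing-machine style.

```python
# A minimal sketch of rule-governed state-transitions between
# interpretable states (definition 4): a tiny Turing-style machine
# that increments the binary number written on its tape.

def increment_binary(tape):
    """Increment the binary number on `tape` (a list of '0'/'1' symbols)."""
    pos = len(tape) - 1                # head starts at the least significant bit
    state = "carry"                    # interpretable state: "a carry is pending"
    while state == "carry":
        if pos < 0:                    # ran off the left edge: extend the tape
            tape.insert(0, "1")
            state = "halt"
        elif tape[pos] == "0":         # rule: 0 + carry -> 1, computation done
            tape[pos] = "1"
            state = "halt"
        else:                          # rule: 1 + carry -> 0, keep carrying left
            tape[pos] = "0"
            pos -= 1
    return tape

print("".join(increment_binary(list("1011"))))  # 1100
```

Each transition is fixed by a rule table (the if/elif/else branches), and both the states ("carry", "halt") and the tape contents have a definite interpretation, which is exactly what (1)–(3) fail to guarantee.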
That a computer is above all a computational system, in the sense of a straightforward number cruncher, is a trivial truth hardly worth mentioning. The earliest and most direct applications of computers concerned the elaboration and advancement of quantifiable information. The remarkable technological and scientific developments of the second half of the twentieth century owe a great deal to the possibility of solving a huge number of mathematical and computational problems practically (though not in principle) beyond the power of our finite, biological brains, and hence of carrying out operations which would not have been humanly sustainable either in terms of organisation or time-consumption, even if armies of assistants had been available. This has been true since the beginning of the history of computer science. One of the first high-speed electronic digital computers (EDC), known as ENIAC (Electronic Numerical Integrator and Computer) was built in 1945 by John W. Mauchly and J. Presper Eckert for the American army, in order to calculate complex ballistic firing tables. There was no human–computer interface, as we understand the concept nowadays, because the user was literally inside the computer, given its dimensions, and programming was performed with a screwdriver. ENIAC occupied 167.3 square metres, consisted of 17,469 vacuum tubes, had a speed of several hundred multiplications per minute, could perform 5,000 additions per second, and was extremely difficult to program, since its “software” was wired into the processor and had to be manually altered, a problem definitively solved only in the 1950s, with the construction of UNIVAC 1 (Universal Automatic Computer) and then IBM’s first mainframes, which adopted more efficient architectures.
In this respect, ENIAC, UNIVAC, pocket calculators or the latest personal computers all belong to the long history of computational devices which stretches from the abacus to Pascal’s adding machine (1642), through Leibniz’s multiplication machine (1671) to Babbage’s analytical engine (1835), but I shall say more about this in the next chapter.

Automatic control

Although computation has remained a major area of application, it would be shortsighted to think that the impact of the technological innovations brought about by the diffusion of digital ICT has been limited just to straightforward numerical problems, and hence to important but quite specific sectors of mathematical applications. For not only have computers helped us to read some of the most complex chapters in the “mathematical book of nature”, they also have put us in a position to control a large variety of physical and bureaucratic processes automatically (office automation and EDP, electronic data processing). Today, the complex functioning of an increasing number of manufacturing and administrative operations, including the manipulation of consumption, requires the constant intervention of microprocessors and other digital devices. Following on from the process of mechanisation, computers have caused a second industrial revolution through the implementation of massive automation. As industry has moved from a low-technology, unspecialised, and labour-intensive stage to a highly mechanised, automated (electronics), AT-intensive (advanced technology) and more specialised stage, it has become extensively information-based and hence more and more computer-dependent. Thus, we speak of

• computer-numerically-controlled (CNC) machines that are employed to repeat operations exactly as programmed, rapidly and accurately;
• computer-aided manufacturing (CAM), i.e. the use of computers in factories to control CNC machine tools, robots and whole production processes cheaply and virtually without operators;
• computer-aided design (CAD), i.e. the use of computers to design and model most, when not all, of the features of a particular product (such as its size, shape, the form of each of its component parts, its overall structure) by storing data as two- or three-dimensional drawings, and hence to simulate its performance and make it easier for large groups of designers and engineers to work on the same project even when they are in different locations;
• computer-controlled logistics (CCL);
• computer-integrated manufacturing (CIM) systems, i.e. methods of manufacturing resulting from the combination of CNC + CAM + CAD + CCL, that allow designers, engineers and managers to project, model, simulate, test and produce a particular product more efficiently and quickly, while also monitoring the whole process.

Computers are today so commonly used to plan whole manufacturing processes, to test finished parts, to activate, carry out, regulate and control whole phases in the production process, that virtually all aspects of the manufacturing of goods – including better working conditions and safety measures, constant progress in the implementation of cost-saving procedures, the improvement of the quality/price ratio of finished products and the semi-customisation of goods (tailoring) – are thoroughly dependent on the evolution, application and diffusion of computing in all its varieties. The possibility of the fully automated AT factory, in which human intervention is limited to overall control and decision-making, is only one more effect of the digital revolution.

Modelling and virtual reality

The mathematical description and the digital control of the physical environment have provided the solid premisses for its potential replacement by mathematical models (systems of differential equations) in scientific computing (i.e. “the collection of tools, techniques and theories required to solve on a computer mathematical models of problems in science and engineering”, Golub and Ortega 1993: 2) and virtual reality environments (Aukstakalnis and Blatner 1992; Rheingold 1991; Woolley 1992; Heim 1993 provides a “Techno-Taoist” [sic] approach; on the use of IT for philosophical modelling see Grim et al. 1998). Digital computing has become crucial whenever it is necessary to simulate real-life properties and forecast the behaviour of objects placed in contexts that are either not reproducible in laboratory situations or simply not testable at all, whether for safety reasons, for example, or because of the high cost of building and testing physical prototypes, or because we need non-invasive and non-destructive techniques of analysis, as in medical contexts. Two- or three-dimensional (2D and 3D) digital images are produced either by drawing them with a conventional paint program, as in many software games – we may call this the “imagist” approach – or by means of computer graphics (used by engineers, architects or interior designers, for example). In the latter, the computer is programmed to model the object in question, to describe its geometry, accurately render its colour, texture and surface features, depict its movements convincingly, and represent perspective views, shadows, reflections, and highlights – hence adopting an obviously “constructionist” approach.
The results are synthetic entities and phenomena, software-generated objects and digital effects representing fully articulated, solid and photorealistic realities, which are capable of responding to the environment and interacting with other synthetic entities or external inputs, and which may not necessarily stand in for counterparts existing or experienceable in real life. The ultimate goal is authenticity, understood as a virtual hyper-realism. The possibility of modelling virtual objects, their properties and the phenomena they are involved in, has become an essential requirement not only in meteorology or nuclear physics, but also in biochemistry and computational molecular biology (where computers are essential to reconstruct the human genome), in tectonics, in economics, in architecture (e.g. to create realistic walkthroughs of proposed projects), in astrophysics, in surgical simulations (surgeons explore surgical procedures on computer-modelled patients), and in computerised axial tomography,2 just to mention some of the most obvious applications. Indeed, every area of human knowledge whose models and entities – whether real or theoretical no longer matters – can be translated into the digital language of bits is, and will inevitably be, more and more dependent upon ICT capacity to let us perceive and handle the objects under investigation, as if they were everyday things, pieces on a chess board that can be automatically moved, rotated, mirrored, scaled, magnified, modified, combined and subjected to the most diverse transformations and tests.
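What solving such mathematical models on a computer amounts to can be shown in miniature. The model, the constants and the function name below are illustrative assumptions of mine, not taken from the text: a physical process is replaced by a single differential equation, dy/dt = -k*y, which the machine solves step by step with Euler's method.

```python
# A minimal sketch of scientific computing as described above:
# a physical process is replaced by a mathematical model (here one
# differential equation, dy/dt = -k*y, e.g. radioactive decay) and
# solved numerically instead of being tested in a laboratory.

def euler_decay(y0, k, dt, steps):
    """Integrate dy/dt = -k*y with Euler's method; return the final value."""
    y = y0
    for _ in range(steps):
        y += dt * (-k * y)             # one rule-governed update per time step
    return y

# With a small enough step the numerical model tracks the exact
# solution y(t) = y0 * exp(-k*t).
approx = euler_decay(y0=1.0, k=0.5, dt=0.001, steps=2000)   # t = 2.0
print(approx)                                               # close to exp(-1), about 0.368
```

The same pattern, with vastly larger systems of equations, underlies the meteorological, biochemical and astrophysical simulations mentioned above: the object under investigation exists for the computer only as numbers updated by rules.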
It is true that, so far, digital technology has made swifter and deeper advancements in the reproduction of sound than of images and animation – technologically, digital sound and compact disks represent a reasonably mature sector – but it is sufficient to think of the potential offered by CAD and CG (computer graphics) programs, of the evolution of computer animation and sophisticated digital effects in films like Star Wars, Independence Day, The Fifth Element, T...

Table of contents

  1. Cover Page
  2. Title Page
  3. Copyright Page
  4. Preface
  5. Chapter 1: Divide et computa: philosophy and the digital environment
  6. Chapter 2: The digital workshop
  7. Chapter 3: A revolution called Internet
  8. Chapter 4: The digital domain
  9. Chapter 5: Artificial intelligence: a light approach
  10. Conclusion
  11. Notes
  12. Bibliography