Computer Science

Complexity Theory

Complexity theory in computer science studies the resources required to solve computational problems, such as time and space. It explores the inherent difficulty of problems and classifies them based on their computational complexity. This field aims to understand the limits of efficient computation and the relationships between different types of problems.

Written by Perlego with AI-assistance

9 Key excerpts on "Complexity Theory"

  • An Introduction to Theory of Computation
    Chapter 3 Computational Complexity Theory. Computational Complexity Theory is a branch of the theory of computation in theoretical computer science and mathematics that focuses on classifying computational problems according to their inherent difficulty. In this context, a computational problem is understood to be a task that is in principle amenable to being solved by a computer. Informally, a computational problem consists of problem instances and solutions to these problem instances. For example, primality testing is the problem of determining whether a given number is prime or not. The instances of this problem are natural numbers, and the solution to an instance is yes or no based on whether the number is prime or not. A problem is regarded as inherently difficult if solving the problem requires a large amount of resources, whatever the algorithm used for solving it. The theory formalizes this intuition by introducing mathematical models of computation to study these problems and quantifying the amount of resources needed to solve them, such as time and storage. Other complexity measures are also used, such as the amount of communication (used in communication complexity), the number of gates in a circuit (used in circuit complexity) and the number of processors (used in parallel computing). One of the roles of computational Complexity Theory is to determine the practical limits on what computers can and cannot do. Closely related fields in theoretical computer science are analysis of algorithms and computability theory. A key distinction between computational Complexity Theory and analysis of algorithms is that the latter is devoted to analyzing the amount of resources needed by a particular algorithm to solve a problem, whereas the former asks a more general question about all possible algorithms that could be used to solve the same problem.
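
    To make the primality-testing example above concrete, here is a minimal sketch of one possible decision procedure (trial division). It is an illustration added for this page, not code from the book, and the routine, its names, and the chosen instances are assumptions of the sketch.

    ```python
    import math

    def is_prime(n: int) -> bool:
        """Decide one instance of the primality problem: is n prime?

        Trial division up to sqrt(n) takes O(sqrt(n)) arithmetic steps, which is
        exponential in the size of the input (its number of digits); polynomial-time
        algorithms such as AKS also exist, which is why primality is in the class P.
        """
        if n < 2:
            return False
        for d in range(2, math.isqrt(n) + 1):
            if n % d == 0:
                return False
        return True

    # Problem instances are natural numbers; the solution to each instance is yes or no.
    for instance in (2, 15, 97, 561):
        print(instance, "->", "yes" if is_prime(instance) else "no")
    ```
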
  • Foundations of Perceptual Theory
    After all, the worst case does occur in practice as well. This approach to the problem of search diverges from that of the psychologist, physicist, or engineer. In the same way that the laws of thermodynamics provide theoretical limits on the utility and function of nuclear power plants, Complexity Theory provides theoretical limits on information processing systems. If biological vision can indeed be computationally modeled, then Complexity Theory is a natural tool for investigating the information processing characteristics of both computational and biological vision systems. If the results of these analyses provide deeper insights into the problem and yield verifiable predictions, this would constitute evidence in favor of the computational hypothesis. Using Complexity Theory, one can ask for a given computational problem C, how well, or at what cost, can it be solved? More specifically, the following questions can be posed: (1) Are there efficient algorithms for C? (2) Can lower bounds be found for the inherent complexity of C? (3) Are there exact solutions for C? (4) What algorithms yield approximate solutions for C? (5) What is the worst-case complexity of C? (6) What is the average complexity of C? Before studying complexity one must define an appropriate complexity measure. Several measures are possible, but the common ones are related to the space requirements (numbers of memory or processor elements) and time requirements (how long it takes to execute) for solving a problem. Complexity measures in general deal with the cost of achieving solutions. Complexity Theory begins with a 1937 paper in which the British mathematician Alan Turing introduced his well-known Turing Machine, providing a formalization of the notion of an algorithmically computable function.
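
    Questions (5) and (6) above, worst-case versus average complexity, can be illustrated with a toy cost model. The linear-search example below is a generic sketch added for this page; its function names and the use of comparison counts as the cost measure are assumptions, not taken from the excerpt.

    ```python
    import random

    def linear_search(items, target):
        """Scan items left to right, counting element comparisons as the cost."""
        comparisons = 0
        for i, x in enumerate(items):
            comparisons += 1
            if x == target:
                return i, comparisons
        return -1, comparisons

    n = 1000
    data = list(range(n))

    # Worst case: the target is absent, so all n elements are inspected.
    _, worst_cost = linear_search(data, -1)

    # Average case: for targets drawn uniformly from the list, roughly n/2 comparisons.
    trials = 200
    average_cost = sum(linear_search(data, random.choice(data))[1] for _ in range(trials)) / trials

    print(f"worst case: {worst_cost} comparisons, empirical average: {average_cost:.1f}")
    ```
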
  • Multi-Chaos, Fractal and Multi-Fractional Artificial Intelligence of Different Complex Systems
    • Yeliz Karaca, Dumitru Baleanu, Yu-Dong Zhang, Osvaldo Gervasi, Majaz Moonis (Authors)
    • 2022 (Publication Date)
    • Academic Press (Publisher)
    The notion of complexity has many dimensions across different fields. To start with, mathematics is concerned with the study of arbitrarily general abstract systems, and a very high level of complexity appears in the behavior of many systems whose rules are actually simpler than the rules of most systems in the traditional sense. The traditional mathematical approach to science has contributed to physics, as another area, and it is nearly universally acknowledged that physical theory is to be based on mathematical equations. In theoretical physics, existing methods revolve around continuous numbers and calculus, and at times probability. Nevertheless, a greater simplicity in that structure yields the identification of new phenomena. In computer science, computational systems built to carry out specific tasks have been the focal point, and for this purpose even the simplest construction is capable of yielding behavior that is immensely complex. Computational ideas, in this sense, can encompass all kinds of core questions regarding mathematics and nature. As another field, biology encompasses vast and profound detail about living organisms and biological elements, with evolution by natural selection being one of its most classical realms, since general observations on living systems are customarily analyzed in terms of evolutionary history rather than abstract theories. The social sciences, varying from psychology to economics, philosophy and sociology, also exhibit complexity in features that change, adapt and evolve over time as a function of people's preferences and attitudes. Although the physical sciences require the formulation of solid theories in terms of equations and numbers, social complexity reflects human behavior as ongoing and broader, a result of the complicated conditions of individual and group existence expressed through many different arrangements, patterns and movements. For philosophy, on the other hand, issues regarding the universe and the role of human beings in it, the uniqueness of the human condition, the limits to knowledge and the inevitable position of mathematics are positioned at the core [69]. Engineering, as another discipline, has an obvious association with complexity, and it also shows that even simple underlying rules can be put into practice to carry out a sophisticated task, meaning that the construction of a system with complicated basic rules is not always required. For the design and operation of engineering systems, the aim is to reduce complexity so that the system can be rendered robust, which assures long-term stability, system reliability and cost minimization [39]. Rather than the classical approach of “dividing and conquering,” complexity engineering tackles adaptive, self-managing, self-organizing and emergent features [73
  • Software Engineering Foundations

    A Software Science Perspective

    According to cognitive informatics, human beings may comprehend a large cycle of iteration, which is the major issue of computational complexity, by looking only at the beginning and termination conditions and one or a few arbitrary internal loops, with inductive inferences. However, humans are not good at dealing with functional complexities such as a long chain of interrelated operations, very abstract data objects, and their consistency. Therefore, the system complexity of large-scale software is the focus of software engineering. The 36th Principle of Software Engineering (Theorem 10.13), the orientation of software engineering complexity theories, states that the complexity theories of computation and software engineering are different: the former is focused on problems of high throughput complexity that are computing-time-efficiency centered, while the latter puts emphasis on problems of functional complexity that are human cognition time and workload oriented. 10.7.1 COMPUTATIONAL COMPLEXITY Computational Complexity Theory is a well established area in computing [Hartmanis and Stearns, 1965; Hartmanis, 1994; Lewis and Papadimitriou, 1998] that studies: a) the taxonomy of problems in computing and their solvabilities; and b) the complexities and efficiencies of algorithms for a given problem. Computational complexity, centered on algorithm complexity, can be modeled by time or space complexity, particularly the former, proportional to the size of the problem. 10.7.1.1 Taxonomy of Computational Problems Computational complexity theories study solvability in computing. The solvable problems are those that can be computed with polynomial-time consumption. The nonsolvable problems are those that cannot be solved in any practical sense by computers due to excessive time requirements. The taxonomy of problems in computation can be classified into the following classes.
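
    The divide between polynomial-time "solvable" problems and those that are "nonsolvable in any practical sense" can be made concrete by tabulating step counts for a few growth rates. The sketch below is illustrative only; the chosen functions and input sizes are assumptions, not figures from the book.

    ```python
    # Step counts for illustrative growth rates. Problems whose best algorithms grow
    # polynomially (n, n^2, n^3) remain tractable at much larger input sizes than
    # those whose algorithms grow exponentially (2^n), which is the practical sense
    # in which the latter are treated as nonsolvable.
    growth_rates = {
        "n": lambda n: n,
        "n^2": lambda n: n ** 2,
        "n^3": lambda n: n ** 3,
        "2^n": lambda n: 2 ** n,
    }

    for size in (10, 20, 40, 60):
        row = ", ".join(f"{name} = {f(size):.3g}" for name, f in growth_rates.items())
        print(f"input size {size}: {row}")
    ```
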
  • Designing with Multi-Agent Systems

    A Computational Methodology for Form-Finding Using Behaviors

    • Evangelos Pantazis (Author)
    • 2024 (Publication Date)
    • De Gruyter (Publisher)
    In the following sections, a brief historical overview of the evolution of the term is provided, and the underlying principles of complexity are described in order to better understand the term. Based on this analysis, a taxonomy of different types of complexity is devised, and measures developed to manage it within the contemporary architectural context.
    2.2.1  Theoretical framework for approaching design complexity
    Everyday language has included terms for complexity since antiquity; however, the idea of treating it as a coherent scientific concept is quite new [1]. Nonetheless, in the late nineteenth century, scientific progress supported by the technological advancements brought by the industrial revolution questioned the linearity and reductionism of the Newtonian paradigm that existed in traditional sciences such as mathematics, biology and physics [1, 78]. The establishment of new theories in the twentieth century provided researchers with new tools for studying how living organisms evolve (e.g., molecular biology) and how CASs behave (e.g., a beehive), and placed complexity within the scientific landscape [103]. In the 1930s, Alan Turing was the first to associate complexity with the amount of information needed to describe a process, offering a different perspective [79]. This led Shannon to formulate information theory (IT) in the 1940s, relating the amount of information exchanged between the feedback mechanisms of different systems in the accomplishment of a given task [80]. In 1950, Bertalanffy [81] introduced GST, which dealt with systems holistically and considered their complexity in relation to the number of their parts and their relationships.
    John von Neumann [82] mathematically described the logic and structure of automata and considered communication systems as stochastic processes for solving complex problems. In the late 1940s, Wiener introduced cybernetics and focused on analyzing the complex behaviors between systems that operate across multiple domains such as biology, physics and architecture [96]. From the 1970s onward, Complexity Theory started to formalize as a separate discipline due to the incapacity of existing models to explain how biological organisms and CASs function [77]. In more recent years, the emerging field of software engineering and systems management brought about an interest in defining different types and measures of complexity [83, 84
  • Symmetry And Complexity: The Spirit And Beauty Of Nonlinear Science
    … of the halting problem. There is no omnipotent computer to decide all problems and to prove all truths. But below these theoretical limitations many automatic decisions and proofs with a greater or lesser degree of computational complexity are possible. A formal axiomatic theory which describes a physical, biological, or social system has the great advantage of compressing a lot of theorems into a set of a few axioms. Thus, it delivers a shorter description of mathematical truth. Even a physical theory can be understood as a shorter description of many empirical data. In general, a formal theory can be considered a computer program that calculates true theorems or data. The smaller the program is, relative to the output, the better the theory. Obviously, besides running time, the size of a computer program is an important measure of computational complexity. As a program is a finite list of symbols, its length can be measured by its number of symbols in binary coding. For example, consider the following sequences of binary digits: s1 = 111111111111111111, s2 = 010101010101010101, s3 = 011010001101110100. For s1 and s2, there are shorter descriptions or printing programs than the actual output: "18 times 1" for s1 and "9 times 01" for s2. But for s3, there seems to be no shorter description than the actual output itself. G.J. Chaitin and Kolmogorov came up with the idea that the algorithmic complexity of a symbolic sequence s should be defined by the length of the shortest computer program for generating s (measured in bits) [7.19]. Algorithmic complexity is sometimes called the algorithmic information content of a symbolic sequence, which is the subject of algorithmic information theory. As random sequences have no regularities, they cannot be described by shorter programs. They are incompressible, with an algorithmic complexity equivalent to their length.
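
    The compressibility contrast drawn above can be approximated in code. Since algorithmic (Kolmogorov) complexity is uncomputable, the sketch below uses an off-the-shelf compressor as a crude upper bound on description length, and it lengthens the sequences so the effect is visible; both choices are assumptions of this illustration, not the book's method.

    ```python
    import random
    import zlib

    # Kolmogorov complexity cannot be computed exactly, but a general-purpose
    # compressor gives a rough upper bound on description length: regular sequences
    # compress well, while random-looking ones barely compress at all.
    s1 = "1" * 1000                                          # "repeat '1' 1000 times"
    s2 = "01" * 500                                          # "repeat '01' 500 times"
    s3 = "".join(random.choice("01") for _ in range(1000))   # no short description expected

    for name, s in (("s1 (all ones)", s1), ("s2 (alternating)", s2), ("s3 (random)", s3)):
        compressed_size = len(zlib.compress(s.encode(), 9))
        print(f"{name}: {len(s)} symbols, compressed to {compressed_size} bytes")
    ```
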
  • Complexity Theory and the Philosophy of Education
    While in the social sciences it was pioneered in economics (Holland, 1987; Arthur, 1989, 1990), Complexity Theory was otherwise, as little as ten years ago, a relative stranger to the social sciences. Complexity Theory is, as Morrison (2002, p. 6) puts it, ‘a theory of survival, evolution, development and adaptation’. It concerns itself with environments, organisations, or systems that are complex in the sense that very large numbers of constituent elements or agents are connected to and interacting with each other in many different ways. These constituent elements or agents might be atoms, molecules, neurons, human agents, institutions, corporations, etc. Whatever the nature of these constituents, the system is characterised by a continual organisation and re-organisation of and by these constituents into larger structures through the clash of mutual accommodation and mutual rivalry. Thus, molecules would form cells, neurons would form brains, species would form ecosystems, consumers and corporations would form economies, and so on. At each level, new emergent structures would form and engage in new emergent behaviours. Complexity, in other words, [is] really a science of emergence. (Waldrop, 1993, p. 88) Complexity is of course inherently systemic in nature. But the connotations in the commonly associated term, ‘dynamical systems theory’, should already indicate to the reader that it will not be susceptible to accusations of a-historical, static, de-contextualised, functionalist—and, by implication, conservative—analytic perspectives. As Byrne (1998, p. 51) reminds us, ‘What is crucially important about [complexity] is that it is systemic without being conservative. On the contrary, the dynamics of complex systems are inherently dynamic and transformational’.
  • Managing Complexity of Information Systems
    • Pirmin P. Lemberger, Mederic Morel (Authors)
    • 2013 (Publication Date)
    • Wiley-ISTE (Publisher)
    Chapter 2 Complexity, Simplicity, and Abstraction
    Recursion is the root of computation since it trades description for time.
    Alan Jay Perlis — Epigrams on Programming  

    2.1. What does information theory tell us?

    We start our journey through complexity and simplicity concepts with mathematics or, more precisely, information theory. This might seem an exotic topic if what we have in mind are applications to the IT world. However, the concepts that will be at the core of our future preoccupations (information, randomness, and especially complexity) have all been under close scrutiny by mathematicians for more than half a century now. In their hands, these concepts have evolved into a set of ideas, which is both deep and robust. Moreover, information theory is actually one of those few areas where mathematics succeeded in rigorously formalizing imprecise, almost philosophical concepts, such as complexity and information, to which they bring a unique insight. It would thus seem unreasonable for us to overlook this body of knowledge altogether. These information theory concepts form a collection of metaphors that will help us build a healthy intuition that will serve us later when we venture into less rigorous but more practical IT concepts. As we shall see, this first look at the subject, through mathematical glasses, also highlights a number of important issues and limitations, which occur as soon as one seriously attempts to define complexity.
    As information theory is a highly technical and abstract topic, we can barely afford here to do more than just scratch the surface. We shall strive to present in plain language the major findings in information theory of relevance to us. The interested reader will find more details in Appendix 1 .
    Our quick overview of information theory will focus on only three concepts: Shannon's entropy, K-complexity, and Bennett's logical depth. Assume for simplicity's sake that any object or system, whose complexity we wish to define, is described by a binary sequence s such as 001101110… The three concepts mentioned above have one important point in common: they all evaluate the complexity of a system as the quantity of information that its description s contains, assuming that we have a specific goal in mind for s. This goal, as we shall see, is a foreshadowing, in the restricted mathematical context, of the concept of value that we shall examine in Chapter 3
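
    Of the three measures just named, Shannon's entropy is the easiest to compute directly for a binary description such as 001101110… The small function below is a generic sketch added for this page (its name and the example strings are assumptions), not code from the book.

    ```python
    from collections import Counter
    from math import log2

    def shannon_entropy(s: str) -> float:
        """Empirical Shannon entropy of the string s, in bits per symbol."""
        counts = Counter(s)
        total = len(s)
        return sum(-(c / total) * log2(c / total) for c in counts.values())

    # A constant sequence carries no information per symbol; a sequence with
    # balanced, unpredictable symbols approaches 1 bit per symbol.
    print(shannon_entropy("000000000"))   # 0.0
    print(shannon_entropy("001101110"))   # about 0.99 (five 1s, four 0s)
    ```
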
  • Applying Complexity Theory

    Whole Systems Approaches to Criminal Justice and Social Work

    • Aaron Pycroft, Clemens Bartollas (Authors)
    • 2014 (Publication Date)
    • Policy Press (Publisher)
    ONE: Complexity Theory: an overview. Aaron Pycroft. Introduction. A whole range of physical, biological, psychological and social systems constitute our lives, largely influencing how, when and where we are born, what the quality of our lives and lived experiences will be, and ultimately how we will die, and what we leave behind. Of course, in our efforts to survive and flourish, we have a tendency to try and reduce uncertainty to provide us with at least the illusion of control as we try to navigate the multitude of systems in which we live, leading to an innately reductionist approach (see Jennings, Chapter Two). The same applies if we work in public services such as criminal justice or social work, when we are probably more used to thinking about systems in the formal sense of partnership/multi-agency working, or team work for example; but, even then, what is the extent of our observation and understanding of what constitutes systems or the contribution that we make to the overarching, under-arching or whole systems of criminal justice or social work? What is the totality of our contribution to the outcomes from those systems, not just for the people that we directly work with, but for those from whom we are further removed or who are unknown to us, and, indeed, how would we know what our impact has been, and could it be measured in any meaningful way? In asking these questions and in attempting to provide a framework for answering them, one of the key arguments of Complexity Theory is that we need to understand that, as individuals, we are constitutive components of the various systems (as individual human beings, we are also a system) that we live and work in, whether we are conscious of it or not; furthermore, we need to understand that we impact upon the behaviour of that whole system, including that which lies beyond our immediate and observable environs.
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.