
eBook - ePub
Neural Network Modeling
Statistical Mechanics and Cybernetic Perspectives
- 256 pages
- English
- ePUB (mobile friendly)
- Available on iOS & Android
About this book
Neural Network Modeling offers a cohesive approach to the statistical mechanics and principles of cybernetics as a basis for neural network modeling. It brings together neurobiologists and the engineers who design intelligent automata to understand the physics of collective behavior pertinent to neural elements and the self-control aspects of neurocybernetics. The theoretical perspectives and explanatory projections portray the most current information in the field, some of which counters certain conventional concepts in the visualization of neuronal interactions.
Yes, you can access Neural Network Modeling by P. S. Neelakanta and Dolores DeGroff in PDF and/or ePUB format, as well as other popular books in Technology & Engineering & Software Development. We have over one million books available in our catalogue for you to explore.
CHAPTER 1
Introduction
1.1 General
The interconnected biological neurons and the network of their artificial counterparts have been modeled in physioanatomical perspectives, largely via cognitive considerations and in terms of physical reasoning based on the statistical mechanics of interacting units. The overall objective of this book is to present a cohesive and comprehensive compendium elaborating the considerations of statistical mechanics and cybernetic principles in modeling real (biological) neurons as well as neuromimetic artificial networks. While the perspectives of statistical mechanics on neural modeling address the physics of interactions associated with the collective behavior of neurons, the cybernetic considerations describe the science of optimal control over complex neural processes. The purpose of this book is, therefore, to highlight the common intersection of statistical mechanics and cybernetics with the universe of the neural complex in terms of the associated stochastical attributions.
In state-of-the-art data-processing systems, neuromimetic networks have gained only limited acceptance, largely because fragmentary knowledge of neurological systems has consistently impeded realistic mathematical modeling of the associated cybernetics. Modern information processing adopts biological perspectives on neurons only halfway; by contrast, the corresponding high-level and intelligent processing in nature is carried out through the self-organizing architecture of real neurons. Such architectures are hierarchically structured on the basis of interconnection networks which represent the inherent aspects of neuronal interactions.
In order to move away from this pseudo-parasitical attitude, notionally dependent on but practically untied to biological realities, the true and total revolution warranted in application-based artificial neurons is to develop a one-to-one correspondence between artificial and biological networks. Such a handshake would "smear" the mimicking artificial system with the wealth of complex automata, the associated interaction physics, and the cybernetics of the biological neurons, in terms of information-processing mechanisms with unlimited capabilities.
For decades, the lack of in-depth knowledge of biological neurons and the nervous system has inhibited the development of artificial networks in the image of real neurons. Further impediments have stemmed from inadequate and/or superficial physicomathematical descriptions of biological systems that undermine their total capabilities, descriptions which can only be dubbed insufficient for the requirements of advances in modern information-processing strategies.
However, if the real neurons and artificial networks are viewed through common perspectives via physics of interaction and principles of cybernetics, perhaps the superficial wedlock between the biological considerations and artificial information processing could be harmonized through a binding matrimony with an ultimate goal of realizing a new generation of massively parallel information processing systems.
This book is organized to elucidate all those strands and strings of biological intricacies and to suggest the physicomathematical modeling of neural activities in the framework of statistical mechanics and cybernetic principles. Newer perspectives are projected for the conception of better artificial neural networks more akin to biological systems. In Section 1.2, a broad outline of the state-of-the-art aspects of interaction physics and stochastical perspectives of the neural system is presented, together with a review of the relevant implications for information processing. Section 1.3 introduces the fundamental considerations in depicting the real (or the artificial) neural network via cybernetic principles, and the basics of the control and self-control organization inherent to the neural system are indicated. The commonness of various sciences, including statistical mechanics and cybernetics, in relation to complex neural functions is elucidated in Section 1.4; concluding remarks are furnished in Section 1.5.
1.2 Stochastical Aspects and Physics of Neural Activity
The physics of neuronal activity, the proliferation of communication across the interconnected neurons, the mathematical modeling of the neuronal assembly, and the physioanatomical aspects of neurocellular parts have been the topics of inquisitive research and in-depth studies over the past few decades. The cohesiveness of the biological and physical attributions of neurons has been considered in the underlying research to elucidate a meaningful model that portrays not only the mechanism of physiochemical activities in the neurons, but also the information-theoretic aspects of neuronal communication. With the advent of developments such as the electron microscope, microelectrodes, and other signal-processing strategies, it has become feasible in modern times to study in detail the infrastructure of neurons and the associated (metabolic) physiochemical activities, which manifest as measurable electrical signals that proliferate across the interconnected neural assembly.
The dynamics of neural activity and communication/signal-flow considerations together with the associated memory attributions have led to the emergence of so-called artificial neurons and development of neural networks in the art of computational methods.
Whether it is the "real neuron" or its "artificial" version, the basis of its behavior has been depicted mathematically on the core criterion that neurons (real or artificial) represent a system of units interconnected in a random fashion. Therefore, the associated characterizations depict stochastical variates in the sample-space of the neural assembly. That is, the neural network inherently depicts a set of local constraints implemented as connection strengths in a stochastical network. The stochastical attributes of a biological neural complex also stem from the fact that neurons may sometimes become active spontaneously, without an external stimulus, or even when the synaptic excitation does not exceed the activation threshold. This phenomenon is not just a thermal effect; it may be due to the random emission of neurotransmitters at the synapses.
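This picture of threshold firing plus spontaneous, transmitter-driven activity can be sketched in a few lines. The function and parameter names below are illustrative assumptions, not notation from the text:

```python
import random

def stochastic_neuron(synaptic_input, threshold=1.0, p_spontaneous=0.01):
    """Dichotomous neuron state: fires (returns 1) when synaptic
    excitation exceeds the activation threshold; otherwise it may
    still fire spontaneously with a small probability, modeling
    random emission of neurotransmitters at the synapses."""
    if synaptic_input > threshold:
        return 1
    return 1 if random.random() < p_spontaneous else 0
```

Setting `p_spontaneous = 0` recovers a purely deterministic threshold unit; a nonzero value makes even a sub-threshold neuron a stochastical variate, as described above.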
Further, the activities of such interconnected units closely resemble those of physical entities such as atoms and molecules in condensed matter. Therefore, it has been a natural choice to model neurons as exhibiting characteristics analogous to those of interacting atoms and/or molecules; several researchers have hence logically pursued statistical mechanics considerations in predicting the neurocellular statistics. Such studies broadly refer to the stochastical aspects of the collective response and the statistically unified activities of neurons viewed in the perspectives of different algorithmic models; each time, certain newer considerations have been presented in such modeling strategies, refining the existing heuristics and portraying better insights into the collective activities via appropriate stochastical descriptions of the neuronal activity.
The subject of stochastical attributions to the neuronal sample-space has historically been researched from two perspectives, namely, characterizing the response of a single (isolated) neuron and analyzing the behavior of a set of interconnected neurons. The central theme of research pursued in depicting the single neuron in a statistical framework refers to the characteristics of spike generation (such as the interspike-interval distribution) in neurons. Significantly, relevant studies encompass temporal firing patterns analyzed in terms of stochastical system considerations such as random-walk theory. For example, Gerstein and Mandelbrot [2] applied random-walk models to the spike activity of a single neuron, and modal analysis of renewal models for spontaneous single-neuron discharges was advocated by Feinberg and Hochman [3]. Further considered in the literature are the Markovian attributes of spike trains [4] and the application of time-series methods and power-spectral analysis to neuronal spikes [5]. Given the complexity of neural activity, an accurate model of single-neuron stochastics has not, however, emerged yet; continued efforts are still on the floor of research in this intriguing area, despite the number of interesting publications which have surfaced to date. The vast and scattered literature on stochastic models of spontaneous activity in single neurons has been comprehensively compiled as a set of lecture notes by Sampath and Srinivasan [6].
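The random-walk picture applied by Gerstein and Mandelbrot can be illustrated with a minimal simulation: the membrane potential performs a biased walk driven by excitatory and inhibitory inputs and a spike fires on first reaching the threshold; the interspike interval is the first-passage time. This is a generic sketch under assumed parameters, not the authors' formulation:

```python
import random

def interspike_interval(threshold=10, p_excite=0.6, max_steps=100_000):
    """Membrane potential as a biased random walk: each step an
    excitatory (+1) or inhibitory (-1) input arrives; the neuron
    spikes when the potential first reaches the threshold.
    Returns the number of steps taken (the interspike interval)."""
    potential, steps = 0, 0
    while potential < threshold and steps < max_steps:
        potential += 1 if random.random() < p_excite else -1
        steps += 1
    return steps

random.seed(0)
intervals = [interspike_interval() for _ in range(1000)]
mean_isi = sum(intervals) / len(intervals)
```

With a net drift of `2 * p_excite - 1 = 0.2` per step, the mean first-passage time to a threshold of 10 is about 50 steps, and the empirical distribution of `intervals` is skewed with a long tail, which is qualitatively the shape reported for interspike-interval histograms.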
The statistics of the all-or-none (dichotomous) firing characteristics of a single neuron have been studied in terms of logical, randomly bistable elements. McCulloch and Pitts in 1943 [7] pointed out an interesting isomorphism between the input-output relations of idealized (two-state) neurons and the truth functions of symbolic logic. The relevant analytical aspects have since been used profusely in the stochastical considerations of interconnected networks.
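The McCulloch-Pitts isomorphism can be made concrete in a few lines: a two-state threshold unit reproduces truth functions of symbolic logic for suitable weight and threshold settings. The helper below is a generic sketch, not the original 1943 notation:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Idealized two-state neuron: fires (1) iff the weighted sum
    of its binary inputs reaches the threshold, else stays silent (0)."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

# Truth functions realized purely by choice of threshold:
def AND(a, b):
    return mcculloch_pitts((a, b), (1, 1), threshold=2)

def OR(a, b):
    return mcculloch_pitts((a, b), (1, 1), threshold=1)
```

A single such unit cannot realize every truth function (XOR being the classical counterexample), which is one reason interconnected networks of these units became the object of study.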
While the stochastical perspectives of an isolated neuron formed a class of research by itself, randomly connected networks containing an arbitrary number of neurons have been studied as a distinct class of scientific investigation, with the main objective of elucidating information flow across the neuronal assembly. Hence, the randomness or entropical aspects of activities in the interconnected neurons and the "self-re-exciting firing activities" emulating the memory aspects of the neuronal assembly have provided a scope to consider neuronal communication as a prospective research avenue [8]; to date, the information-theoretic memory considerations and, more broadly, the neural-computation analogy have served as the bases for a comprehensive and expanding horizon of intense research. In all these approaches there is, however, one common denominator, namely, the stochastical attributes with probabilistic considerations forming the basis for any meaningful analytical modeling and mathematical depiction of neuronal dynamics. That is, the global electrical activity in the neuron (or in the interconnected neurons) is considered essentially as a stochastical process.
More intriguingly, the interaction of the neurons (in the statistical sample-space) corresponds closely to the complicated dynamic interactions perceived in molecular or atomic ensembles. Therefore, an offshoot of research on the neuronal assembly emerged historically to identify and correlate, on a one-to-one basis, the collective response of neurons with the physical characteristics of interacting molecules and/or atoms. In other words, the concepts of classical and statistical mechanics, the associated principles of thermodynamics, and the global functions such as the Lagrangian, the Hamiltonian, the total entropy, and the action have also become theoretical tools in the science of neural activity and neural networks. Thus, from the times of Wiener [9], Gabor [10], and Griffith [11-14] to the current date, a host of publications has appeared in the relevant literature; however, there remain many incomplete strategies in the formulations, several unexplained idealizations, and a few analogies with inconsistencies in the global modeling of neural activities vis-à-vis the stochastical considerations associated with the interaction physics.
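One widely used instance of this spin-system analogy (offered here as a generic illustration, not as the book's own formulation) is an Ising-like Hamiltonian in which the dichotomous neuron states play the role of spins and the connection strengths play the role of couplings:

```python
def network_energy(states, weights):
    """Ising-like Hamiltonian for a symmetric network:
    E = -1/2 * sum_{i != j} w_ij * s_i * s_j, with s_i in {-1, +1}.
    Lower energy corresponds to states compatible with the couplings."""
    n = len(states)
    return -0.5 * sum(weights[i][j] * states[i] * states[j]
                      for i in range(n) for j in range(n) if i != j)

# Two mutually excitatory units: aligned firing states have lower energy.
w = [[0, 1], [1, 0]]
aligned = network_energy([+1, +1], w)   # both units co-active
opposed = network_energy([+1, -1], w)   # units in conflict
```

Stochastic (e.g. thermally motivated) dynamics that tend to lower this energy are what connect the neural model to the thermodynamic functions named above.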
Within the framework of depicting the neural assembly as a system of interconnected cells, the activities associated with the neurons can be viewed, in general, as a collective stochastical process characterized by a random proliferation of state transitions across the interconnected units. Several questions then arise: whether the conventional modeling of neuronal interaction(s) as analogous to interacting magnetic spins is totally justifiable (and, if not, what the alternative approach is); whether the probabilistic progression of neuronal states can be described by an analogy of momentum flow (in line with particle dynamics) or by an analog model of a wave function; and how the stochastical modeling of noise-perturbed neural dynamics and the informatic aspects considered in the entropy plane of neurocybernetics are to be addressed. These are the newer perspectives which can be viewed from an exploratory angle through statistical mechanics and cybernetic considerations. A streamline of the relevant bases is as follows:
- A closer look at the existing analogy between networks of neurons and aggregates of interacting spins in magnetic systems. Evolution of an alternative analogy by considering the neurons as molecular free-point dipoles (as in liquid crystals of the nematic phase with a long-range orientational order) to obviate any prevalent inconsistencies of the magnetic-spin analogy [15].
- Identifying the class of orientational anisotropy (or persistent spatial long-range order) in the neural assembly to develop a nonlinear (squashed) input-output relation for a neural cell; and application of the relevant considerations in modeling a neural network with a stochastically justifiable sigmoidal function [16].
- Viewing the progression of state-transitions across a neuronal assembly (consisting of a large number of interconnected cells each characterized by a dichotomous potential state) as a collective ...
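The squashed input-output relation mentioned in the second item above is commonly realized as a sigmoid mapping unbounded net input to a bounded firing rate. The logistic form below is one standard choice, shown for illustration only; the stochastically justified version derived in the text [16] differs in detail:

```python
import math

def sigmoid(activation, gain=1.0):
    """Nonlinear (squashed) input-output relation: maps an
    unbounded net activation to a bounded output in (0, 1).
    The gain controls the steepness around the origin."""
    return 1.0 / (1.0 + math.exp(-gain * activation))
```

In the limit of large gain the sigmoid approaches the all-or-none (dichotomous) step response of the idealized two-state neuron, so the squashing function interpolates between graded and binary firing behavior.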
Table of contents
- Cover
- Title Page
- Copyright Page
- Dedication
- Table of Contents
- Chapter 1: Introduction
- Chapter 2: Neural and Brain Complex
- Chapter 3: Concepts of Mathematical Neurobiology
- Chapter 4: Pseudo-Thermodynamics of Neural Activity
- Chapter 5: The Physics of Neural Activity: A Statistical Mechanics Perspective
- Chapter 6: Stochastical Dynamics of the Neural Complex
- Chapter 7: Neural Field Theory: Quasiparticle Dynamics and Wave Mechanics Analogies of Neural Networks
- Chapter 8: Informatic Aspects of Neurocybernetics
- Appendix A: Magnetism and the Ising Spin-Glass Model
- Appendix B: Matrix Methods in Little's Model
- Appendix C: Overlap of Replicas and Replica Symmetry Ansatz
- Bibliography
- Subject Index