Neural Networks and Pattern Recognition

  • 351 pages
  • English
  • ePUB (mobile friendly)
  • Available on iOS & Android

About this book

This book is one of the most up-to-date and cutting-edge texts available on the rapidly growing application area of neural networks. Neural Networks and Pattern Recognition focuses on the use of neural networks in pattern recognition, a very important application area for neural network technology. The contributors are widely known and highly respected researchers and practitioners in the field.
  • Features neural network architectures on the cutting edge of neural network research
  • Brings together highly innovative ideas on dynamical neural networks
  • Includes articles written by authors prominent in the neural networks research community
  • Provides an authoritative, technically correct presentation of each specific technical area

Neural Networks and Pattern Recognition, by Omid Omidvar and Judith Dayhoff, is available in PDF and ePUB formats.


Chapter 1

Pulse-Coupled Neural Networks

J.L. Johnson; H. Ranganath; G. Kuntimad; H.J. Caulfield

ABSTRACT

A pulse-coupled neural network using the Eckhorn linking field coupling [1] is shown to contain invariant spatial information in the phase structure of the output pulse trains. The time domain signals are directly related to the intensity histogram of an input spatial distribution and have complex phase factors that specify the spatial location of the histogram elements. Two time scales are identified. On the fast time scale the linking produces dynamic, quasi-periodic, fringe-like traveling waves [2] that can carry information beyond the physical limits of the receptive fields. These waves contain the morphological connectivity structure of image elements. The slow time scale is set by the pulse generator, and on that scale the image is segmented into multineuron time-synchronous groups. These groups act as giant neurons, firing together, and by the same linking field mechanism as for the linking waves can form quasi-periodic pulse structures whose relative phases encode the location of the groups with respect to one another. These time signals are a unique, object-specific, and roughly invariant time signature for their corresponding input spatial image or distribution [3].
The details of the model are discussed, giving the basic Eckhorn linking field, extensions, generation of time series in the limit of very weak linking, invariances from the symmetries of the receptive fields, time scales, waves, and signatures. Multirule logical systems are shown to exist on single neurons. Adaptation is discussed. The pulse-coupled nets are compatible with standard nonpulsed adaptive nets rather than competitive with them in the sense that any learning law can be used. Their temporal nature results in adaptive associations in time as well as over space, and they are similar to the time-sequence learning models of Reiss and Taylor [4]. Hardware implementations, optical and electronic, are reviewed. Segmentation, object identification, and location methods are discussed and current results given. The conjugate basic problem of transforming a time signal into a spatial distribution, comparable in importance to the transformation of a spatial distribution into a time signal, is discussed. It maps the invariant time signature into a phase versus frequency spatial distribution and is the spatial representation of the complex histogram. A method of generating this map is discussed. Image pattern recognition using this network is shown to have the power of syntactical pattern recognition and the simplicity of statistical pattern recognition.
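The relationship between pixel intensity and pulse timing that underlies these time signatures can be sketched with a minimal, illustrative pulse generator (not the chapter's full model): a constant input fires whenever it exceeds an exponentially decaying threshold, so brighter inputs pulse at shorter periods, and the distribution of pulse periods across an image reproduces its intensity histogram in the time domain. All parameter values here are assumptions chosen for illustration.

```python
import numpy as np

def pulse_times(s, steps=100, aT=0.2, VT=5.0):
    """Firing times of a single, uncoupled pulse generator driven by a
    constant intensity s. Decay rate aT and threshold gain VT are
    illustrative values, not taken from the chapter."""
    theta, times = 0.0, []
    for t in range(steps):
        if s > theta:            # pulse when the input exceeds the threshold
            times.append(t)
            theta += VT          # the threshold jumps after each pulse...
        theta *= np.exp(-aT)     # ...and decays exponentially between pulses
    return times

# Brighter inputs pulse more often (shorter periods) than dim ones.
bright, dim = pulse_times(1.0), pulse_times(0.4)
```

Collecting the pulse periods over all pixels of an image would then give a time-domain representation of the intensity histogram, which is the first ingredient of the time signature described above.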

1 Introduction

The linking field model of Eckhorn et al. [1] was proposed as a minimal model to explain the experimentally observed synchronous feature-dependent activity of neural assemblies over large cortical distances in the cat cortex [5]. It is a cortical model. It emphasizes synchronizations of oscillatory spindles that occur in the limit of strong linking fields and distinguishes two major types: (1) forced, or stimulus-locked, synchronous activity and (2) induced synchronous activity. Forced activity is produced by abrupt temporal changes such as movement. Induced activity occurs when the pulse train structure of the outputs of groups of cells is similar [6]. The model is called “linking field” because it uses a secondary receptive field’s input to modulate a primary receptive field’s input by multiplication in order to obtain the necessary coupling that links the pulse activity into synchronicity.
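The multiplicative modulation described above can be sketched in the discrete-time notation commonly used for pulse-coupled networks derived from the Eckhorn model: a feeding input F (primary receptive field), a linking input L (secondary receptive field), internal activity U = F(1 + βL), and a dynamic threshold. The receptive fields, decay constants, and gains below are illustrative assumptions, not values from the chapter.

```python
import numpy as np

def neighbor_sum(Y):
    # Sum of each neuron's 8 nearest neighbours (zero padding at the
    # border), standing in for both receptive fields in this sketch.
    P = np.pad(Y.astype(float), 1)
    return (P[:-2, :-2] + P[:-2, 1:-1] + P[:-2, 2:] +
            P[1:-1, :-2]               + P[1:-1, 2:] +
            P[2:, :-2]  + P[2:, 1:-1]  + P[2:, 2:])

def pcnn_step(S, F, L, theta, Y, beta=0.2,
              aF=0.1, aL=1.0, aT=0.5, VF=0.1, VL=0.5, VT=20.0):
    """One discrete-time update of an Eckhorn-style linking-field array.
    S: stimulus image; F: feeding; L: linking; theta: dynamic threshold;
    Y: previous pulse output. All constants are illustrative."""
    W = neighbor_sum(Y)
    F = np.exp(-aF) * F + VF * W + S      # primary (feeding) receptive field
    L = np.exp(-aL) * L + VL * W          # secondary (linking) receptive field
    U = F * (1.0 + beta * L)              # linking modulates feeding by multiplication
    Y = (U > theta).astype(float)         # pulse wherever activity exceeds threshold
    theta = np.exp(-aT) * theta + VT * Y  # threshold jumps after a pulse, then decays
    return F, L, theta, Y
```

Iterating this step on an input image produces the pulse trains whose synchronization behavior is analyzed in the rest of the chapter; the linking term βL is the coupling that pulls neighbouring neurons' pulses together.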
This paper is concerned with the behavior of the linking field model in the limit of weak-to-moderate linking strengths [2],[7]. Strong linking is characterized by synchronous bursts of pulses. When the linking strength is reduced, the neurons no longer fire in bursts but still have a high degree of phase and frequency locking. This is the regime of moderate linking strength. Further reduction continuously lowers the degree of linking to a situation where locking can occur only for small phase and frequency differences. This is the weak linking regime. A major result of this research is the finding that in the weak linking regime it is possible to encode spatial input distributions into corresponding temporal patterns with enough structure...
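The effect of linking strength can be illustrated with just two coupled pulse generators, each linking on the other's pulse. In this sketch (illustrative parameters and a deliberately simplified model, not the chapter's), the threshold test is repeated within each time step so that a fresh pulse can capture the other neuron in the same step: with β = 0 the two inputs give different pulse periods, while a moderate-to-strong β lets the faster neuron capture the slower one into a synchronous group.

```python
import numpy as np

def run_pair(beta, steps=200, s=(1.0, 0.5), aT=0.2, VT=5.0):
    """Two pulse generators coupled through U = F * (1 + beta * L),
    where each neuron's linking input L is the other's pulse in the
    current step (iterated until no new pulses appear). Returns the
    fraction of pulsing steps on which both neurons pulse together.
    All parameter values are illustrative assumptions."""
    F = np.asarray(s, float)      # constant feeding input = stimulus
    theta = np.zeros(2)           # dynamic thresholds
    coincident = fired = 0
    for _ in range(steps):
        Y = np.zeros(2)
        while True:               # capture sweep within one time step
            L = Y[::-1]           # each neuron links on the other's pulse
            U = F * (1.0 + beta * L)
            Ynew = np.maximum(Y, (U > theta).astype(float))
            if (Ynew == Y).all():
                break
            Y = Ynew
        theta = np.exp(-aT) * theta + VT * Y  # jump after a pulse, then decay
        if Y.any():
            fired += 1
            coincident += int(Y.all())
    return coincident / max(fired, 1)
```

With β = 0 the two neurons coincide only when their independent periods happen to align, so the synchrony fraction is small; with β = 1 the stronger neuron's pulse lifts the weaker one over its threshold every cycle and the pair fires as a single group, consistent with the locking behavior described above.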

Table of contents

  1. Cover image
  2. Title page
  3. Table of Contents
  4. Copyright page
  5. Preface
  6. Contributors
  7. Chapter 1: Pulse-Coupled Neural Networks
  8. Chapter 2: A Neural Network Model for Optical Flow Computation
  9. Chapter 3: Temporal Pattern Matching Using an Artificial Neural Network
  10. Chapter 4: Patterns of Dynamic Activity and Timing in Neural Network Processing
  11. Chapter 5: A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons
  12. Chapter 6: Finite State Machines and Recurrent Neural Networks — Automata and Dynamical Systems Approaches
  13. Chapter 7: Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error
  14. Chapter 8: Using SONNET 1 to Segment Continuous Sequences of Items
  15. Chapter 9: On the Use of High-Level Petri Nets in the Modeling of Biological Neural Networks
  16. Chapter 10: Locally Recurrent Networks: The Gamma Operator, Properties, and Extensions
  17. Index