Brain and Nature-Inspired Learning, Computation and Recognition

  1. 788 pages
  2. English

About this book

Brain and Nature-Inspired Learning, Computation and Recognition presents a systematic analysis of neural networks, natural computing, machine learning, and compressed sensing, along with algorithms and applications inspired by the brain and by biological mechanisms found in nature. Sections cover new developments, main applications, algorithms, and simulations. Advances in brain- and nature-inspired learning have stimulated interest in image processing, clustering, change detection, control theory, and other disciplines. The book discusses the main problems and applications of bio-inspired computation and recognition, introducing algorithm implementation, model simulation, and the practical setting of parameters. Readers will find solutions to problems in computation and recognition, particularly in neural networks, natural computing, machine learning, and compressed sensing. This volume offers a comprehensive and well-structured introduction to brain- and nature-inspired learning, computation, and recognition.

  • Presents a systematic introduction to brain- and nature-inspired learning, computation, and recognition
  • Describes the biological mechanisms, mathematical analyses, and scientific principles behind brain- and nature-inspired learning, computation, and recognition
  • Systematically analyzes neural networks, natural computing, machine learning, and compressed sensing, with algorithms and applications inspired by the brain and biological mechanisms found in nature
  • Discusses the theory and application of algorithms in neural networks, natural computing, machine learning, and compressed sensing

Brain and Nature-Inspired Learning, Computation and Recognition, by Licheng Jiao, Ronghua Shang, Fang Liu, and Weitong Zhang, is available in PDF and ePUB formats and is classified under Technology & Engineering (Engineering General).
Chapter 1

Introduction

Abstract

In recent years, brain- and nature-inspired algorithms have emerged in great numbers. Neural networks, natural computing, machine learning, and compressed sensing have achieved great success in image processing, clustering, and change detection, particularly in synthetic aperture radar (SAR) image segmentation, polarimetric SAR image segmentation, and community detection. Brain- and nature-inspired learning, computation, and recognition are research directions that have attracted wide attention in recent years, and they have become a main component of AI 2.0 in China. In this book, theoretical knowledge and specific applications are discussed and analyzed in detail to meet the dual needs of academic research and engineering innovation. The book systematically discusses the basic theories, algorithms, and applications of neural networks, natural computing, machine learning, and compressed sensing, so as to facilitate further research and exploration by interested readers.

Keywords

Brief introduction; Compressive sensing; Machine learning; Neural network

1.1. A brief introduction to the neural network

Over the years, scientists have explored the secrets of the human brain from various perspectives, including medicine, biology, physiology, philosophy, computer science, cognitive science, and synergetics, hoping to build artificial neurons that simulate the human brain. Out of this research, a new multidisciplinary field has formed in recent years, known as the "artificial neural network." Research into neural networks involves a wide range of disciplines, which combine with, permeate, and promote one another.
An artificial neural network (ANN) is an adaptive nonlinear dynamic system composed of a large number of simple basic elements: neurons. The structure and function of each individual neuron are relatively simple, but the collective behavior produced by large numbers of interconnected neurons is very complex. The basic structure of an artificial neural network mimics the human brain and reflects some basic characteristics of human brain function: it can adapt to its environment, summarize rules, and carry out operations, recognition, or process control. Artificial neural networks also process information in parallel, which can greatly improve working speed.
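The "simple basic element" described above can be sketched in a few lines: a neuron computes a weighted sum of its inputs plus a bias, then passes the result through a nonlinear activation function. The following minimal Python sketch uses a sigmoid activation and hypothetical input and weight values chosen purely for illustration; it is not code from the book.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs, then a nonlinearity."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias  # weighted sum
    return 1.0 / (1.0 + math.exp(-z))                       # sigmoid activation

# Hypothetical inputs and weights, for illustration only.
out = neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2)
print(out)  # a value between 0 and 1, since the sigmoid is bounded
```

A network is formed by wiring many such neurons together in layers, with each neuron's output feeding the inputs of neurons in the next layer; the complex system behavior mentioned above emerges from that composition.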

1.1.1. The development of neural networks

The development of artificial neural networks has gone through three waves: control theory from the 1940s to the 1960s [1-3], connectionism from the 1980s to the mid-1990s [4,5], and deep learning since 2006 [6,7].
In 1943, Warren McCulloch and Walter Pitts created a neural network model based on threshold logic, a mathematical algorithm [8]. This linear model distinguishes two types of input by testing whether the response output is positive or negative. The study of neural networks divides into the study of biological processes in the brain and the study of artificial intelligence (artificial neural networks).
In 1949, Hebb published The Organization of Behavior, putting forward the famous "Hebbian theory" [2]. Hebbian theory argues that when the axon of neuron A is close to neuron B and neuron A repeatedly and persistently participates in exciting neuron B, a growth process or metabolic change occurs in one or both neurons that enhances the effectiveness of neuron A in stimulating neuron B [9]. Hebbian theory was confirmed by Nobel Prize winner Eric Kandel and his animal experiments in 2000 [10]. Many later unsupervised machine learning algorithms are, to a greater or lesser extent, variants of Hebbian theory.
In 1958, Frank Rosenblatt implemented a neural network model called the "perceptron" on an IBM 704 computer [11]. The model could perform some simple visual processing tasks, and Rosenblatt believed the perceptron would eventually be able to learn, make decisions, and translate languages. In 1959, two American engineers, Widrow and Hoff [12], put forward the adaptive linear element (Adaline). This was a variation of the perceptron and one of the progenitor models of machine learning. The main difference from the perceptron is that the Adaline neuron has a linear activation function, which allows the output t...

Table of contents

  1. Cover image
  2. Title page
  3. Table of Contents
  4. Copyright
  5. Chapter 1. Introduction
  6. Chapter 2. The models and structure of neural networks
  7. Chapter 3. Theoretical basis of natural computation
  8. Chapter 4. Theoretical basis of machine learning
  9. Chapter 5. Theoretical basis of compressive sensing
  10. Chapter 6. Multiobjective evolutionary algorithm (MOEA)-based sparse clustering
  11. Chapter 7. MOEA-based community detection
  12. Chapter 8. Evolutionary computation-based multiobjective capacitated arc routing optimizations
  13. Chapter 9. Multiobjective optimization algorithm-based image segmentation
  14. Chapter 10. Graph-regularized feature selection based on spectral learning and subspace learning
  15. Chapter 11. Semisupervised learning based on nuclear norm regularization
  16. Chapter 12. Fast clustering methods based on learning spectral embedding
  17. Chapter 13. Fast clustering methods based on affinity propagation and density weighting
  18. Chapter 14. SAR image processing based on similarity measures and discriminant feature learning
  19. Chapter 15. Hyperspectral image processing based on sparse learning and sparse graph
  20. Chapter 16. Nonconvex compressed sensing framework based on block strategy and overcomplete dictionary
  21. Chapter 17. Sparse representation combined with fuzzy C-means (FCM) in compressed sensing
  22. Chapter 18. Compressed sensing by collaborative reconstruction
  23. Chapter 19. Hyperspectral image classification based on spectral information divergence and sparse representation
  24. Chapter 20. Neural network-based synthetic aperture radar image processing
  25. Chapter 21. Neural networks-based polarimetric SAR image classification
  26. Chapter 22. Deep neural network models for hyperspectral images
  27. Index