Deep Learning By Example
Ahmed Menshawy

Book Information

Grasp the fundamental concepts of deep learning using Tensorflow in a hands-on manner.

About This Book
  • Get first-hand experience of the deep learning concepts and techniques with this easy-to-follow guide
  • Train different types of neural networks using Tensorflow for real-world problems in language processing, computer vision, transfer learning, and more
  • Designed for those who believe in the concept of 'learn by doing', this book is a perfect blend of theory and code examples

Who This Book Is For
This book targets data scientists and machine learning developers who wish to get started with deep learning. If you know what deep learning is but are not quite sure of how to use it, this book will help you as well. An understanding of statistics and data science concepts is required. Some familiarity with Python programming will also be beneficial.

What You Will Learn
  • Understand the fundamentals of deep learning and how it is different from machine learning
  • Get familiarized with Tensorflow, one of the most popular libraries for advanced machine learning
  • Increase the predictive power of your model using feature engineering
  • Understand the basics of deep learning by solving a digit classification problem of MNIST
  • Demonstrate face generation based on the CelebA database, a promising application of generative models
  • Apply deep learning to other domains like language modeling, sentiment analysis, and machine translation

In Detail
Deep learning is a popular subset of machine learning, and it allows you to build complex models that are faster and give more accurate predictions. This book is your companion to take your first steps into the world of deep learning, with hands-on examples to boost your understanding of the topic.
This book starts with a quick overview of the essential concepts of data science and machine learning which are required to get started with deep learning. It introduces you to Tensorflow, the most widely used machine learning library for training deep learning models. You will then work on your first deep learning problem by training a deep feed-forward neural network for digit classification, and move on to tackle other real-world problems in computer vision, language processing, sentiment analysis, and more. Advanced deep learning models such as generative adversarial networks and their applications are also covered in this book.
By the end of this book, you will have a solid understanding of all the essential concepts in deep learning. With the help of the examples and code provided in this book, you will be equipped to train your own deep learning models with more confidence.

Style and approach
A step-by-step guide filled with multiple examples to help you get started with data science and deep learning.

Information

Year: 2018
ISBN: 9781788395762
Edition: 1
Category: Neural Networks

TensorFlow in Action - Some Basic Examples

In this chapter, we will explain the main computational concept behind TensorFlow, the computational graph model, and demonstrate how to get on track by implementing linear regression and logistic regression.
The following topics will be covered in this chapter:
  • Capacity of a single neuron
  • Activation functions
  • Feed-forward neural network
  • The need for a multilayer network
  • TensorFlow terminologies—recap
  • Linear regression model—building and training
  • Logistic regression model—building and training
We will start by explaining what a single neuron can actually do/model, and based on this, the need for a multilayer network will arise. Next, we will elaborate on the main concepts and tools used within TensorFlow and how to use them to build simple examples such as linear regression and logistic regression.
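Before diving in, here is a minimal sketch of the computational graph idea, assuming the TensorFlow 1.x graph-and-session API that the book works with; it is not taken from the book, and the node names and values are illustrative only:

import tensorflow as tf

# Build the graph: nothing is computed yet, we only declare
# operations and how they feed into each other.
a = tf.placeholder(tf.float32, name='a')
b = tf.placeholder(tf.float32, name='b')
c = tf.add(a, b, name='c')          # c = a + b
d = tf.multiply(c, 3.0, name='d')   # d = 3 * (a + b)

# Run the graph: a session evaluates the requested node,
# feeding concrete values for the placeholders.
with tf.Session() as sess:
    print(sess.run(d, feed_dict={a: 2.0, b: 4.0}))  # 18.0

The linear regression and logistic regression examples later in the chapter follow the same pattern: define the graph first, then evaluate it inside a session.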

Capacity of a single neuron

A neural network is a computational model that is mainly inspired by the way the biological neural networks of the human brain process incoming information. Neural networks have enabled major breakthroughs in machine learning research (deep learning, specifically) and in industrial applications, with landmark results in computer vision, speech recognition, and text processing. In this chapter, we will try to develop an understanding of a particular type of neural network called the multi-layer Perceptron.

Biological motivation and connections

The basic computational unit of our brains is called a neuron, and we have approximately 86 billion neurons in our nervous system, which are connected with approximately 10^14 to 10^15 synapses.
Figure 1 shows a biological neuron. Figure 2 shows the corresponding mathematical model. In the drawing of the biological neuron, each neuron receives incoming signals from its dendrites and then produces output signals along its axon, where the axon eventually branches out and connects via synapses to other neurons.
In the corresponding mathematical computational model of a neuron, the signals that travel along the axons interact multiplicatively with the dendrites of the other neuron in the system based on the synaptic strength at that synapse, which is represented by a weight w. The idea is that the synaptic weights/strengths w get learned by the network, and they're the ones that control the influence of a specific neuron on another.
Also, in the basic computational model in Figure 2, the dendrites carry the signal to the main cell body where it sums them all. If the final result is above a certain threshold, the neuron can fire in the computational model.
Also, it is worth mentioning that we need to control the frequency of the output spikes along the axon, so we use something called an activation function. Practically, a common choice of activation function is the sigmoid function σ, since it takes a real-valued input (the signal strength after the sum) and squashes it to be between 0 and 1. We will see the details of these activation functions in the following section:
Figure 1: Computational unit of the brain (http://cs231n.github.io/assets/nn1/neuron.png)
The following is the corresponding basic mathematical model for the biological one:
Figure 2: Mathematical modeling of the Brain's computational unit (http://cs231n.github.io/assets/nn1/neuron_model.webp)
The basic unit of computation in a neural network is the neuron, often called a node or unit. It receives input from some other nodes or from an external source and computes an output. Each input has an associated weight (w), which is assigned on the basis of its importance relative to other inputs. The node applies a function f (which we define later) to the weighted sum of its inputs.
So, the basic computational unit of neural networks in general is called a neuron/node/unit.
This neuron receives its input from previous neurons or even an external source and then does some processing on this input to produce a so-called activation. Each input to this neuron is associated with its own weight w, which represents the strength of this connection and hence the importance of this input.
So, the final output of this basic building block of the neural network is the sum of the inputs weighted by their importance w, and the neuron then passes this weighted sum through an activation function.
Figure 3: A single neuron
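To make the computation in Figure 3 concrete, the following NumPy sketch (illustrative, not the book's code) implements a single neuron: a weighted sum of the inputs, plus a bias term as is common in practice, passed through a sigmoid activation:

import numpy as np

def sigmoid(z):
    # Squashes a real-valued input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(x, w, b):
    # Weighted sum of the inputs plus a bias, then the activation f
    z = np.dot(w, x) + b
    return sigmoid(z)

# Illustrative inputs, weights, and bias
x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.4, 0.1, -0.6])   # one weight per input
b = 0.2                          # bias term
print(neuron_output(x, w, b))    # a single value between 0 and 1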

Activation functions

The output of the neuron is computed as shown in Figure 3 and then passed through a function f that introduces non-linearity into the output; this f is called an activation function. The main purposes of activation functions are to:
  • Introduce nonlinearity into the output of a neuron. This is important because most real-world data is nonlinear and we want neurons to learn these nonlinear representations.
  • Squash the output to be in a specific range.
Every activation function (or nonlinearity) takes a single number and performs a certain fixed mathematical operation on it. There are several activation functions you may encounter in practice.
So, we are going to briefly cover the most common activation functions.
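As a quick illustration (a sketch, not quoted from the book), the following NumPy definitions show three activation functions commonly encountered in practice, together with the ranges they squash their input into; the next section starts with the sigmoid:

import numpy as np

def sigmoid(z):
    # Maps any real value into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Maps any real value into the open interval (-1, 1)
    return np.tanh(z)

def relu(z):
    # Keeps positive values as they are and clamps negatives to 0
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # values strictly between 0 and 1
print(tanh(z))     # values strictly between -1 and 1
print(relu(z))     # [0. 0. 2.]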

Sigmoid

Historically, the sigmoid activation function has been widely used among researchers. This function accepts a ...
