Technology & Engineering
Eigenvector
An eigenvector is a nonzero vector whose direction is unchanged (or exactly reversed) when a linear transformation is applied to it; the transformation merely scales it by a factor called the eigenvalue. In technology and engineering, eigenvectors are used in applications such as image processing, data compression, vibration analysis, and machine learning. They are important for understanding the behavior of linear systems and for solving complex mathematical problems.
Written by Perlego with AI-assistance
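The scaling behavior in the definition above can be checked directly with a numerical library. A minimal sketch in Python using NumPy (the matrix here is an arbitrary illustration, not taken from any of the excerpts below):

```python
import numpy as np

# An arbitrary symmetric 2x2 matrix used purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and, as columns, unit-length
# eigenvectors of A.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # Applying A only scales v; it does not rotate it.
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigenvalues))  # [1. 3.]
```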
12 Key excerpts on "Eigenvector"
- 2014 (Publication Date)
- Learning Press (Publisher)
Chapter 6: Eigenvalue, Eigenvector & Eigenspace

In mathematics, eigenvalue, eigenvector, and eigenspace are related concepts in the field of linear algebra. The prefix eigen- is adopted from the German word eigen, meaning innate, idiosyncratic, one's own. Linear algebra studies linear transformations, which are represented by matrices acting on vectors. Eigenvalues, eigenvectors, and eigenspaces are properties of a matrix. They are computed by a method described below, give important information about the matrix, and can be used in matrix factorization. They have applications in areas of applied mathematics as diverse as economics and quantum mechanics.

In general, a matrix acts on a vector by changing both its magnitude and its direction. However, a matrix may act on certain vectors by changing only their magnitude and leaving their direction unchanged (or possibly reversing it). These vectors are the eigenvectors of the matrix. A matrix acts on an eigenvector by multiplying its magnitude by a factor, which is positive if its direction is unchanged and negative if its direction is reversed. This factor is the eigenvalue associated with that eigenvector. An eigenspace is the set of all eigenvectors that have the same eigenvalue, together with the zero vector.

These concepts are formally defined in the language of matrices and linear transformations. Formally, if A is a linear transformation, a non-null vector x is an eigenvector of A if there is a scalar λ such that Ax = λx. The scalar λ is said to be an eigenvalue of A corresponding to the eigenvector x.

Overview. In linear algebra, there are two kinds of objects: scalars, which are just numbers; and vectors, which can be thought of as arrows, and which have both magnitude and direction (though more precisely a vector is a member of a vector space).
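The direction-preserving (or direction-reversing) behavior described above is easy to check numerically. A minimal NumPy sketch with a made-up matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

v_plus = np.array([1.0, 1.0])    # eigenvector, direction unchanged
v_minus = np.array([1.0, -1.0])  # eigenvector, direction reversed
w = np.array([1.0, 0.0])         # a typical, non-eigen vector

assert np.allclose(A @ v_plus, 3.0 * v_plus)     # eigenvalue +3
assert np.allclose(A @ v_minus, -1.0 * v_minus)  # eigenvalue -1

# A @ w = [1, 2] is not a scalar multiple of [1, 0]: the direction changed.
assert not np.allclose(A @ w, (A @ w)[0] * w)
```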
- 2014 (Publication Date)
- Library Press (Publisher)
An important benefit of knowing the eigenvectors and eigenvalues of a system is that the effects of the action of the matrix on the system can be predicted. Each application of the matrix to an arbitrary vector yields a result which will have rotated towards the eigenvector with the largest eigenvalue.

Many kinds of mathematical objects can be treated as vectors: ordered pairs, functions, harmonic modes, quantum states, and frequencies are examples. In these cases, the concept of direction loses its ordinary meaning and is given an abstract definition. Even so, if this abstract direction is unchanged by a given linear transformation, the prefix eigen is used, as in eigenfunction, eigenmode, eigenface, eigenstate, and eigenfrequency.

Mathematical definition. Given a linear transformation A, a non-zero vector x is defined to be an eigenvector of the transformation if it satisfies the eigenvalue equation Ax = λx for some scalar λ. In this situation, the scalar λ is called an eigenvalue of A corresponding to the eigenvector x.

The key equation in this definition is the eigenvalue equation, Ax = λx. The vector x has the property that its direction is not changed by the transformation A; it is only scaled by a factor of λ. Most vectors x will not satisfy such an equation: a typical vector x changes direction when acted on by A, so that Ax is not a scalar multiple of x. This means that only certain special vectors x are eigenvectors, and only certain special scalars λ are eigenvalues. Of course, if A is a multiple of the identity matrix, then no vector changes direction, and all non-zero vectors are eigenvectors.

Fig. 2. A acts to stretch the vector x, not change its direction, so x is an eigenvector of A.
- 2014 (Publication Date)
- Learning Press (Publisher)
These vectors are the eigenvectors of the matrix. A matrix acts on an eigenvector by multiplying its magnitude by a factor, which is positive if its direction is unchanged and negative if its direction is reversed. This factor is the eigenvalue associated with that eigenvector. An eigenspace is the set of all eigenvectors that have the same eigenvalue, together with the zero vector.

These concepts are formally defined in the language of matrices and linear transformations. Formally, if A is a linear transformation, a non-null vector x is an eigenvector of A if there is a scalar λ such that Ax = λx. The scalar λ is said to be an eigenvalue of A corresponding to the eigenvector x.

Overview. In linear algebra, there are two kinds of objects: scalars, which are just numbers; and vectors, which can be thought of as arrows, and which have both magnitude and direction (though more precisely a vector is a member of a vector space). In place of the ordinary functions of algebra, the most important functions in linear algebra are called linear transformations, and a linear transformation is usually given by a matrix, an array of numbers. Thus instead of writing f(x) we write M(v), where M is a matrix and v is a vector. The rules for using a matrix to transform a vector are given in linear algebra.

If the action of a matrix on a (nonzero) vector changes its magnitude but not its direction, then the vector is called an eigenvector of that matrix. Each eigenvector is, in effect, multiplied by a scalar, called the eigenvalue corresponding to that eigenvector. The eigenspace corresponding to one eigenvalue of a given matrix is the set of all eigenvectors of the matrix with that eigenvalue. An important benefit of knowing the eigenvectors and eigenvalues of a system is that the effects of the action of the matrix on the system can be predicted.
Each application of the matrix to an arbitrary vector yields a result which will have rotated towards the eigenvector with the largest eigenvalue.
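The observation that repeated application of the matrix rotates an arbitrary vector towards the dominant eigenvector is exactly the idea behind the power-iteration method. A minimal sketch (the matrix and step count are arbitrary choices):

```python
import numpy as np

def power_iteration(A, num_steps=100):
    """Repeatedly apply A to a random start vector; the iterate rotates
    towards the eigenvector whose eigenvalue is largest in magnitude."""
    rng = np.random.default_rng(0)
    x = rng.standard_normal(A.shape[0])
    for _ in range(num_steps):
        x = A @ x
        x /= np.linalg.norm(x)  # renormalize so the iterate cannot overflow
    # The Rayleigh quotient x.(Ax) estimates the dominant eigenvalue.
    return x, x @ A @ x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
v, lam = power_iteration(A)
# The dominant eigenvalue of this matrix is (5 + sqrt(5)) / 2 ≈ 3.618.
```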
- 2014 (Publication Date)
- Orange Apple (Publisher)
Vibration analysis. Eigenvalue problems occur naturally in the vibration analysis of mechanical structures with many degrees of freedom. The eigenvalues are used to determine the natural frequencies (or eigenfrequencies) of vibration, and the eigenvectors determine the shapes of these vibrational modes. The orthogonality properties of the eigenvectors allow decoupling of the differential equations, so that the system can be represented as a linear summation of the eigenvectors. The eigenvalue problem of complex structures is often solved using finite element analysis.

[Figure: 1st lateral bending mode shape.]

Eigenfaces. Fig. 9. Eigenfaces as examples of eigenvectors. In image processing, processed images of faces can be seen as vectors whose components are the brightnesses of each pixel. The dimension of this vector space is the number of pixels. The eigenvectors of the covariance matrix associated with a large set of normalized pictures of faces are called eigenfaces; this is an example of principal components analysis. They are very useful for expressing any face image as a linear combination of some of them. In the facial recognition branch of biometrics, eigenfaces provide a means of applying data compression to faces for identification purposes. Research related to eigen vision systems for determining hand gestures has also been conducted. Similarly, eigenvoices represent the general direction of variability in human pronunciations of a particular utterance, such as a word in a language. Based on a linear combination of such eigenvoices, a new voice pronunciation of the word can be constructed. These concepts have been found useful in automatic speech recognition systems for speaker adaptation.

Tensor of inertia. In mechanics, the eigenvectors of the inertia tensor define the principal axes of a rigid body.
- Howard Anton, Chris Rorres (Authors)
- 2014 (Publication Date)
- Wiley (Publisher)
Chapter 5: Eigenvalues and Eigenvectors

Chapter contents: 5.1 Eigenvalues and Eigenvectors; 5.2 Diagonalization; 5.3 Complex Vector Spaces; 5.4 Differential Equations.

INTRODUCTION. In this chapter we will focus on classes of scalars and vectors known as "eigenvalues" and "eigenvectors," terms derived from the German word eigen, meaning "own," "peculiar to," "characteristic," or "individual." The underlying idea first appeared in the study of rotational motion but was later used to classify various kinds of surfaces and to describe solutions of certain differential equations. In the early 1900s it was applied to matrices and matrix transformations, and today it has applications in such diverse fields as computer graphics, mechanical vibrations, heat flow, population dynamics, quantum mechanics, and economics, to name just a few.

5.1 Eigenvalues and Eigenvectors. In this section we will define the notions of "eigenvalue" and "eigenvector" and discuss some of their basic properties.

Definition of Eigenvalue and Eigenvector. We begin with the main definition in this section. DEFINITION 1. If A is an n × n matrix, then a nonzero vector x in Rⁿ is called an eigenvector of A (or of the matrix operator T_A) if Ax is a scalar multiple of x; that is, Ax = λx for some scalar λ. The scalar λ is called an eigenvalue of A (or of T_A), and x is said to be an eigenvector corresponding to λ.

The requirement that an eigenvector be nonzero is imposed to avoid the unimportant case A0 = λ0, which holds for every A and λ. In general, the image of a vector x under multiplication by a square matrix A differs from x in both magnitude and direction. However, in the special case where x is an eigenvector of A, multiplication by A leaves the direction unchanged. For example, in R² or R³ multiplication by A maps each eigenvector x of A (if any) along the same line through the origin as x.
- Ron Larson (Author)
- 2017 (Publication Date)
- Cengage Learning EMEA (Publisher)
Eigenvalues and eigenvectors have many important applications, many of which are discussed throughout this chapter. For now, you will consider a geometric interpretation of the problem in R². If λ is an eigenvalue of a matrix A and x is an eigenvector of A corresponding to λ, then multiplication of x by the matrix A produces a vector λx that is parallel to x, as shown below.

[Figure: Ax = λx with λ > 0, where λx points in the same direction as x; and Ax = λx with λ < 0, where λx points in the opposite direction.]

Note that an eigenvector cannot be zero. Allowing x to be the zero vector would render the definition meaningless, because A0 = λ0 is true for all real values of λ. An eigenvalue of λ = 0, however, is possible. (See Example 2.)

Definitions of Eigenvalue and Eigenvector. Let A be an n × n matrix. The scalar λ is an eigenvalue of A when there is a nonzero vector x such that Ax = λx. The vector x is an eigenvector of A corresponding to λ.

REMARK: Only eigenvectors of real eigenvalues are presented in this chapter.

A matrix can have more than one eigenvalue, as demonstrated in Examples 1 and 2.

Verifying Eigenvectors and Eigenvalues. For the matrix

A = [ 2   0 ]
    [ 0  -1 ]

verify that x₁ = (1, 0) is an eigenvector of A corresponding to the eigenvalue λ₁ = 2, and that x₂ = (0, 1) is an eigenvector of A corresponding to the eigenvalue λ₂ = -1.

SOLUTION. Multiplying x₁ on the left by A produces

A x₁ = [ 2   0 ] [ 1 ]  =  [ 2 ]  =  2 [ 1 ]
       [ 0  -1 ] [ 0 ]     [ 0 ]       [ 0 ]

So, x₁ = (1, 0) is an eigenvector of A corresponding to the eigenvalue λ₁ = 2.
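The verification worked through in the excerpt above can be replayed in NumPy. A sketch using the same matrix and vectors as the excerpt:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, -1.0]])
x1 = np.array([1.0, 0.0])
x2 = np.array([0.0, 1.0])

# A x1 = (2, 0) = 2 * x1, so x1 is an eigenvector with eigenvalue 2.
assert np.allclose(A @ x1, 2.0 * x1)
# A x2 = (0, -1) = -1 * x2, so x2 is an eigenvector with eigenvalue -1.
assert np.allclose(A @ x2, -1.0 * x2)
```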
- David S. Watkins (Author)
- 2004 (Publication Date)
- Wiley-Interscience (Publisher)
Chapter 5: Eigenvalues and Eigenvectors I

Eigenvalues and eigenvectors turn up in stability theory, the theory of vibrations, quantum mechanics, statistical analysis, and many other fields. It is therefore important to have efficient, reliable methods for computing these objects. The main business of this chapter is to develop such algorithms, culminating in the powerful and elegant QR algorithm.¹ Before we embark on the development of algorithms, we take the time to illustrate (in Section 5.1) how eigenvalues and eigenvectors arise in the analysis of systems of differential equations. The material is placed here entirely for motivational purposes. It is intended to convince you, the student, that eigenvalues are important. Section 5.1 is not, strictly speaking, a prerequisite for the rest of the chapter. Section 5.1 also provides an opportunity to introduce MATLAB's eig command. When you use eig to compute the eigenvalues and eigenvectors of a matrix, you are using the QR algorithm.

¹The QR algorithm should not be confused with the QR decomposition, which we studied extensively in Chapter 3. As we shall see, the QR algorithm is an iterative procedure that performs QR decompositions repeatedly.

5.1 Systems of Differential Equations. Many applications of eigenvalues and eigenvectors arise from the study of systems of differential equations.

[Figure 5.1: Solve for the time-varying loop currents.]

Example 5.1.1. The electrical circuit in Figure 5.1 is the same as the one that was featured in Example 1.2.8, except that two inductors and a switch have been added. Whereas resistors resist current, inductors resist changes in current. If we are studying constant, unvarying currents, as in Example 1.2.8, we can ignore the inductors, since their effect is felt only when the currents are changing. However, if the currents are varying in time, we must take the inductances into account. Once the switch in the circuit is closed, current will begin to flow.
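The connection to systems of differential equations can be sketched numerically: diagonalizing A decouples x′ = Ax into independent scalar equations, one per eigenvalue. A minimal illustration (the matrix and initial condition are made up, and np.linalg.eig plays the role of MATLAB's eig):

```python
import numpy as np

# Solve x'(t) = A x(t), x(0) = x0, via the eigen-decomposition of A.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # eigenvalues -1 and -2, both real
x0 = np.array([1.0, 0.0])

lam, V = np.linalg.eig(A)      # columns of V are eigenvectors
c = np.linalg.solve(V, x0)     # expand x0 in the eigenvector basis

def x(t):
    # Each eigen-component evolves independently: c_i * exp(lam_i t) * v_i.
    return (V * np.exp(lam * t)) @ c

# Check that the formula really satisfies x' = A x (central difference).
h = 1e-6
deriv = (x(1.0 + h) - x(1.0 - h)) / (2 * h)
assert np.allclose(deriv, A @ x(1.0), atol=1e-5)
```

The same decoupling idea underlies the vibration analysis described earlier, where each eigenvector evolves as an independent mode.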
Linear Algebra: Gateway to Mathematics
Second Edition
- Robert Messer (Author)
- 2021 (Publication Date)
- American Mathematical Society (Publisher)
This can be rephrased as saying that the linear map defined in terms of multiplication by the matrix has an eigenvector s associated with the eigenvalue 1. The theory of differential equations makes extensive use of eigenvalues and eigenvectors. As a simple example that hints at this interplay, let D^(∞) be the subspace of F_ℝ consisting of all infinitely differentiable functions, and consider the differentiation operator D: D^(∞) → D^(∞). The formula D e^x = e^x tells us that the exponential function is an eigenvector of D associated with the eigenvalue 1. The formula D e^(5x) = 5e^(5x) tells us that the function e^(5x) is an eigenvector of D associated with the eigenvalue 5.

The study of linear operators on infinite-dimensional spaces is an important part of mathematics known as functional analysis. Nevertheless, in this text we will be primarily concerned with finding eigenvalues and eigenvectors of linear operators that are defined in terms of matrix multiplication. Thus, we will concentrate on linear operators defined on finite-dimensional vector spaces. The following theorem serves as a first step in this direction.

8.2 Theorem: Suppose A is an n × n matrix and T: ℝⁿ → ℝⁿ is defined by T(x) = Ax. Then the real number λ is an eigenvalue of T if and only if det(λI − A) = 0.

Proof: Rather than provide separate proofs for the two directions of the implication in the statement of this theorem, we can use the machinery we have developed to put together a string of logically equivalent statements:

λ is an eigenvalue of T
  ⟺ T(x) = λx for some nonzero x ∈ ℝⁿ
  ⟺ Ax = λx for some nonzero x ∈ ℝⁿ
  ⟺ λx − Ax = 0 for some nonzero x ∈ ℝⁿ
  ⟺ (λI − A)x = 0 for some nonzero x ∈ ℝⁿ
  ⟺ det(λI − A) = 0

where the last step follows from the equivalence of conditions g and h of Theorem 7.13.
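The theorem above is the basis of the characteristic-polynomial method for finding eigenvalues. A small numerical sketch (the matrix is an arbitrary example; for a 2×2 matrix, det(λI − A) expands to λ² − tr(A)λ + det(A)):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# det(lambda*I - A) = lambda^2 - trace(A)*lambda + det(A) for a 2x2 matrix.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)              # here: 5 and 2

# The roots of the characteristic polynomial are exactly the eigenvalues.
assert np.allclose(np.sort(roots), np.sort(np.linalg.eigvals(A)))

# And det(lambda*I - A) = 0 at each of them, as the theorem states.
for lam in roots:
    assert abs(np.linalg.det(lam * np.eye(2) - A)) < 1e-9
```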
- Pease (Author)
- 1964 (Publication Date)
- Academic Press (Publisher)
CHAPTER III: Eigenvalues and Eigenvectors

We are now ready to begin the study of the possible behavior of a matrix. The concept that we shall take as central for this purpose is the eigenvalues and their associated eigenvectors and chains of generalized eigenvectors. In this chapter, we will develop this concept itself, both analytically and by appeal to physical intuition. The rigorous development we will defer to later chapters, particularly Chapter VIII.

1. BASIC CONCEPT

We consider the equation

y = Ax    (1)

and say that A is an operation that maps the space of possible x onto the space of possible y. If A is an n × n matrix, then we can consider the linear vector space S of n-dimensional vectors, and say that A maps S onto or into itself. Now many things can happen. If we consider all x in S, then the situation can be quite complicated. However, among all of S, there may be, and in fact will be, certain vectors that behave in a particularly simple way. These vectors are simply stretched or compressed, or their phase is changed. That is, the effect of A on them simply multiplies them by some scalar. If xᵢ is such a vector, then

A xᵢ = λᵢ xᵢ    (2)

where λᵢ is a scalar quantity. A vector xᵢ that satisfies Eq. (2) is an eigenvector of A, and the corresponding scalar λᵢ is an eigenvalue. Other terms are also used. Some authors call the vectors proper vectors or characteristic vectors and the scalars proper values or characteristic values. The terms eigenvectors and eigenvalues seem to be winning out, however, in spite of their confused etymology (eigenvalue is a semi-translation of the German word "Eigenwerte"), and we shall use them here.

We can easily illustrate what is happening here from strain theory. Suppose we put a block of material under a pure compression along the y-axis. Then it expands along the x-axis, as shown in Fig. 1.
Linear Algebra
A First Course with Applications
- Larry E. Knop (Author)
- 2008 (Publication Date)
- Chapman and Hall/CRC (Publisher)
The collection of eigenvectors corresponding to an eigenvalue need not be just multiples of some vector, and that in turn raises the question: just what can we say about the collection of eigenvectors of an eigenvalue?

Definition 3: Let A be an n × n matrix, let T_A: ℝⁿ → ℝⁿ be the linear transformation defined by T_A(X) = AX, and let λ be an eigenvalue of A. The eigenspace of A (equivalently, the eigenspace of T_A) associated with λ is the set of all eigenvectors associated with λ together with the 0 vector.
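The eigenspace in Definition 3 is the null space of A − λI (together with the zero vector), which suggests one way to compute a basis for it. A sketch using the SVD (the matrix and the zero-tolerance are arbitrary choices for illustration):

```python
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam = 2.0

# The eigenspace for lam is ker(A - lam*I).  The right singular vectors
# of A - lam*I whose singular values are (numerically) zero span it.
_, s, Vt = np.linalg.svd(A - lam * np.eye(3))
rank = int(np.sum(s > 1e-10))
basis = Vt[rank:]               # rows spanning the eigenspace

# Every basis vector really satisfies A v = lam * v.
for v in basis:
    assert np.allclose(A @ v, lam * v)
print(len(basis))  # 2: the eigenspace of lam = 2 is a plane
```

This also illustrates the point of the excerpt: the eigenvectors for λ = 2 are not just multiples of one vector, since the eigenspace here is two-dimensional.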
Linear Algebra
Examples and Applications
- Alain M Robert (Author)
- 2005 (Publication Date)
- WSPC (Publisher)
It may be possible to describe the evolution of this vector by a matrix multiplication: v′ = Av. If the population changes, one may be especially concerned by the variation (or preservation) of the shape of the pyramid of ages. In the transition from one generation to the next, this shape is preserved precisely when the vector representing the population is simply multiplied by a scalar: v′ = λv. In other words, assuming that we know the matrix A, can we find a stable pyramid of ages? This problem leads to the theory explained below.

6.2 Definitions and Examples

Let us call operator any linear map from a vector space E into itself.

6.2.1 Definitions

Definition. An eigenvector of an operator T is an element v ∈ E such that v ≠ 0 and Tv is proportional to v. We then write Tv = λv with a scalar λ. A nonzero vector v is an eigenvector of T when

Tv = λv,  Tv − λv = 0,  (T − λI)v = 0,  v ∈ ker(T − λI).

Definition. An eigenvalue of an operator T is a scalar λ such that ker(T − λI) ≠ {0}. The nonzero elements of ker(T − λI) are the eigenvectors of T corresponding to the eigenvalue λ. The eigenvalues are the special values of λ such that ker(T − λI) ≠ {0}. When E is a finite-dimensional space, these are the values of λ such that the rank of T − λI is not maximal.

Definition. If λ is an eigenvalue of an operator T in a vector space E,

V_λ = {v ∈ E : Tv = λv} = ker(T − λI)

is the eigenspace of T relative to the eigenvalue λ. Its dimension

m_λ = dim V_λ = dim ker(T − λI)

is the geometric multiplicity of the eigenvalue λ. By definition,

λ is an eigenvalue of T ⟺ V_λ = ker(T − λI) ≠ {0} ⟺ m_λ ≥ 1,

and the geometric multiplicity of λ is the maximal number of linearly independent eigenvectors that can be found for this eigenvalue.
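The geometric multiplicity defined above can be computed from the rank, since dim ker(A − λI) = n − rank(A − λI) by the rank-nullity theorem. A sketch with two made-up matrices:

```python
import numpy as np

def geometric_multiplicity(A, lam, tol=1e-10):
    # dim ker(A - lam*I) = n - rank(A - lam*I), by rank-nullity.
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

# A Jordan block: the eigenvalue 3 repeats, but there is only one
# independent eigenvector, so the geometric multiplicity is 1.
J = np.array([[3.0, 1.0],
              [0.0, 3.0]])
assert geometric_multiplicity(J, 3.0) == 1

# For 3*I the whole plane consists of eigenvectors: multiplicity 2.
D = 3.0 * np.eye(2)
assert geometric_multiplicity(D, 3.0) == 2
```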
Numerical Algorithms
Methods for Computer Vision, Machine Learning, and Graphics
- Justin Solomon (Author)
- 2015 (Publication Date)
- A K Peters/CRC Press (Publisher)
Eigenvectors x and corresponding eigenvalues λ of a square matrix A are determined by the equation Ax = λx. There are many ways to see that the eigenvalue problem is nonlinear. For instance, there is a product of unknowns λ and x. Furthermore, to avoid the trivial solution x = 0, we constrain ‖x‖₂ = 1; this constraint keeps x on the unit sphere, which is not a vector space. Thanks to this structure, algorithms for finding eigenspaces will be considerably different from techniques for solving and analyzing linear systems of equations.

6.1 MOTIVATION

Despite the arbitrary-looking form of the equation Ax = λx, the problem of finding eigenvectors and eigenvalues arises naturally in many circumstances. To illustrate this point, before presenting algorithms for finding eigenvectors and eigenvalues we motivate our discussion with a few examples. It is worth reminding ourselves of one source of eigenvalue problems already considered in Chapter 1. As explained in Example 1.27, the following fact will guide many of our examples:

When A is symmetric, the eigenvectors of A are the critical points of xᵀAx under the constraint ‖x‖₂ = 1.

[Figure 6.1: (a) A dataset with correlation between the horizontal and vertical axes; (b) we seek the unit vector v̂ such that all data points are well approximated by some point along span{v̂}; (c) to find v̂, we can minimize the sum of squared residual norms ∑ᵢ ‖xᵢ − proj_v̂ xᵢ‖₂² with the constraint that ‖v̂‖₂ = 1.]

Many eigenvalue problems are constructed using this fact as a starting point.

6.1.1 Statistics. Suppose we have machinery for collecting statistical observations about a collection of items. For instance, in a medical study we may collect the age, weight, blood pressure, and heart rate of every patient in a hospital.
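The fact quoted above, that for symmetric A the unit-sphere critical points of xᵀAx are eigenvectors, can be checked empirically. A minimal sketch with a made-up symmetric matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # symmetric, so eigh applies
lam, V = np.linalg.eigh(A)          # eigenvalues in ascending order

# Sample many unit vectors: the quadratic form x^T A x always lies
# between the smallest and largest eigenvalues, which are attained
# at the corresponding eigenvectors (the constrained critical points).
rng = np.random.default_rng(0)
xs = rng.standard_normal((1000, 2))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
q = np.einsum('ij,jk,ik->i', xs, A, xs)

assert lam[0] - 1e-9 <= q.min() and q.max() <= lam[-1] + 1e-9
# At the eigenvectors themselves the form equals the eigenvalue exactly.
for i in range(2):
    v = V[:, i]
    assert np.isclose(v @ A @ v, lam[i])
```

This is the same structure exploited in the statistics example that follows: the leading eigenvector of a covariance matrix maximizes the variance vᵀCv over unit vectors v.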
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.











