The individual items in a matrix are called its elements or entries.[4] Provided that they are the same size (have the same number of rows and the same number of columns), two matrices can be added or subtracted element by element. The rule for matrix multiplication, however, is that two matrices can be multiplied only when the number of columns in the first equals the number of rows in the second. Any matrix can be multiplied element-wise by a scalar from its associated field. A major application of matrices is to represent linear transformations, that is, generalizations of linear functions such as f(x) = 4x. For example, the rotation of vectors in three-dimensional space is a linear transformation which can be represented by a rotation matrix R: if v is a column vector (a matrix with only one column) describing the position of a point in space, the product Rv is a column vector describing the position of that point after a rotation. The product of two transformation matrices is a matrix that represents the composition of two linear transformations. Another application of matrices is in the solution of systems of linear equations. If the matrix is square, it is possible to deduce some of its properties by computing its determinant. For example, a square matrix has an inverse if and only if its determinant is not zero. Insight into the geometry of a linear transformation is obtainable (along with other information) from the matrix's eigenvalues and eigenvectors.
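The rotation-matrix application described above can be sketched numerically. The sketch below is ours, in Python with NumPy (the text does not prescribe a language); R is the standard rotation about the z-axis:

```python
import numpy as np

# Rotation by 90 degrees about the z-axis in three-dimensional space
theta = np.pi / 2
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

v = np.array([[1.0], [0.0], [0.0]])  # column vector: a point on the x-axis
w = R @ v                            # position after rotation: on the y-axis

# The product R @ R represents the composition of two rotations (180 degrees)
w2 = (R @ R) @ v

print(np.round(w.ravel(), 6))   # ≈ [0, 1, 0]
print(np.round(w2.ravel(), 6))  # ≈ [-1, 0, 0]
```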
Applications of matrices are found in most scientific fields. In every branch of physics, including classical mechanics, optics, electromagnetism, quantum mechanics, and quantum electrodynamics, they are used to study physical phenomena, such as the motion of rigid bodies. In computer graphics, they are used to project a 3-dimensional image onto a 2-dimensional screen. In probability theory and statistics, stochastic matrices are used to describe sets of probabilities; for instance, they are used within the PageRank algorithm that ranks the pages in a Google search.[5] Matrix calculus generalizes classical analytical notions such as derivatives and exponentials to higher dimensions.
A major branch of numerical analysis is devoted to the development of efficient algorithms for matrix computations, a subject that is centuries old and is today an expanding area of research. Matrix decomposition methods simplify computations, both theoretically and practically. Algorithms that are tailored to particular matrix structures, such as sparse matrices and near-diagonal matrices, expedite computations in finite element method and other computations. Infinite matrices occur in planetary theory and in atomic theory. A simple example of an infinite matrix is the matrix representing the derivative operator, which acts on the Taylor series of a function
...
3. The Eigenvalue Problem
Consider an n×n matrix A
Vector equation: Ax = λx
» Seek solutions for x and λ
» λ satisfying the equation are the eigenvalues
» Eigenvalues can be real and/or complex; distinct and/or repeated
» x satisfying the equation are the eigenvectors
Nomenclature
» The set of all eigenvalues is called the spectrum
» Absolute value of an eigenvalue: λ = a + jb ⇒ |λ| = sqrt(a^2 + b^2)
» The largest of the absolute values of the eigenvalues is
called the spectral radius
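The spectrum and spectral radius can be illustrated numerically. This is a sketch in Python with NumPy (an assumed stand-in; the notes demonstrate in Matlab), using a made-up matrix with purely imaginary eigenvalues:

```python
import numpy as np

# Hypothetical example matrix with eigenvalues +2j and -2j
A = np.array([[0.0, -2.0],
              [2.0,  0.0]])

spectrum = np.linalg.eigvals(A)        # the set of all eigenvalues
spectral_radius = max(abs(spectrum))   # largest |λ| over the spectrum

# |λ| = sqrt(a^2 + b^2) for λ = a + jb; here a = 0, b = ±2, so |λ| = 2
print(spectral_radius)  # ≈ 2.0
```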
4. Determining Eigenvalues
Vector equation
» Ax = λx ⇒ (A - λI)x = 0
» A - λI is called the characteristic matrix
Non-trivial solutions exist if and only if:
» This is called the characteristic equation
Characteristic polynomial
» nth-order polynomial in λ
» Roots are the eigenvalues {λ1, λ2, …, λn}
det(A - λI) = | a11-λ   a12   …   a1n  |
              | a21    a22-λ  …   a2n  |
              |  ⋮       ⋮          ⋮  |
              | an1     an2   …  ann-λ | = 0
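The characteristic-polynomial route and a direct eigenvalue routine give the same roots. A sketch in Python with NumPy (assumed; the notes use Matlab), with the matrix from the worked example later in these notes:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, -4.0]])   # example matrix used later in these notes

# Coefficients of det(A - λI) expanded in λ: λ^2 + 3λ - 10 → [1, 3, -10]
coeffs = np.poly(A)
roots = np.roots(coeffs)      # roots of the characteristic polynomial

print(np.sort(roots))                  # eigenvalues: -5 and 2
print(np.sort(np.linalg.eigvals(A)))   # same values from the eigenvalue routine
```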
6. Eigenvalue Properties
Eigenvalues of A and A^T are equal
Singular matrix has at least one zero eigenvalue
Eigenvalues of A^-1: 1/λ1, 1/λ2, …, 1/λn
Eigenvalues of diagonal and triangular matrices are
equal to the diagonal elements
Trace
» Tr(A) = Σ_{j=1}^{n} λj (sum of the eigenvalues)
Determinant
» det(A) = Π_{j=1}^{n} λj (product of the eigenvalues)
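These properties are easy to check numerically. A sketch in Python with NumPy (assumed; the notes use Matlab), again with the example matrix from later in the notes:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, -4.0]])
lam = np.linalg.eigvals(A)   # eigenvalues: -5 and 2

# Trace equals the sum of eigenvalues: 1 + (-4) = -5 + 2 = -3
assert np.isclose(np.trace(A), lam.sum())
# Determinant equals the product of eigenvalues: -10 = (-5)(2)
assert np.isclose(np.linalg.det(A), lam.prod())
# Eigenvalues of A^T equal those of A
assert np.allclose(np.sort(np.linalg.eigvals(A.T)), np.sort(lam))
# Eigenvalues of A^-1 are the reciprocals 1/λ
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1 / lam))
print("all eigenvalue properties verified")
```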
7. Determining Eigenvectors
First determine eigenvalues: {λ1, λ2, …, λn}
Then determine eigenvector corresponding to
each eigenvalue:
Eigenvectors determined up to scalar multiple
Distinct eigenvalues
» Produce linearly independent eigenvectors
Repeated eigenvalues
» May produce fewer than n linearly independent eigenvectors
» Procedure to determine eigenvectors is more complex (see
text)
» Will demonstrate in Matlab
(A - λI)x = 0 ⇒ (A - λk I)x(k) = 0,  k = 1, 2, …, n
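One way to carry out this step numerically is to take the null-space direction of the characteristic matrix from its SVD. A sketch in Python with NumPy (assumed; the notes demonstrate in Matlab); `eigenvector_for` is a hypothetical helper name, not from the notes:

```python
import numpy as np

def eigenvector_for(A, lam, tol=1e-10):
    """Solve (A - lam*I) x = 0: the right-singular vector belonging to the
    (near-)zero singular value of the characteristic matrix spans the null space."""
    M = A - lam * np.eye(A.shape[0])
    _, s, Vt = np.linalg.svd(M)
    assert s[-1] < tol, "lam is not an eigenvalue of A"
    return Vt[-1]   # unit-norm eigenvector, determined up to sign

A = np.array([[1.0, 2.0],
              [3.0, -4.0]])
x = eigenvector_for(A, -5.0)
print(x)  # proportional to [1, -3], i.e. ±[0.3162, -0.9487]
```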
8. Eigenvector Example
A = | 1   2 |     Eigenvalues: λ1 = -5, λ2 = 2
    | 3  -4 |
Determine eigenvectors: Ax = λx
  x1 + 2x2 = λx1        (1 - λ)x1 + 2x2 = 0
  3x1 - 4x2 = λx2   ⇒   3x1 - (4 + λ)x2 = 0
Eigenvector for λ1 = -5
  6x1 + 2x2 = 0
  3x1 + x2 = 0    ⇒  x = [1; -3] or [0.3162; -0.9487]
Eigenvector for λ2 = 2
  -x1 + 2x2 = 0
  3x1 - 6x2 = 0   ⇒  x = [2; 1] or [0.8944; 0.4472]
9. Matlab Examples
>> A=[ 1 2; 3 -4];
>> e=eig(A)
e =
2
-5
>> [X,e] = eig(A)
X =
0.8944 -0.3162
0.4472 0.9487
e =
2 0
0 -5
>> A=[2 5; 0 2];
>> e=eig(A)
e =
2
2
>> [X,e]=eig(A)
X =
1.0000 -1.0000
0 0.0000
e =
2 0
0 2
10. Vector Spaces
Real vector space V
» Set of all n-dimensional vectors with real elements
» Often denoted Rn
» Element of a real vector space denoted x ∈ V
Properties of a real vector space
» Vector addition:
  a + b = b + a,  u + (v + w) = (u + v) + w
  a + 0 = a,  a + (-a) = 0
» Scalar multiplication:
  c(a + b) = ca + cb,  (c + k)a = ca + ka
  c(ka) = (ck)a,  1a = a
11. Vector Spaces cont.
Linearly independent vectors
» Elements: a(1), a(2), …, a(m) ∈ V
» Linear combination: c1 a(1) + c2 a(2) + … + cm a(m) = 0
» Vectors are linearly independent if the equation is satisfied only for c1 = c2 = … = cm = 0
Basis
» An n-dimensional vector space V contains exactly n linearly
independent vectors
» Any n linearly independent vectors form a basis for V
» Any element of V can be expressed as a linear
combination of the basis vectors
Example: unit basis vectors in R3
  x = c1 [1; 0; 0] + c2 [0; 1; 0] + c3 [0; 0; 1] = [c1; c2; c3]
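Linear independence can be tested numerically via the rank of the matrix whose columns are the candidate vectors. A sketch in Python with NumPy (assumed; the notes use Matlab):

```python
import numpy as np

# Vectors are linearly independent exactly when the matrix formed
# from them as columns has full column rank
vectors = np.column_stack([[1, 0, 0], [0, 1, 0], [0, 0, 1]])  # unit basis of R3
print(np.linalg.matrix_rank(vectors))  # 3 → independent, a basis for R3

dependent = np.column_stack([[1, 2, 3], [2, 4, 6]])  # second column = 2 * first
print(np.linalg.matrix_rank(dependent))  # 1 → linearly dependent

# Any x in R3 is a linear combination of the basis vectors
x = np.array([5.0, -2.0, 7.0])
c = np.linalg.solve(vectors, x)  # coefficients c1, c2, c3
print(c)  # for the unit basis these are just the components of x
```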
12. Inner Product Spaces
Inner product
  (a, b) = a^T b = a1b1 + a2b2 + … + anbn
Properties of an inner product space
  (q1 a + q2 b, c) = q1 (a, c) + q2 (b, c)
  (a, c) = (c, a)
  (a, a) ≥ 0, and (a, a) = 0 if and only if a = 0
Two vectors with zero inner product are called orthogonal
Relationship to vector norm
» Euclidean norm: ||a|| = sqrt((a, a)) = sqrt(a^T a) = sqrt(a1^2 + a2^2 + … + an^2)
» General norm inequalities: |(a, b)| ≤ ||a|| ||b|| (Cauchy-Schwarz) and ||a + b|| ≤ ||a|| + ||b|| (triangle inequality)
» Unit vector: ||a|| = 1
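The inner product, orthogonality, and the two norm inequalities can be checked directly. A sketch in Python with NumPy (assumed; the notes use Matlab), on made-up vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([2.0, 0.0, -1.0])

inner = a @ b              # (a, b) = a^T b = 1*2 + 2*0 + 2*(-1) = 0
norm_a = np.sqrt(a @ a)    # Euclidean norm ||a|| = sqrt(1 + 4 + 4) = 3
print(inner, norm_a)

# Zero inner product: a and b are orthogonal
assert inner == 0.0
# Cauchy-Schwarz and triangle inequalities
assert abs(a @ b) <= np.linalg.norm(a) * np.linalg.norm(b)
assert np.linalg.norm(a + b) <= np.linalg.norm(a) + np.linalg.norm(b)
```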
13. Linear Transformation
Properties of a linear operator F
  F(v + x) = F(v) + F(x),  F(cx) = cF(x)
» Linear operator example: multiplication by a matrix
» Nonlinear operator example: Euclidean norm
Linear transformation
» Elements: x ∈ R^n, y ∈ R^m
» Operator: A ∈ R^{m×n}
» Transformation: y = Ax
Invertible transformation
» Often called a coordinate transformation
» Dimensions: x, y ∈ R^n, A ∈ R^{n×n}
» Transformation: y = Ax
» Inverse transformation: x = A^-1 y
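The two linearity properties, and the failure of the norm to satisfy them, can be demonstrated on random data. A sketch in Python with NumPy (assumed; the notes use Matlab):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))   # operator A in R^{2x3}: maps R^3 to R^2
v = rng.standard_normal(3)
x = rng.standard_normal(3)
c = 2.5

# Matrix multiplication satisfies both properties of a linear operator
assert np.allclose(A @ (v + x), A @ v + A @ x)   # F(v + x) = F(v) + F(x)
assert np.allclose(A @ (c * x), c * (A @ x))     # F(cx) = c F(x)

# The Euclidean norm is nonlinear: additivity fails for non-parallel vectors
print(np.linalg.norm(v + x), np.linalg.norm(v) + np.linalg.norm(x))  # not equal
```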
14. Orthogonal Transformations
Orthogonal matrix
» A square matrix satisfying: A^T = A^-1
» Determinant has value +1 or -1
» Eigenvalues are real or complex conjugate pairs with
absolute value of unity
» A square matrix is orthonormal if its columns satisfy:
  aj^T ak = 0 if j ≠ k,  aj^T ak = 1 if j = k
Orthogonal transformation
» y = Ax where A is an orthogonal matrix
» Preserves the inner product between any two vectors:
  u = Aa, v = Ab ⇒ u · v = a · b
» The norm is also invariant to orthogonal transformation
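A rotation matrix is a convenient concrete orthogonal matrix for checking these claims. A sketch in Python with NumPy (assumed; the notes use Matlab); the angle is arbitrary:

```python
import numpy as np

theta = 0.7   # arbitrary angle; every 2-D rotation matrix is orthogonal
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A^T = A^-1 and |det(A)| = 1
assert np.allclose(A.T, np.linalg.inv(A))
assert np.isclose(abs(np.linalg.det(A)), 1.0)

a = np.array([3.0, -1.0])
b = np.array([0.5, 2.0])
u, v = A @ a, A @ b

# Inner product and norm are invariant under the orthogonal transformation
assert np.isclose(u @ v, a @ b)
assert np.isclose(np.linalg.norm(u), np.linalg.norm(a))
print("orthogonality checks passed")
```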
15. Similarity Transformations
Eigenbasis
» If an n×n matrix has n distinct eigenvalues, the
eigenvectors form a basis for Rn
» The eigenvectors of a symmetric matrix form an
orthonormal basis for Rn
» If an n×n matrix has repeated eigenvalues, the
eigenvectors may not form a basis for Rn (see text)
Similar matrices
» Two n×n matrices are similar if there exists a
nonsingular n×n matrix P such that: Â = P^-1 A P
» Similar matrices have the same eigenvalues
» If x is an eigenvector of A, then y = P^-1 x is an
eigenvector of the similar matrix Â
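Both similarity claims can be verified numerically. A sketch in Python with NumPy (assumed; the notes use Matlab), with the example matrix from earlier and an arbitrary nonsingular P:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, -4.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])            # any nonsingular matrix works
A_hat = np.linalg.inv(P) @ A @ P      # similar matrix: Â = P^-1 A P

# Similar matrices share the same eigenvalues
assert np.allclose(np.sort(np.linalg.eigvals(A_hat)),
                   np.sort(np.linalg.eigvals(A)))

# If x is an eigenvector of A, then y = P^-1 x is an eigenvector of Â
lam, X = np.linalg.eig(A)
x = X[:, 0]                           # eigenvector for lam[0]
y = np.linalg.inv(P) @ x
assert np.allclose(A_hat @ y, lam[0] * y)
print("similarity checks passed")
```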
16. Matrix Diagonalization
Assume the n×n matrix A has an eigenbasis
Form the n×n modal matrix X with the eigenvectors
of A as column vectors: X = [x1, x2, …, xn]
Then the similar matrix D = X^-1 A X is diagonal with
the eigenvalues of A as the diagonal elements:
  D = X^-1 A X = | λ1  0   …  0  |
                 | 0   λ2  …  0  |
                 | ⋮          ⋮  |
                 | 0   0   …  λn |
Companion relation: A = X D X^-1
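Diagonalization via the modal matrix can be checked end to end. A sketch in Python with NumPy (assumed; the notes use Matlab), with the example matrix from earlier:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, -4.0]])

# Modal matrix X: eigenvectors of A as columns
lam, X = np.linalg.eig(A)
D = np.linalg.inv(X) @ A @ X   # similar matrix D = X^-1 A X

# D is diagonal with the eigenvalues of A on the diagonal
assert np.allclose(D, np.diag(lam))
# Companion relation: A = X D X^-1
assert np.allclose(X @ np.diag(lam) @ np.linalg.inv(X), A)
print(np.round(D, 10))  # diagonal matrix of the eigenvalues
```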