Linear System Theory
http://www.ee.washington.edu/class/547/2011aut/
Introduction
Mathematical Description of Systems
Linear Algebra
linear algebra essential to the course; dealing with real number space only
Introduction
- Matrix multiplication
- column interpretation
- A made of columns a1 to an
- C[a1 ... an] = [Ca1 ... Can]
- row interpretation
- B made of rows b1 to bn
- BD = [b1D ... bnD]' (ith row of BD is bi times D)
- row & column interpretation
- Linear independence
- linear combination of set of vectors
- dimension of linear space
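The column and row interpretations of matrix multiplication can be checked numerically; a minimal NumPy sketch (C and A are arbitrary example matrices, not from the course):

```python
import numpy as np

C = np.array([[1.0, 2.0], [3.0, 4.0]])
A = np.array([[5.0, 6.0], [7.0, 8.0]])

# column interpretation: jth column of C @ A is C times the jth column of A
col_view = np.column_stack([C @ A[:, j] for j in range(A.shape[1])])

# row interpretation: ith row of C @ A is the ith row of C times A
row_view = np.vstack([C[i, :] @ A for i in range(C.shape[0])])

assert np.allclose(col_view, C @ A)
assert np.allclose(row_view, C @ A)
```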
Basis, Representation, Orthonormalization
- Basis
- set of lin. independent vectors that span the linear space
- Rn has basis of dimension n
- Orthonormal basis
- Representation of x w.r.t basis Q
- a is the representation where x = Qa
- x is representation of itself w.r.t. basis I (identity)
- Norm of vector x
- real valued function ||x||
- properties to satisfy
- positiveness
- 0 if x = 0
- scalar mult.
- triangle inequality
- common types of norms
- Manhattan norm
- euclidean norm (default)
- infinite norm
- Orthonormalization
- orthogonal + normal
- Gram-Schmidt orthonormalization procedure
- orthonormal column matrix A
- A'A = I
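A sketch of the Gram-Schmidt procedure and the A'A = I check, assuming linearly independent input columns (the example matrix is arbitrary):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed linearly independent)."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for k in range(j):
            # subtract projection onto each earlier orthonormal column
            v = v - (Q[:, k] @ A[:, j]) * Q[:, k]
        Q[:, j] = v / np.linalg.norm(v)  # normalize
    return Q

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
Q = gram_schmidt(A)
# orthonormal-column matrix satisfies Q'Q = I
assert np.allclose(Q.T @ Q, np.eye(2))
```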
Linear Algebraic Equations
General equation
- Ax = y
- x = n vector; y = m vector
- m equation, n unknowns
- range space of A
- rank of A
- # linearly independent columns of A
- equals # linearly independent rows of A
- null space of A
- nullity of A
- # columns of A - rank of A
- existence of solutions
- IFF y is in range space of A
- exists for every y if A is full row rank (rank m)
- parametrization of solutions
- xp is a solution to Ax = y
- xp is unique if nullity is 0
- x = xp
- xp + span of null space of A if nullity > 0
- x = xp + span{n1...nk} for nullity = k
- xA = y
- analogous theorems and terminology to Ax = y
- skipped
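The rank/nullity and solution-parametrization facts can be illustrated with NumPy/SciPy; the 2x3 system below is a made-up example (full row rank, so a solution exists for every y):

```python
import numpy as np
from scipy.linalg import null_space

# m = 2 equations, n = 3 unknowns
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
y = np.array([6.0, 15.0])

rank = np.linalg.matrix_rank(A)   # 2: full row rank
nullity = A.shape[1] - rank       # 1
N = null_space(A)                 # basis of the null space (one column here)

xp = np.linalg.lstsq(A, y, rcond=None)[0]  # one particular solution
assert np.allclose(A @ xp, y)
# every xp + alpha * (null-space vector) is also a solution
assert np.allclose(A @ (xp + 2.5 * N[:, 0]), y)
```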
Equations with square matrix
- Determinant
- defining rank using determinants
- laplace expansion, cofactor, minor
- Nonsingular square matrix
- determinant is nonzero
- full rank
- inverse exists
- Inverse
- AA^-1 = A^-1A = I
- 2x2 computation
- Ax = y, A square
- if nonsingular, solution is unique
- x = A^-1 y
- Ax = 0 has nonzero solutions IFF A is singular
- solutions are the null space of A
- Otherwise, Ax = 0 --> x = 0
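A quick numerical illustration of the square-matrix cases (both example matrices are arbitrary):

```python
import numpy as np

# nonsingular case: det = 5, unique solution x = A^-1 y
A = np.array([[2.0, 1.0], [1.0, 3.0]])
y = np.array([3.0, 4.0])
x = np.linalg.solve(A, y)
assert np.allclose(A @ x, y)

# singular case: det = 0, so Sx = 0 has nonzero solutions
S = np.array([[1.0, 2.0], [2.0, 4.0]])
assert abs(np.linalg.det(S)) < 1e-12
assert np.allclose(S @ np.array([2.0, -1.0]), 0.0)  # [2, -1] is in the null space
```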
Similarity Transformation
- nxn matrix maps Rn to itself
- Representation of matrix A
- w.r.t identity basis I
- ith column of A is the representation of A*ii (ii = ith column of I)
- AI = [A*i1 A*i2 ... A*in] = [a1 a2 ... an] = A
- w.r.t general basis Q
- ith column of Abar is the representation of A*qi w.r.t. Q
- AQ = [Aq1 Aq2 ... Aqn] = [Q*abar1 Q*abar2 ... Q*abarn] = Q*Abar (abari = ith column of Abar)
- Companion form
- TODO
- Deriving representation of A w.r.t [b, Ab, A^2b] Example. TODO
- Ax = y in terms of different basis
- Abarxbar = ybar
- xbar, ybar
- x and y w.r.t. basis Q
- x = Qxbar, y = Qybar
- Abar
- Similarity transformation between A and Abar
- Ax = y --> AQxbar = Qybar --> Abarxbar = ybar
- Abar = Q^-1 A Q
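The similarity transformation can be verified numerically; A, Q, and x below are arbitrary examples:

```python
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
Q = np.array([[1.0, 1.0], [1.0, 2.0]])  # columns of Q form the new basis (invertible)
Abar = np.linalg.inv(Q) @ A @ Q         # representation of A w.r.t. basis Q

x = np.array([1.0, -1.0])
y = A @ x
# coordinates in the new basis: x = Q xbar, y = Q ybar
xbar, ybar = np.linalg.solve(Q, x), np.linalg.solve(Q, y)
# the same linear map in the new coordinates
assert np.allclose(Abar @ xbar, ybar)
```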
Jordan Form
derive a set of basis vectors so that the representation of A is diagonal/block diagonal
Definitions
- eigenvalue of A
- real/complex number l
- Ax = lx
- nxn matrix A has n eigenvalues
- not necessarily all distinct
- eigenvector of A
- x in the above equation corresponding to each eigenvalue
- not unique, any scalar multiple works
- Calculating eigenvalues/vectors
- (A-lI)x = 0
- homogenous equation, need to find nonzero solutions
- find l such that A-lI is singular
- Characteristic Equation
- det(lI-A) - monic polynomial of degree n
- A-lI is singular <--> determinant is 0
- roots of characteristic equation are the eigenvalues of A
- companion form matrix can be formed using the coefficients
- TODO
Distinct eigenvalues
- n eigenvalues -> n linearly independent eigenvectors
- Aqi = liqi
- {q1 q2 ... qn} form a basis
- diagonal matrix representation
- Abar = diagonal matrix with eigenvalues as diagonal values
- Abar = Q^-1 A Q
- easily checked using QAbar = AQ
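A check of diagonalization for distinct eigenvalues (the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # distinct eigenvalues 5 and 2
evals, Q = np.linalg.eig(A)             # columns of Q are eigenvectors
Abar = np.linalg.inv(Q) @ A @ Q         # diagonal in the eigenvector basis
assert np.allclose(Abar, np.diag(evals))
# equivalent check without inverting Q: A Q = Q Abar
assert np.allclose(A @ Q, Q @ np.diag(evals))
```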
Repeated Eigenvalues
- repeated eigenvalues of A
- eigenvalues with multiplicity higher than 1
- 1 = simple eigenvalue
- results in block-diagonal & triangular-form representation
- nxn matrix with 1 eigenvalue of multiplicity n
- solutions to (A-lI)q = 0
- #independent solutions = nullity of (A-lI)
- need (n - #independent solutions above) more solutions to form basis
- provided by generalized eigenvectors
- Generalized eigenvectors
- v is of grade n if
- (A-lI)^n v = 0
- (A-lI)^(n-1) v != 0
- chain of generalized eigenvectors
- for each independent solution to (A-lI)q = 0
- obtain needed eigenvectors of increasing grade
- eg: for n = 4, nullity = 1
- eigenvectors {v1...v4}
- (A-lI)v1 = 0
- (A-lI)^2 v2 = 0
- (A-lI)^3 v3 = 0
- (A-lI)^4 v4 = 0
- Representation of A w.r.t basis {TODO}
- jordan block of order n
- TODO
- for 1 eigenvalue and nullity = 1
- eigenvalue on diagonal and 1 on superdiagonal
- due to one chain of generalized eigenvectors
- for 1 eigenvalue and nullity = 2
- two linearly independent eigenvectors
- need (n-2) generalized eigenvectors
- two chains of generalized eigenvectors
- {v1...vk,u1...um} and k+m = n
- v and u are the two linearly independent soln.
- two jordan blocks of order k and m
- Abar = diag{J1, J2}
- for 2 eigenvalues, l1 repeated and l2 simple
- l1 nullity = 1
- one jordan block for l1, plus l2 alone in the 5th diagonal cell
- l1 nullity = 2
- two jordan blocks associated with l1
- l1 nullity = 3
- three jordan blocks (with two of them degenerated)
- l1 nullity = 4
- four degenerated jordan blocks
- degenerated jordan blocks
- jordan blocks with order 1
- defective matrix
- a matrix that cannot be diagonalized
- instead, it's converted to jordan form
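Jordan forms are tedious by hand; SymPy's `Matrix.jordan_form` computes them symbolically. A sketch with a small defective example matrix (eigenvalue 2 of multiplicity 2 but nullity 1):

```python
import sympy as sp

# defective: (A - 2I) has rank 1, so only one eigenvector for the double eigenvalue 2
A = sp.Matrix([[3, 1], [-1, 1]])
Q, J = A.jordan_form()  # A = Q J Q^-1, J in Jordan form

# one Jordan block of order 2: eigenvalue on the diagonal, 1 on the superdiagonal
assert J == sp.Matrix([[2, 1], [0, 2]])
assert sp.simplify(Q * J * Q.inv() - A) == sp.zeros(2, 2)
```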
Properties
- det(CD) = det(C)det(D)
- det(Q)det(Q^-1) = det(I) = 1
- det(A) = det(Ajordan)
- det(Ajordan) = product of all eigenvalues
- A is nonsingular IFF it has no 0 eigenvalue
- nilpotent matrix J
- (J-lI)^m = 0 for all m >= k, where k is the order of the jordan block
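These properties are easy to confirm numerically (the example matrices are arbitrary; the nilpotent one is a Jordan block minus lI):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
evals = np.linalg.eigvals(A)
# det(A) equals the product of the eigenvalues
assert np.isclose(np.linalg.det(A), np.prod(evals).real)

# nilpotent example: order-3 Jordan block with eigenvalue l, minus l*I
l = 2.0
J = l * np.eye(3) + np.eye(3, k=1)
N = J - l * np.eye(3)
assert np.allclose(np.linalg.matrix_power(N, 3), 0.0)      # (J - lI)^3 = 0
assert not np.allclose(np.linalg.matrix_power(N, 2), 0.0)  # but not earlier
```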
Functions of Square Matrix
properties of functions viewed i.t.o jordan form
Polynomials of Square Matrix
- powers of matrix A
- A^k = A*A*... (k times)
- A^0 = I
- A^k = Q Aj^k Q^-1 (since Aj = Q^-1 A Q)
- polynomial function of matix
- f(x) for some scalar x
- polynomials we've seen in high school
- f(A) for some matrix A
- same polynomial as f(x) w/ x replaced by A
- f(A) for some block diagonal A = [A1 0; 0 A2] (A1A2 are square matrix blocks)
- A^k = [A1^k 0; 0 A2^k]
- f(A) = [f(A1) 0; 0 f(A2)]
- function in terms of jordan form
- f(A) = Q f(Aj) Q^-1
- monic polynomial
- 1 as the leading coefficient
- minimal polynomial of A
- psi(l)
- monic polynomial of least degree such that psi(A) = 0 (nxn 0 matrix)
- A and its jordan form have same minimal polynomial
- calculation
- TODO
- equal to characteristic polynomial if
- all eigenvalues are distinct
- psi(A) = 0
Cayley-Hamilton Theorem
define function of A using polynomial of finite degree
- characteristic polynomial of A is satisfied by A
- c(l) = det(lI-A) --> c(A) = 0 (the nxn zero matrix)
- Minimal polynomial of A is satisfied by A
- psi(l) is a factor of c(l) and psi(A) = 0 by definition, so c(A) = q(A)psi(A) = 0
- A^n and higher powers of A can be written as linear combination of
- {I, A, ..., A^(n-1)}
- polynomial function f(A) of any degree
- can be expressed as linear combination of {I, A, ..., A^(n-1)}
- f(A) = b0*I + b1*A + ... + b(n-1)*A^(n-1)
- can be expressed even smaller as lin. comb. of {I, A, ..., A^(m-1)}
- m is degree of minimal polynomial of A
- computing h(A) = b0*I + b1*A + ... + b(n-1)*A^(n-1) for f(A)
- spectrum of A
- f(A) = h(A) if h(l) and f(l) are equal on the spectrum of A
- for repeated eigenvalues, derivatives w.r.t. l must be equal as well
- use long division
- f(l) = quotient(l)*c(l) + remainder(l)
- f(A) = quotient(A)*c(A) + remainder(A) = 0 + remainder(A) = h(A)
- use spectrum properties of A to get n equations for b's
- f(l) = quotient(l)*c(l) + remainder(l)
- f(l) = remainder(l) = h(l) when l = eigenvalue of A (since c(l) = 0 there)
- also works for higher derivatives of l
- obtain one equation per eigenvalue of A (n equations total, counting multiplicity)
- for repeated roots of order p
- obtain p equations by taking the ith derivatives (i = 0,...,p-1) of f and h and plugging in the eigenvalue
- use n equations to solve for b and compute h(A)
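The spectrum method sketched above, for a hypothetical 2x2 A with distinct eigenvalues and f(l) = l^100:

```python
import numpy as np

# arbitrary symmetric example with eigenvalues 1 and 0.5
A = np.array([[0.75, 0.25], [0.25, 0.75]])
l1, l2 = 1.0, 0.5
f = lambda l: l ** 100

# h(l) = b0 + b1*l must equal f(l) on the spectrum: two equations, two unknowns
M = np.array([[1.0, l1], [1.0, l2]])
b0, b1 = np.linalg.solve(M, np.array([f(l1), f(l2)]))

hA = b0 * np.eye(2) + b1 * A
# h(A) reproduces A^100 without ever forming a degree-100 polynomial in A
assert np.allclose(hA, np.linalg.matrix_power(A, 100))
```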
Functions of a Square Matrix
- f(l) can be any function, not just polynomials
- f(l) = h(l) on the spectrum of A
- h(l) is a polynomial of degree n-1
- n = order of A
- examples
- A^100
- f(l) = l^100
- exp(At)
- when deriving equations for repeated roots, differentiation is w.r.t. l (not t)
- exp(At) for A = jordan block of order 4
- h(l) = b0 + b1(l - l1) + b2(l - l1)^2 + b3(l - l1)^3
- allows for easy calculation of b's
- h(A) = b0 + b1(A - l1I) + b2(A - l1I)^2 + b3(A - l1I)^3
- bi = f^(i)(l1)/i!
- powers of (A - l1I) are nilpotent here (refer nilpotent matrices)
- exp(At) for A = block diagonal with 2 jordan blocks
- calculate each jordan block independently
- f(l) = (s - l)^-1 for A above, yielding (sI - A)^-1
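A sketch of the h(l) construction for exp(At) on a Jordan block of order 4 (l1 and t are arbitrary values), compared against SciPy's `expm`:

```python
import numpy as np
from scipy.linalg import expm
from math import factorial, exp

# Jordan block of order 4 with eigenvalue l1
l1, t, n = -1.0, 0.7, 4
A = l1 * np.eye(n) + np.eye(n, k=1)

# f(l) = exp(l t); its ith derivative w.r.t. l is t^i exp(l t),
# so bi = f^(i)(l1)/i! = t^i exp(l1 t)/i!
b = [t ** i * exp(l1 * t) / factorial(i) for i in range(n)]

# h(A) = sum_i bi (A - l1 I)^i; (A - l1 I) is nilpotent, so the sum stops at i = 3
N = A - l1 * np.eye(n)
hA = sum(b[i] * np.linalg.matrix_power(N, i) for i in range(n))

assert np.allclose(hA, expm(A * t))
```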
Power Series
define function of A as polynomial of infinite degree
- radius of convergence p
- f(l) = sum(i=0 to inf) {bi * l^i}
- within radius of convergence
- f(A) = sum(i=0 to inf) {bi * A^i}
- eigenvalues of A have magnitudes less than p
- for Jordan form matrix A
- infinite series reduces to same as cayley-hamilton method
- (A-lI)^k = 0 for k>=n
Taylor Series
define exp(At) using taylor series expansion
- exp(lt) = 1 + lt + l^2t^2/2! + l^3t^3/3! + ...
- exp(At) = I + At + A^2t^2/2! + A^3t^3/3! + ...
- Properties of exp(At)
- exp(zero matrix) = I
- exp(A(t1+t2)) = exp(At1)exp(At2)
- exp(At)^-1 = exp(-At)
- inverse of exp(At) obtained by changing sign of t
- d/dt{exp(At)} = A*exp(At) = exp(At)*A
- exp((A+B)t) != exp(At)exp(Bt)
- unless AB = BA (A and B commute)
- L[exp(At)] = (sI-A)^-1
- holds for all s except at eigenvalues of A
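The exp(At) properties, including the Laplace-transform identity, can be spot-checked numerically (A, t1, t2, and s are arbitrary; the integral is a crude Riemann sum, so the tolerance is loose):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # eigenvalues -1 and -2 (stable)

# exp(A(t1+t2)) = exp(At1) exp(At2), and the inverse flips the sign of t
t1, t2 = 0.3, 0.9
assert np.allclose(expm(A * (t1 + t2)), expm(A * t1) @ expm(A * t2))
assert np.allclose(np.linalg.inv(expm(A * t1)), expm(-A * t1))

# L[exp(At)] = (sI - A)^-1, checked at s = 1 by integrating exp(-st) exp(At)
s, dt, T = 1.0, 0.002, 20.0
integral = sum(np.exp(-s * t) * expm(A * t) for t in np.arange(0.0, T, dt)) * dt
assert np.allclose(integral, np.linalg.inv(s * np.eye(2) - A), atol=1e-2)
```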