LECTURE NOTES FOR MP204
- Lecture 1: pp 1-11 (26th Feb/3rd March 1999)
The vector space axioms, examples: ℝ^{n}, set of all functions f : S->ℝ, simple deductions from the axioms, subspaces, linear combinations, examples of subspaces: <v_{1},...,v_{n}> (the subspace spanned by v_{1},...,v_{n}), R(A), C(A), N(A) (the row, column and null spaces of a matrix A), numerical examples of null spaces, subspace of Fibonacci sequences.
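As a concrete companion to the null-space examples, here is a small Python sketch (the helper names `rref` and `null_space_basis` are mine, not from the notes) that reads a basis for N(A) off the reduced row echelon form, one basis vector per free column, using exact rational arithmetic:

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (list of lists) to reduced row echelon form.
    Returns (R, pivot_columns)."""
    R = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(R[0])):
        pivot = next((i for i in range(r, len(R)) if R[i][c] != 0), None)
        if pivot is None:
            continue
        R[r], R[pivot] = R[pivot], R[r]
        R[r] = [x / R[r][c] for x in R[r]]
        for i in range(len(R)):
            if i != r and R[i][c] != 0:
                R[i] = [a - R[i][c] * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
    return R, pivots

def null_space_basis(rows):
    """Basis for N(A): one vector per free (non-pivot) column."""
    R, pivots = rref(rows)
    n = len(rows[0])
    basis = []
    for free in (c for c in range(n) if c not in pivots):
        v = [Fraction(0)] * n
        v[free] = Fraction(1)
        for r, p in enumerate(pivots):
            v[p] = -R[r][free]
        basis.append(v)
    return basis

# Example: A has rank 2, so its null space in R^3 is one-dimensional.
A = [[1, 2, 3],
     [2, 4, 6],
     [1, 1, 1]]
print(null_space_basis(A))   # one vector, (1, -2, 1)
```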
- Lecture 2: pp 12-15 (5th March 1999)
Linear dependence of a list of vectors, connections with AX=0 having a nontrivial solution, linear independence, columns of a square matrix A are linearly independent if and only if A is non-singular, the left-to-right test for independence, the fundamental theorem of linear algebra:
a list of n vectors, each of which is a linear combination of m given vectors, is linearly dependent if n > m,
finite-dimensional vector spaces, basis and dimension of a vector space, coordinate vector [v]_{β} of a vector v relative to a basis β.
- Lecture 3: pp 16-23 (10th March 1999)
Example of coordinate vector, columns of an n x n non-singular matrix form a basis for ℝ^{n}, change of basis matrix/change of coordinates matrix, example, a non-trivial finite-dimensional vector space has a basis, the left-to-right basis for the column space of a matrix, a basis for the row space, example.
- Lecture 4: pp 24-32 (12th March 1999)
Basis for the null space of a matrix, example, any two bases of a vector space have the same number of elements, dim(V) (the dimension of vector space V), rank(A), nullity(A), rank(A)+nullity(A)=n if A is m x n, a subspace U of a finite-dimensional vector space V is also finite-dimensional; moreover (i) dim(U) ≤ dim(V), (ii) dim(U)=dim(V) implies U=V, a linearly independent list of n vectors in a vector space V of dimension n is a basis for V, application to finding a formula for the n-th member of a Fibonacci sequence.
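The rank-nullity formula rank(A)+nullity(A)=n can be checked on any example; a minimal sketch (the `rank` helper is my own name, not from the notes) computing the rank by Gaussian elimination over the rationals:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix via Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# rank(A) + nullity(A) = n for an m x n matrix A.
A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [1, 3, 1, 1]]   # third row = first + second, so rank 2
n = len(A[0])
print(rank(A), n - rank(A))   # rank 2, nullity 2
```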
- Lecture 5: pp 33-38 (17th March 1999)
Finished discussion on Fibonacci sequences - special case of theFibonacci numbers, generalisation to k-termed linear recurrence relations, extension of a linearly independent family to a basis, the subspace U+V, dim(U+V) ≤ dim(U)+dim(V), dim(U+V)+dim(U ∩ V=dim(U)+dim(V).
- Lecture 6: pp 39-46 (19th March 1999)
Revision example for calculating bases for C(A), R(A), N(A), worked example on
the dimension formula, finding a spanning family for U ∩ V, example in ℝ^{5}, the vector space U⊕V, dim(U⊕V)=dim(U)+dim(V).
- Lecture 7: pp 47-55 (24th March 1999)
Linear transformations, example: T_{A}: ℝ^{n}->ℝ^{m}, rotations and reflections in the x-y plane, more examples of linear mappings: differentiation, the difference operator, kernel and image of a linear transformation T:U->V, (Ker(T) and Im(T)), Ker(T) is a subspace of U, Im(T) is a subspace of V, Im(T)=< T(u_{1}),...,T(u_{n})> if U=<u_{1},...,u_{n}>, Ker(T_{A})=N(A), Im(T_{A})=C(A), rank T=dim(Im(T)), nullity T=dim(Ker(T)).
- Lecture 8: pp 56-60 (26th March 1999)
Worked example on Ker(T) and Im(T), rank(T)+nullity(T)=dim(U) if T: U->V is a linear transformation, a posh proof of the formula dim(U⊕V)=dim(U)+dim(V), a linear transformation is determined by its action on a basis.
- Lecture 9: pp 61-68 (31st March 1999)
Example on the last theorem, the matrix A of a linear transformation T relative to bases β and γ, [T(u)]_{γ}=A[u]_{β}, recipes for computing bases for Ker(T) and Im(T) from A, an example.
- Lecture 10: pp 69-74 (14th April 1999)
L(U,V), the vector space of all linear transformations T: U->V, T_{1}+T_{2}, -T, the zero transformation 0, I_{V} - the identity transformation on V, T_{2}T_{1} - the composite of two transformations, T^{n}, f(T).
- Lecture 11: pp 75-82 (17th April 1999)
Injective and surjective linear transformations, T is injective if and only if Ker T={0}, T_{A} is (i) injective iff the columns of A are LI, (ii) surjective iff the rows of A are LI, generalisation of this result to T: U->V, isomorphism, theorems on isomorphisms, applications to matrices, T^{-1} - the inverse of an isomorphism.
- Lecture 12: pp 83-89 (21st April 1999)
T^{-1}: V->U is an isomorphism if T: U->V is an isomorphism, (T_{A})^{-1}=T_{A^{-1}}, connections between isomorphisms and non-singular matrices, some examples, if dim U=dim V then T: U->V is an isomorphism provided (i) T is injective or (ii) T is surjective, example.
- Lecture 13: pp 90-94 (23rd April 1999)
Change of basis theorem for the matrix of a linear transformation, application to finding all matrices such that A^{2}=A, similarity of matrices, similarity is an equivalence relation, diagonable matrices, application to finding A^{n} and hence solving recurrence relations X_{m+1}=AX_{m}.
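The use of diagonability to compute A^{n} can be illustrated on the Fibonacci matrix [[1,1],[1,0]], whose eigenvalues are the golden ratio phi and its conjugate psi. This is a sketch under those assumptions (the helper names are mine), comparing A^{n} = P D^{n} P^{-1} against plain repeated multiplication:

```python
import math

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow(A, n):
    """A^n by repeated multiplication (what diagonalisation avoids)."""
    R = [[1, 0], [0, 1]]
    for _ in range(n):
        R = mat_mul(R, A)
    return R

def mat_pow_diag(n):
    """A^n = P D^n P^{-1} for A = [[1,1],[1,0]]: eigenvalues phi, psi
    with eigenvectors (phi, 1) and (psi, 1)."""
    s = math.sqrt(5)
    phi, psi = (1 + s) / 2, (1 - s) / 2
    P = [[phi, psi], [1, 1]]
    Pinv = [[1 / s, -psi / s], [-1 / s, phi / s]]   # det P = phi - psi = sqrt 5
    Dn = [[phi ** n, 0], [0, psi ** n]]
    return mat_mul(mat_mul(P, Dn), Pinv)

A = [[1, 1], [1, 0]]
print(mat_pow(A, 10))                                    # [[89, 55], [55, 34]]
print([[round(x) for x in row] for row in mat_pow_diag(10)])  # same, via P D^n P^{-1}
```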
- Lecture 14: pp 95-102 (28th April 1999)
Solving dX/dt=AX when A is diagonable, review of determinants, evaluating determinants via row echelon form, cofactor expansion of a determinant, adj A - the adjoint of A, A*(adj A)=det(A)I_{n}=(adj A)*A, A^{-1}=(adj A)/det(A), AX=0 has a non-trivial solution iff det(A)=0.
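The adjoint formula A^{-1}=(adj A)/det(A) is easy to check exactly for a 3 x 3 matrix; a minimal sketch (helper names `det3`, `adj3`, `inverse3` are hypothetical, not from the notes) using cofactor expansion and rational arithmetic:

```python
from fractions import Fraction

def det3(A):
    """Cofactor expansion of a 3 x 3 determinant along the first row."""
    a, b, c = A[0]
    d, e, f = A[1]
    g, h, i = A[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def adj3(A):
    """adj A: the transpose of the matrix of cofactors."""
    def minor(r, c):
        rows = [A[i] for i in range(3) if i != r]
        m = [[row[j] for j in range(3) if j != c] for row in rows]
        return m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[(-1) ** (r + c) * minor(r, c) for r in range(3)] for c in range(3)]

def inverse3(A):
    """A^{-1} = (adj A)/det(A), exact over the rationals."""
    A = [[Fraction(x) for x in row] for row in A]
    d = det3(A)
    return [[x / d for x in row] for row in adj3(A)]

A = [[2, 0, 1],
     [1, 1, 0],
     [0, 1, 1]]
Ainv = inverse3(A)
# Check A * Ainv = I_3.
I = [[sum(A[i][k] * Ainv[k][j] for k in range(3)) for j in range(3)]
     for i in range(3)]
print(I)
```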
- Lecture 15: pp 103-107 (30th April 1999)
The Vandermonde determinant, det(T), where T: V->V is a linear transformation, eigenvalues and eigenvectors, A (n x n) is diagonable over ℝ iff ℝ^{n} has a basis of n eigenvectors of A, t is an eigenvalue of A iff det(tI_{n}-A)=0, ch_{A}(x)=det(xI_{n}-A) (the characteristic polynomial of A).
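The Vandermonde determinant evaluates to the product of the differences of the nodes, det = ∏_{i<j}(x_{j}-x_{i}); a brute-force sketch (helper names mine) confirming this on a 4 x 4 example via the Leibniz formula:

```python
from itertools import permutations
from math import prod

def det(A):
    """Leibniz-formula determinant (fine for small matrices)."""
    n = len(A)
    total = 0
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        total += sign * prod(A[i][perm[i]] for i in range(n))
    return total

def vandermonde(xs):
    """Rows (1, x, x^2, ..., x^{n-1}), one per node x."""
    n = len(xs)
    return [[x ** j for j in range(n)] for x in xs]

xs = [1, 2, 4, 7]
lhs = det(vandermonde(xs))
rhs = prod(xs[j] - xs[i] for i in range(4) for j in range(i + 1, 4))
print(lhs, rhs)   # both 540
```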
- Lecture 16: pp 108-115 (5th May 1999)
Cayley-Hamilton theorem, E_{A}(t)=N(tI_{n}-A) - the eigenspace of A corresponding to the eigenvalue t, g_{A}(t)=dim(E_{A}(t)) (the geometric multiplicity of t), a_{A}(t) - the algebraic multiplicity of t, g_{A}(t) ≤ a_{A}(t), eigenvectors corresponding to distinct eigenvalues are linearly independent, if A is n x n and has n distinct eigenvalues, then A is diagonable, a necessary and sufficient condition for A to be diagonable is (i) ch_{A}(x) splits completely and (ii) the geometric and algebraic multiplicities of all eigenvalues are equal, J_{n}(c) is not diagonable, 3 x 3 example.
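For a 2 x 2 matrix, ch_{A}(x)=x^{2}-tr(A)x+det(A), so Cayley-Hamilton says A^{2}-tr(A)A+det(A)I=0; a quick sketch verifying this on one example (matrix and names chosen by me for illustration):

```python
def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Cayley-Hamilton for 2 x 2: A^2 - tr(A) A + det(A) I = 0.
A = [[3, 1], [2, 5]]
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
A2 = mat_mul(A, A)
result = [[A2[i][j] - tr * A[i][j] + det * (1 if i == j else 0)
           for j in range(2)] for i in range(2)]
print(result)   # [[0, 0], [0, 0]]
```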
- Lecture 17: pp 116-121 (7th May 1999)
Decomposition into principal idempotents using the Lagrange interpolation polynomials, these form a basis for P_{n}[ℝ], calculating f(A) from the idempotent decomposition of A, direct sum of matrices, G_{A}(t) - the generalised eigenspace of A corresponding to the eigenvalue t.
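The Lagrange interpolation polynomials behave like scalar idempotents: l_{i}(x_{j})=1 if i=j and 0 otherwise, and they sum to the constant 1. A small sketch (the `lagrange_basis` helper is my own) checking both properties exactly:

```python
from fractions import Fraction

def lagrange_basis(nodes):
    """Return the Lagrange basis polynomials as callables:
    l_i(x) = prod over j != i of (x - x_j)/(x_i - x_j)."""
    def make(i):
        def l(x):
            r = Fraction(1)
            for j, xj in enumerate(nodes):
                if j != i:
                    r *= Fraction(x - xj, nodes[i] - xj)
            return r
        return l
    return [make(i) for i in range(len(nodes))]

nodes = [0, 1, 3]
ls = lagrange_basis(nodes)
# l_i(x_j) = delta_ij, and the l_i sum to 1 at any point.
print([[l(x) for x in nodes] for l in ls])   # identity pattern
print(sum(l(5) for l in ls))                 # 1
```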
- Lecture 18: pp 122-128 (12th May 1999)
G_{A}(t) is a subspace of ℝ^{n}, G_{A}(t)=N((A-tI_{n})^{aA(t)}), dim (G_{A}(t))=a_{A}(t), generalised eigenspaces corresponding to distinct eigenvalues of A are independent, the block upper triangular form algorithm, 4 x 4 example, finding A^{n} using the block upper triangular form of A.
- Lecture 19: pp 129-134 (14th May 1999)
Computing A^{n} for the 4 x 4 example, A^{n}-> 0 if all eigenvalues t of A satisfy |t| < 1, calculating (I_{n}-A)^{-1} if all eigenvalues have absolute value less than 1, 3 x 3 example of finding a block upper triangular form, a necessary condition for two matrices to be similar, the integers b_{A}(t).
- Lecture 20: pp 134-141 (19th May 1999)
Classification of Jordan forms of 3 x 3 matrices, definition of the Jordan form J_{A} of A, finding a non-singular matrix P such that P^{-1}AP=J_{A} and 4 x 4 example, 6 x 6 example of finding J_{A} only.
- Lecture 21: pp 142-146 (21st May 1999)
Real inner product spaces, examples - including the Euclidean inner product, ||v|| - the length of v, properties of vector length, Cauchy-Schwarz inequality, the triangle inequality, angle between two vectors.
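With the Euclidean inner product, Cauchy-Schwarz guarantees |<u,v>| ≤ ||u|| ||v||, which is exactly what makes the angle formula well defined; a short sketch (helper names mine) checking this and the triangle inequality:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

def angle(u, v):
    """Angle between u and v; well defined because Cauchy-Schwarz
    puts dot(u, v)/(norm(u)*norm(v)) inside [-1, 1]."""
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

u, v = [1, 2, 2], [2, 1, -2]
print(abs(dot(u, v)) <= norm(u) * norm(v))                        # Cauchy-Schwarz
print(norm([a + b for a, b in zip(u, v)]) <= norm(u) + norm(v))   # triangle
print(round(math.degrees(angle(u, v)), 2))                        # 90.0: orthogonal
```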
- Lecture 22: pp 147-151 (26th May 1999)
Orthogonal vectors, non-zero mutually orthogonal vectors are linearly independent, orthonormal family of vectors, orthogonal matrix, A is orthogonal iff the columns of A form an orthonormal family, the general 2 x 2 orthogonal matrix, the Gram matrix of a family of vectors, the Gram matrix of a family of vectors is non-singular iff the family is linearly independent, clarification: more on finding the transforming matrix P for the Jordan form.
- Lecture 23: pp 152-156 (28th May 1999)
Projection of a vector onto a subspace, application to least squares solution of AX=B, the Gram-Schmidt orthogonalisation process, finite-dimensional inner product spaces have orthonormal bases.
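The Gram-Schmidt process can be sketched in a few lines: subtract from each vector its projections onto the orthonormal vectors already built, then normalise. A minimal version for the Euclidean inner product (the `gram_schmidt` name is mine, not from the notes):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthonormalise a linearly independent list: subtract projections
    onto the earlier orthonormal vectors, then normalise."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            c = dot(w, q)
            w = [a - c * b for a, b in zip(w, q)]
        n = math.sqrt(dot(w, w))
        basis.append([a / n for a in w])
    return basis

Q = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
# The result should be orthonormal: <q_i, q_j> = delta_ij.
print([[round(dot(qi, qj), 10) for qj in Q] for qi in Q])
```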
- Lecture 24: pp 157-165 (2nd June 1999)
Example of the Gram-Schmidt process, extension of an orthonormal family to an orthonormal basis, Parseval's and Bessel's inequalities, real symmetric matrices and quadratic forms, the eigenvalues of a real symmetric matrix are real, a real symmetric matrix is diagonable, eigenvectors corresponding to distinct eigenvalues of a real symmetric matrix are mutually orthogonal, a real symmetric matrix is orthogonally diagonable, application to sketching second degree equations such as 2x^{2}+2xy+2y^{2}=1.
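The conic 2x^{2}+2xy+2y^{2}=1 mentioned above is X^{t}AX=1 for the symmetric matrix A=[[2,1],[1,2]], whose eigenvalues 3 and 1 have the orthogonal eigenvectors (1,1) and (1,-1) (easy to verify by hand); orthogonal diagonalisation turns the equation into 3x'^{2}+y'^{2}=1, an ellipse. A sketch under those eigen-data:

```python
import math

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Orthogonally diagonalise the symmetric matrix A = [[2, 1], [1, 2]]:
# normalising the eigenvectors (1, 1) and (1, -1) gives an orthogonal Q
# with Q^t A Q = diag(3, 1).
A = [[2, 1], [1, 2]]
s = 1 / math.sqrt(2)
Q = [[s, s], [s, -s]]
Qt = [[Q[j][i] for j in range(2)] for i in range(2)]
D = mat_mul(mat_mul(Qt, A), Q)
print([[round(x, 10) for x in row] for row in D])   # diag(3, 1)
```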
- Lecture 25: pp 166-171 (4th June 1999)
3 x 3 real symmetric matrix example and sketching X^{t}AX=c, positive definite matrices, A is positive definite iff A=P^{t}P for some non-singular P, A is positive definite iff all the eigenvalues of A are positive, LDU decomposition of a matrix, determinantal criterion for positive definiteness.
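The determinantal criterion (Sylvester's criterion) says a real symmetric matrix is positive definite iff every leading principal minor is positive; a brute-force sketch (helper names mine) over the rationals:

```python
from fractions import Fraction
from itertools import permutations

def det(A):
    """Leibniz-formula determinant, exact over the rationals."""
    n = len(A)
    total = Fraction(0)
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        p = Fraction(sign)
        for i in range(n):
            p *= A[i][perm[i]]
        total += p
    return total

def is_positive_definite(A):
    """Sylvester's criterion: a symmetric matrix is positive definite
    iff every leading principal minor is positive."""
    n = len(A)
    return all(det([row[:k] for row in A[:k]]) > 0 for k in range(1, n + 1))

A = [[2, -1, 0],
     [-1, 2, -1],
     [0, -1, 2]]
print(is_positive_definite(A))            # True: minors are 2, 3, 4
print(is_positive_definite([[1, 2],
                            [2, 1]]))     # False: det = -3
```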
Last updated on 3rd July 2006