MATH 307 - Applied Linear Algebra
This course explores the practical side of linear algebra, focusing on how matrix methods are used to solve real-world problems. We study matrix decompositions such as LU Decomposition, QR Decomposition, and the Singular Value Decomposition (SVD), which provide powerful tools for understanding and computing with matrices. These techniques allow us to efficiently solve Linear Systems of Equations, compute Least-Squares Approximation, and approximate Eigenvalues through iterative algorithms. Alongside the theory, we develop computational skills using Python, SciPy, and Jupyter, applying linear algebra to modern contexts such as digital signal analysis and data-driven modelling.
Topics Covered
- Systems of Equations
- Orthogonality
- Eigenvalues
- Discrete Fourier Transforms
Examples
Systems of Equations
- Gaussian Elimination With A Parameter - Given a system with a parameter, find the values of the parameter such that the system has no solution, infinitely many solutions, or a unique solution.
- Find LU Decomposition - Find the $L$ and $U$ matrices from $A$.
- Solve Using LU Decomposition - Given $A = LU$, solve the system $A\vec{x} = \vec{b}$ (see the sketch after this list).
- Find Condition Number - Given a series of transformation matrices, find the condition number.
- Using the General Interpolation Framework - Given a set of interpolating functions and a set of points, determine whether the points can be interpolated.
- Find Cubic Spline Interpolation Coefficients - Given a cubic spline matrix of known and unknown coefficients, determine the values of the unknown coefficients.
- Custom Cubic Interpolation - Given a set of points, define a cubic interpolation composed of just two piecewise cubic functions.
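For instance, here is a minimal sketch of solving a system via LU decomposition with SciPy; the matrix and right-hand side are made up purely for illustration:

```python
import numpy as np
from scipy.linalg import lu, lu_factor, lu_solve

# Hypothetical system A x = b, chosen only for illustration.
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
b = np.array([1., 2., 5.])

# scipy.linalg.lu returns the factorization A = P L U.
P, L, U = lu(A)

# lu_factor/lu_solve reuse the factorization to solve by forward/back substitution.
x = lu_solve(lu_factor(A), b)

print(np.allclose(A, P @ L @ U))   # True: the factors reproduce A
print(np.allclose(A @ x, b))       # True: x solves the system
print(np.linalg.cond(A))           # condition number of A
```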
Orthogonality
- Check if in Subspace - Check whether a given vector $\vec{v}$ is contained in a given span.
- Check if Linearly Dependent - Determine whether a set of vectors is linearly dependent.
- Finding Basis and Dimension - Find a basis for the span of a set of vectors and its dimension.
- Basis and Dimension of Null Space - Find a basis for, and the dimension of, the null space of a given matrix.
- Basis and Dimension of Range - Find a basis for, and the dimension of, the range of a given matrix.
- Finding Orthogonal Subspaces - Find all subspaces orthogonal to a given subspace $U$.
- Finding Fundamental Subspaces - Find all fundamental subspaces of a given matrix $A$.
- Calculating Projection Matrices - Find the Projection Matrix and Orthogonal Projection Matrix of a subspace $U$.
- Find Shortest Distance - Find the shortest distance between a vector $\vec{v}$ and a subspace $U$.
- Find QR Decomposition - Find the QR decomposition of a given matrix, then project a vector onto the range of the matrix (see the sketch after this list).
- Fitting a Linear Function - Fit a linear function to a given data set using least-squares approximation.
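As a sketch of the QR example (with a made-up matrix and vector): the columns of the reduced $Q$ form an orthonormal basis for the range, so projecting onto it is just a pair of matrix-vector products:

```python
import numpy as np

# Hypothetical matrix A and vector v, for illustration only.
A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
v = np.array([6., 0., 0.])

# Reduced QR: the columns of Q are an orthonormal basis for range(A).
Q, R = np.linalg.qr(A)

# The projection of v onto range(A) is Q (Q^T v); the norm of the
# residual is the shortest distance from v to the subspace.
proj = Q @ (Q.T @ v)
dist = np.linalg.norm(v - proj)
print(proj, dist)
```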
Eigenvalues
- Find SVD - Find the SVD of a given matrix.
- Conduct PCA - Conduct PCA on a given dataset.
- Using Power Method - Use the power method to find the dominant eigenvalue of a given matrix (see the sketch after this list).
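A minimal power-method sketch; the matrix and iteration count are arbitrary choices for illustration:

```python
import numpy as np

def power_method(A, iters=500):
    # Repeatedly multiply by A and renormalize; the iterate aligns with the
    # eigenvector belonging to the eigenvalue of largest magnitude.
    x = np.random.default_rng(0).random(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x = x / np.linalg.norm(x)   # renormalize to avoid overflow
    lam = x @ A @ x                  # Rayleigh quotient estimate of the eigenvalue
    return lam, x

A = np.array([[2., 1.],
              [1., 3.]])
lam, _ = power_method(A)
print(lam)   # close to the dominant eigenvalue of A
```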
Lecture Summary
(Lecture 1) We start by defining exactly what Linear Systems of Equations are, then we learn how to transform those systems into Compact Systems and define Augmented Matrices. Next we learn Row-Echelon Form (REF) and Elementary Row Operations. Finally we use row operations to perform Gaussian Elimination on systems.
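As a rough illustration of those row operations in code, here is a minimal NumPy sketch of Gaussian elimination with partial pivoting on an augmented matrix; it assumes a square, nonsingular system, and the example system is made up:

```python
import numpy as np

def gaussian_eliminate(A, b):
    # Form the augmented matrix [A | b].
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    for k in range(n):
        # Elementary row operations: swap in the largest pivot, then
        # eliminate the entries below it.
        p = k + np.argmax(np.abs(M[k:, k]))
        M[[k, p]] = M[[p, k]]
        for i in range(k + 1, n):
            M[i] -= (M[i, k] / M[k, k]) * M[k]
    # Back substitution on the resulting row-echelon form.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (M[i, -1] - M[i, i + 1:n] @ x[i + 1:]) / M[i, i]
    return x

A = np.array([[2., 1.], [1., 3.]])
b = np.array([3., 5.])
print(gaussian_eliminate(A, b))   # matches np.linalg.solve(A, b)
```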
(Lecture 2) We start by defining what a Pivot is. We then learn what Rank is and its consequences. Finally we start LU Decomposition, where we first need to learn about Upper & Lower Triangular Matrices and Unit Upper & Lower Triangular Matrices.
(Lecture 3) We finish learning about LU Decomposition, work through examples, and learn its benefits.
(Lecture 4) We go over reminders about LU Decomposition and then introduce the concept of Relative Error Analysis; this includes the use of Vector Norms and Matrix Norms to find the Condition Number, as well as a discussion of Max & Min Stretch.
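A small sketch of these quantities in NumPy (the matrix is arbitrary): the induced 2-norm gives the max stretch, the reciprocal of the inverse's norm gives the min stretch, and their ratio is the condition number:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])

# Induced 2-norm of A = max stretch; 1 / ||A^-1|| = min stretch.
max_stretch = np.linalg.norm(A, 2)
min_stretch = 1 / np.linalg.norm(np.linalg.inv(A), 2)

# Condition number: kappa(A) = ||A|| * ||A^-1|| = max stretch / min stretch.
kappa = np.linalg.cond(A, 2)
print(np.isclose(kappa, max_stretch / min_stretch))   # True
```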
(Lecture 5) We go over examples of finding the Condition Number and then learn the Properties of the Condition Number. Next we introduce the concept of Interpolating Functions and start learning about Polynomial Interpolation.
(Lecture 6) We finish our discussion of Polynomial Interpolation, introduce the concept of the General Interpolation Framework, and then introduce Cubic Spline Interpolation.
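For a concrete (hypothetical) example, SciPy's `CubicSpline` builds the piecewise cubic coefficients directly; `bc_type='natural'` is one common boundary-condition choice and may differ from the convention used in lecture:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical data points to interpolate.
x = np.array([0., 1., 2., 3.])
y = np.array([1., 2., 0., 2.])

# Natural cubic spline: piecewise cubics with continuous first and second
# derivatives, and zero second derivative at the endpoints.
spline = CubicSpline(x, y, bc_type='natural')

print(spline(1.5))      # interpolated value between the data points
print(spline.c.shape)   # (4, 3): 4 coefficients for each of the 3 pieces
```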
(Lecture 7) We finish our discussion of interpolation and go over an example of Custom Cubic Interpolation. We then start reviewing Subspaces, Linear Combinations, Span, Linear Dependence, and Basis.
(Lecture 8) We do an example of finding a Basis and its Dimension, which finishes our discussion of Basis. We then learn about the Null Space and Range and go over a few theorems related to these concepts, including the Rank-Nullity Theorem, which relates the two subspaces.
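A quick sketch verifying the Rank-Nullity Theorem numerically on a made-up matrix, using SciPy's `null_space`:

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical 3x4 matrix with a nontrivial null space (row 3 = row 1 + row 2).
A = np.array([[1., 2., 0., 1.],
              [0., 0., 1., 1.],
              [1., 2., 1., 2.]])

rank = np.linalg.matrix_rank(A)   # dimension of the range
N = null_space(A)                 # orthonormal basis for the null space

# Rank-Nullity: rank(A) + dim(null(A)) = number of columns.
print(rank + N.shape[1] == A.shape[1])   # True
```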
(Lecture 9) We start by defining the Inner Product and look at its properties. We then define Orthogonality using the inner product. Next we define Orthogonal Sets, Orthogonal Subspaces, and the Orthogonal Complement. We then relate the properties of the orthogonal complement to the subspaces introduced last lecture.
(Lecture 10) We start by introducing the idea of Fundamental Subspaces; we then go over an example of finding the fundamental subspaces of a matrix.
~~ End of MT1 Content ~~
(Lecture 11) We start by reviewing the Projection Matrix by going over some basic properties. We then look at how to project onto an orthogonal subspace (the Orthogonal Projection Matrix). We also see that we can express any vector in an orthogonal basis using Theorem 11.4. Finally we learn an important relation between the Projection Matrix and the Orthogonal Projection Matrix. We conclude the lecture by going over an example of how to calculate the Projection & Orthogonal Projection Matrices.
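As a sketch of that calculation (with a made-up basis, and taking the Orthogonal Projection Matrix to mean the projector onto the orthogonal complement, matching the relation mentioned above):

```python
import numpy as np

# Columns of A span a hypothetical subspace U of R^3.
A = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])

# Projection matrix onto U: P = A (A^T A)^-1 A^T (columns of A independent).
P = A @ np.linalg.inv(A.T @ A) @ A.T

# Projector onto the orthogonal complement: I - P, so P + P_perp = I.
P_perp = np.eye(3) - P

# Key properties: P is symmetric and idempotent.
print(np.allclose(P, P.T), np.allclose(P @ P, P))
```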
(Lecture 12) Shortened due to MT1. We start by learning the Gram-Schmidt Method. We then go over a simple example.
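A minimal sketch of the classical Gram-Schmidt Method on hypothetical vectors (assumed linearly independent; no safeguards for near-dependence):

```python
import numpy as np

def gram_schmidt(vectors):
    # Orthonormalize the vectors by subtracting, from each one, its
    # projections onto the previously built basis vectors.
    basis = []
    for v in vectors:
        w = v - sum((u @ v) * u for u in basis)   # remove components along earlier u's
        basis.append(w / np.linalg.norm(w))       # normalize the remainder
    return basis

vs = [np.array([1., 1., 0.]), np.array([1., 0., 1.])]
for u in gram_schmidt(vs):
    print(u)
```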
(Lecture 13) We start by reviewing Gram-Schmidt; we then look at the Projection Theorem and go over an example. We then look at Orthogonal Matrices and their properties. Next we look at an example of an orthogonal matrix, the Reflection Matrix.
(Lecture 14) We start by defining QR Decomposition; we then learn how to find the $Q$ and $R$ matrices.
(Lecture 15) We start by defining the Least-Squares Approximation method. We then learn two ways of finding the least-squares approximation: Method 1 uses the Normal Equations and Method 2 uses the QR Equations. Next we look at Fitting Models to Data using least-squares approximation.
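A sketch of both methods on a made-up data set, fitting a line $y = c_0 + c_1 t$; both should agree for a well-conditioned design matrix:

```python
import numpy as np
from scipy.linalg import solve_triangular

# Hypothetical data: fit y = c0 + c1*t by least squares.
t = np.array([0., 1., 2., 3.])
y = np.array([1., 3., 4., 4.])
A = np.column_stack([np.ones_like(t), t])   # design matrix

# Method 1: Normal Equations  (A^T A) c = A^T y
c_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Method 2: QR Equations  R c = Q^T y
Q, R = np.linalg.qr(A)
c_qr = solve_triangular(R, Q.T @ y)

print(np.allclose(c_normal, c_qr))   # True: both give the same fit
```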
(Lecture 16) We start by defining Eigenvalues and Eigenvectors; we then look at how to calculate each and learn what the Characteristic Polynomial is. Next we look at Algebraic Multiplicity and Geometric Multiplicity and how they relate to the Characteristic Polynomial. We conclude the lecture by introducing Diagonalization.
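A small numerical sketch with an arbitrary matrix: `np.linalg.eig` returns the eigenvalues and eigenvectors, from which the diagonalization $A = XDX^{-1}$ can be checked:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])

# Eigenvalues are the roots of the characteristic polynomial det(A - lambda*I).
evals, evecs = np.linalg.eig(A)

# Diagonalization: A = X D X^-1 when there are enough independent eigenvectors.
D = np.diag(evals)
X = evecs
print(np.allclose(A, X @ D @ np.linalg.inv(X)))   # True
```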
(Lecture 17) We start by defining Orthogonal Diagonalization; we then look at the Spectral Theorem and its consequences, before defining the Singular Value Decomposition (SVD) and going over the motivation for and calculation of its constituent matrices.
(Lecture 18) We continue with the Singular Value Decomposition, this time looking at the construction of the SVD, and then go over its important properties.
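A sketch of the construction on a made-up matrix: `np.linalg.svd` returns $U$, the singular values, and $V^T$, and reassembling them reproduces $A$:

```python
import numpy as np

# Hypothetical rectangular matrix.
A = np.array([[3., 2., 2.],
              [2., 3., -2.]])

# Full SVD: A = U Sigma V^T with orthogonal U, V and nonnegative singular values.
U, s, Vt = np.linalg.svd(A)

# Rebuild Sigma as a 2x3 matrix holding the singular values on its diagonal.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A, U @ Sigma @ Vt))   # True
print(s)                                # singular values, largest first
```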
(Lecture 19) This lecture we introduce an application of the SVD, Principal Component Analysis (PCA). We then introduce Pseudo-Inverse Matrices.
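A minimal PCA-via-SVD sketch on synthetic random data; the centering step and variance formula follow the usual convention, which may differ in detail from lecture:

```python
import numpy as np

# Hypothetical dataset: rows are samples, columns are features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# PCA via SVD: center the data, then the right singular vectors are the
# principal directions and the singular values give the variance captured.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

components = Vt                      # principal directions (rows)
explained_var = s**2 / (len(X) - 1)  # variance along each direction
scores = Xc @ Vt.T                   # data expressed in the principal axes
print(explained_var)
```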
~~ End of MT2 Content ~~
(Lecture 20) This lecture we finish our discussion of Pseudo-Inverse Matrices; we then look at Least-Squares Approximation with Pseudo-Inverse Equations. Next we look at the Power Method for finding the dominant eigenvalue, and the Rayleigh Quotient.
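A quick sketch of least squares via the pseudo-inverse on the same kind of made-up fitting problem; it should agree with `np.linalg.lstsq`:

```python
import numpy as np

# Hypothetical fitting problem: y ~ c0 + c1*t.
t = np.array([0., 1., 2., 3.])
y = np.array([1., 3., 4., 4.])
A = np.column_stack([np.ones_like(t), t])

# The pseudo-inverse A+ (built from the SVD) gives the least-squares solution directly.
c = np.linalg.pinv(A) @ y

print(np.allclose(c, np.linalg.lstsq(A, y, rcond=None)[0]))   # True
```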
(Lecture 21) Shortened due to MT2. We introduce the concept of the Discrete Fourier Transform but focus our attention on a review of Complex Numbers, including Euler's Formula. We then introduce Complex Vectors and the Complex Inner Product.
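As a sketch of where this is headed (with conventions that may differ from lecture; here $\omega = e^{-2\pi i/N}$), the DFT matrix can be built from powers of a root of unity, and the Complex Inner Product conjugates its first argument:

```python
import numpy as np

N = 4
n = np.arange(N)

# DFT matrix built from powers of the N-th root of unity w = e^{-2*pi*i/N}.
w = np.exp(-2j * np.pi / N)
F = w ** np.outer(n, n)

x = np.array([1., 2., 3., 4.])
print(np.allclose(F @ x, np.fft.fft(x)))   # True: matches NumPy's FFT convention

# Complex inner product: conjugate the first argument (np.vdot does this).
u = np.array([1 + 1j, 2j])
v = np.array([3, 1 - 1j])
print(np.vdot(u, v))   # <u, v> = sum(conj(u_i) * v_i)
```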