The linked PPT below gets modified as I improve it, sometimes with changes as simple as correcting typos.
Today (11/14/15), I uploaded a version that includes the eigenvalue-eigenvector decomposition. This is the foundation used in PCA, factor analysis, and many other multivariate methods that reduce the variables/dimensions in data. It is an important tool that will also help with big data, shrinking it without losing its information content.
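As a small sketch of that idea, here is how the eigendecomposition of a covariance matrix drives PCA. The data here is made up for illustration; the point is that sorting eigenvectors by eigenvalue picks out the directions of greatest variance, and projecting onto the top one reduces the dimensions.

```python
import numpy as np

# Hypothetical 2-D data: 100 points with correlated features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[2.0, 0.0], [1.5, 0.5]])
Xc = X - X.mean(axis=0)                 # center the data

# Eigendecomposition of the covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: for symmetric matrices

# Sort by descending eigenvalue; the top eigenvector is the
# direction of maximum variance (the first principal component).
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the first component: 2 dimensions reduced to 1.
scores = Xc @ eigvecs[:, :1]
```

The same recipe scales to many dimensions: keep the few components whose eigenvalues dominate, and you retain most of the information in far fewer variables.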
Today (9/4/15), I added the actual calculation of the inverse by row echelon transformations. Today (9/12/15), I added a slide on what is meant by a row echelon matrix and its properties. It is beautiful how we can dig out all the properties of linear algebra using the solving of systems of linear equations as the foundation.
The slides go all the way up to the row echelon sweeping method for solving linear equations versus finding the inverse of a matrix.
With that in mind, I found these valuable videos on elementary matrices. In Part 1, the author defines elementary matrices.
Part 2: Elementary matrices are invertible.
A square matrix is invertible if and only if it is a product of elementary matrices.
MIT – a full-semester course on linear algebra for those interested in advanced study of matrices. Lectures 2 to 34 can be found, in sequence, in the right column of the YouTube references.
Extensions of this lead one to appreciate why the row-echelon-type calculations in the linear programming method with slack variables work.
It will also lead to eigenvalues, eigenvectors, matrix decompositions, the Singular Value Decomposition, and generalized inverses of matrices.
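Those last two ideas connect directly: the SVD gives a generalized (Moore–Penrose) inverse even for rectangular matrices that have no ordinary inverse. A small sketch, using an arbitrary matrix chosen just for illustration:

```python
import numpy as np

# A tall rectangular matrix has no ordinary inverse, but its SVD
# A = U @ diag(s) @ Vt yields a generalized inverse A+ = V @ diag(1/s) @ Ut.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T   # pseudoinverse built from the SVD

# For a full-column-rank A, A_pinv @ A recovers the 2x2 identity,
# which is why the pseudoinverse solves least-squares problems.
```

NumPy's built-in `np.linalg.pinv` computes the same thing, so the construction above can be checked against it.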
Oh…all the beauty; there is beautiful order in life…