================================================================
OVERVIEW [ ~ 6 min each ]

@ alg. vs. geom. proofs & techniques
@ Reflection matrix, ex nihilo
@ Properties of Q: algebraically
@ Determinant and matrix norm
@ Properties of Q: geometrically
@ Row reduction using Gaussian elimination
@ Gaussian elimination without pivoting
@ Gaussian elimination with pivoting
@ Computation of dets
@ Householder for upper-triangularization
@ Matrix inversion?

================================================================
@ alg. vs. geom. proofs & techniques

Theme: geometric arguments pay off even for numerical stability, which
might seem more algebraic than geometric.

================================================================
@ Reflection matrix, ex nihilo

Geom. construction: reflection through the hyperplane normal to z:
  z |--> -z
  v |--> refl'n of v

Subproblem: proj'n.  Split v = vpar + vperp.
vpar has dir'n z-hat and magnitude ||v|| cos theta.
But z.v = ||z|| ||v|| cos theta, so

  vpar = z-hat ||v|| cos theta
       = (z/||z||) (||z|| ||v|| cos theta) / ||z||
       = z (z.v)/(z.z)

Now back to reflection:

  Qv = v - 2 vpar
     = v - 2 z (z.v)/(z.z)
     = (I - 2 z z^t / (z^t z)) v

  Q = I - 2 z z^t / (z^t z)

[numpy sketch at end of notes]

================================================================
@ Properties of Q: algebraically

Symmetric:  Q = Q^t.  Easy: z z^t is symmetric.
Orthogonal: Compose (I - 2 z z^t / (z^t z)) with itself; the cross
            terms cancel, leaving Q^t Q = I.
Involutory: Same computation, since Q = Q^t.

[checked numerically in the sketch at end of notes]

================================================================
@ Determinant and matrix norm

Det 1 means volume-preserving, but *not* necessarily norm-preserving.
There are matrices with det 1 which do not preserve norm.

Shear matrix:
  [ 1 0 ]
  [ a 1 ]

Draw a picture: the unit square maps to a parallelogram of equal area,
but some vectors get stretched.  [numpy sketch at end of notes]

================================================================
@ Properties of Q: geometrically

Symmetric:   Obvious?  (It does follow from the next two:
             Q^t = Q^{-1} = Q.)
Orthogonal:  A reflection preserves lengths and angles.
Involutory:  Reflecting twice returns every vector to itself.
Determinant: Use image of unit cube: volume is preserved but
             orientation flips, so det = -1.  Also note that rigid
             rotations have det 1 (special orthogonal).
Matrix norm: Orthogonal, so norm-preserving: ||Q|| = 1.

================================================================
@ Row reduction using Gaussian elimination

Consider the elementary matrices of height 2.

Row swap: det -1, norm 1:
  [ 0 1 ]
  [ 1 0 ]

Row scaling: det m, norm max(1, m):
  [ m 0 ]
  [ 0 1 ]

Row replacement (shear): det 1, norm ??:
  [ 1 0 ]
  [ a 1 ]

[numpy sketch at end of notes]

================================================================
@ Gaussian elimination without pivoting

Do the example.  Focus on the geometry.  [numpy sketch at end of notes]

================================================================
@ Gaussian elimination with pivoting

Do the example.  Focus on the geometry.

Point: Successful Gaussian elimination REQUIRES pivoting, which is a
data-dependent decision.  Also, we must store the permutations.
[numpy sketch at end of notes]

================================================================
@ Computation of dets

From the def'n: n! terms.
If upper triangular, then product along the diagonal.
Account for determinants of the transform matrices:

  det(Gn ... G2 G1 A) = det(Gn) ... det(G2) det(G1) det(A)

The shears have det 1; multiply the det by -1 at each row swap.
Gaussian elimination ~ n^3?  Check ...
Will see HH is also ~ n^3.
[numpy sketch at end of notes]

================================================================
@ Householder for upper-triangularization

Put

  A = [ * * * ]     into the form     QA = [ * * * ]
      [ * * * ]                            [ 0 * * ]
      [ * * * ]                            [ 0 * * ]

Let v = column 0 of A, and w = column 0 of QA.
Q is norm-preserving, so ||w|| = ||v||.  So, w[0] = +/- ||v||.

What is z?  Draw picture.  Suggest z = v - w.
Prove this is correct: Q(z) = ... = -z; then Q reflects across the
hyperplane bisecting v and w, so Qv = w.
Choose the sign of w[0] to avoid cancellation error: take w[0] with
sign opposite v[0], so forming z = v - w adds magnitudes rather than
cancelling them.
Then make Q as above.
Det -1 per reflection, so there you go for the det computation.

Error analysis -- well, Q is *orthogonal*, so:
  Q(v + e) = Q(v) + Q(e), and ||Q(e)|| = ||e||.
Errors are not amplified.

Most importantly: No data-dependent decisions.  Make some reference to
dumb processors at work: a fixed, branch-free flow of operations.
[numpy sketch at end of notes]

================================================================
@ Matrix inversion?
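================================================================
@ Sketch: reflection matrix and its properties

A minimal numpy sketch of the construction above; the function name
reflector and the sample z are mine, not part of the lecture.  Build
Q = I - 2 z z^t / (z^t z), then check symmetric / orthogonal /
involutory numerically.

  import numpy as np

  def reflector(z):
      """Reflection through the hyperplane normal to z."""
      z = np.asarray(z, dtype=float)
      return np.eye(len(z)) - 2.0 * np.outer(z, z) / np.dot(z, z)

  z = np.array([3.0, 4.0])   # sample vector, my choice
  Q = reflector(z)
  I = np.eye(2)
  print(np.allclose(Q, Q.T))      # symmetric:  Q = Q^t
  print(np.allclose(Q.T @ Q, I))  # orthogonal: Q^t Q = I
  print(np.allclose(Q @ Q, I))    # involutory: Q^2 = I
  print(Q @ z)                    # z |--> -z, as constructed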
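================================================================
@ Sketch: det 1 without norm preservation

The shear from "Determinant and matrix norm", with a = 10 (my sample
value): the determinant stays 1 while the 2-norm is about 10.1.

  import numpy as np

  a = 10.0                       # sample shear amount, my choice
  S = np.array([[1.0, 0.0],
                [a,   1.0]])
  print(np.linalg.det(S))        # 1.0: area-preserving
  print(np.linalg.norm(S, 2))    # ~10.1: not norm-preserving
  e1 = np.array([1.0, 0.0])
  print(np.linalg.norm(S @ e1))  # sqrt(1 + a^2): unit vector stretched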
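================================================================
@ Sketch: dets and norms of the elementary matrices

The three height-2 matrices from "Row reduction using Gaussian
elimination"; m and a are sample values of mine.

  import numpy as np

  m, a = 3.0, 5.0                          # sample values, my choice
  P = np.array([[0.0, 1.0], [1.0, 0.0]])   # swap:  det -1, norm 1
  D = np.array([[m,   0.0], [0.0, 1.0]])   # scale: det m,  norm max(1, m)
  L = np.array([[1.0, 0.0], [a,   1.0]])   # shear: det 1,  norm ?? (grows with a)
  for M in (P, D, L):
      print(np.linalg.det(M), np.linalg.norm(M, 2))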
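================================================================
@ Sketch: elimination without pivoting goes wrong

A standard small-pivot example (my choice, not the lecture's worked
example): with a tiny pivot and no row exchange, the multiplier is
huge and the information in the second row is lost to roundoff.

  import numpy as np

  def lu_no_pivot(A):
      """LU with no row exchanges -- unstable when a pivot is tiny."""
      U = A.astype(float).copy()
      n = U.shape[0]
      L = np.eye(n)
      for k in range(n - 1):
          for i in range(k + 1, n):
              L[i, k] = U[i, k] / U[k, k]    # huge if U[k, k] is tiny
              U[i, k:] -= L[i, k] * U[k, k:]
      return L, U

  eps = 1e-20
  A = np.array([[eps, 1.0],
                [1.0, 1.0]])
  L, U = lu_no_pivot(A)
  print(L @ U)   # [[1e-20, 1], [1, 0]] -- the 1 in A[1,1] is gone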
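================================================================
@ Sketch: elimination with partial pivoting

The same example, now swapping in the largest-magnitude pivot.  Note
the two points from the notes: the pivot choice is a data-dependent
decision, and the permutation must be stored.

  import numpy as np

  def lu_partial_pivot(A):
      """Return L, U, perm with A[perm] = L @ U."""
      U = A.astype(float).copy()
      n = U.shape[0]
      L = np.eye(n)
      perm = np.arange(n)                      # stored permutation
      for k in range(n - 1):
          p = k + np.argmax(np.abs(U[k:, k]))  # data-dependent decision
          if p != k:
              U[[k, p], k:] = U[[p, k], k:]
              L[[k, p], :k] = L[[p, k], :k]
              perm[[k, p]] = perm[[p, k]]
          for i in range(k + 1, n):
              L[i, k] = U[i, k] / U[k, k]      # multiplier now <= 1
              U[i, k:] -= L[i, k] * U[k, k:]
      return L, U, perm

  eps = 1e-20
  A = np.array([[eps, 1.0],
                [1.0, 1.0]])
  L, U, perm = lu_partial_pivot(A)
  print(np.allclose(L @ U, A[perm]))   # True this time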
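================================================================
@ Sketch: det in ~n^3 via elimination

Row-reduce to upper triangular, take the product along the diagonal,
and multiply by -1 at each row swap -- versus the n! terms of the
definition.  The shears contribute det 1, so only the swaps matter.

  import numpy as np

  def det_ge(A):
      U = A.astype(float).copy()
      n = U.shape[0]
      sign = 1.0
      for k in range(n - 1):
          p = k + np.argmax(np.abs(U[k:, k]))
          if U[p, k] == 0.0:
              return 0.0                   # singular
          if p != k:
              U[[k, p], k:] = U[[p, k], k:]
              sign = -sign                 # each swap: det times -1
          for i in range(k + 1, n):
              U[i, k:] -= (U[i, k] / U[k, k]) * U[k, k:]  # shear: det 1
      return sign * np.prod(np.diag(U))    # upper tri: diagonal product

  rng = np.random.default_rng(0)
  A = rng.standard_normal((4, 4))          # random test matrix
  print(det_ge(A), np.linalg.det(A))       # should agree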
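================================================================
@ Sketch: Householder upper-triangularization

Each step reflects the current leading column v onto
w = (+/-||v||, 0, ..., 0) using z = v - w, with the sign of w[0]
chosen opposite v[0] to avoid cancellation.  Note there is no pivot
search: the flow of operations is fixed.  Function name and test
matrix are mine.

  import numpy as np

  def householder_qr(A):
      """Return orthogonal Q and upper-triangular R with A = Q R."""
      R = A.astype(float).copy()
      m, n = R.shape
      Q = np.eye(m)
      for k in range(min(m - 1, n)):
          v = R[k:, k]
          # w[0] = +/-||v||, sign opposite v[0] to avoid cancellation
          w0 = -np.sign(v[0]) * np.linalg.norm(v) if v[0] != 0 else np.linalg.norm(v)
          z = v.copy()
          z[0] -= w0                       # z = v - w
          zz = np.dot(z, z)
          if zz == 0.0:
              continue                     # column already triangular
          H = np.eye(m - k) - 2.0 * np.outer(z, z) / zz
          R[k:, :] = H @ R[k:, :]          # zero out below the diagonal
          Q[:, k:] = Q[:, k:] @ H          # accumulate (H symmetric, involutory)
      return Q, R

  A = np.array([[4.0, 1.0, 2.0],
                [2.0, 3.0, 0.0],
                [0.0, 1.0, 1.0]])
  Q, R = householder_qr(A)
  print(np.allclose(Q @ R, A))             # True
  print(np.allclose(np.tril(R, -1), 0))    # upper triangular
  print(np.allclose(Q.T @ Q, np.eye(3)))   # orthogonal: ||Q e|| = ||e||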