Howard Anton, Drexel University
Robert C. Busby, Drexel University
John Wiley & Sons, Inc.

ACQUISITIONS EDITOR Laurie Rosatone
MARKETING MANAGER Julie Z. Lindstrom
SENIOR PRODUCTION EDITOR Ken Santor
PHOTO EDITOR Sara Wight
COVER DESIGN Madelyn Lesure
ILLUSTRATION STUDIO ATI Illustration Services

Cover Art: The mandrill picture on this cover has long been a favorite of researchers studying techniques of image compression. The colored image in the center was scanned in black and white, and the bordering images were rendered with various levels of image compression using the method of "singular value decomposition" discussed in this text. Compressing an image blurs some of the detail but reduces the amount of space required for its storage and the amount of time required to transmit it over the Internet. In practice, one tries to strike the right balance between compression and clarity.

This book was set in Times Roman by Techsetters, Inc. and printed and bound by Von Hoffman Corporation. The cover was printed by Phoenix Color Corporation. This book is printed on acid-free paper.

Copyright © 2003 Anton Textbooks, Inc. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008. To order books or for customer service, please call 1(800)-CALL-WILEY (225-5945).

ISBN 978-0-471-16362-6

Printed in the United States of America
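The compression idea described in the cover-art note can be sketched numerically. The following is a minimal illustration, not code from the book: it builds a rank-k approximation of a matrix of pixel intensities using NumPy's SVD routine. The 64×64 matrix here is synthetic stand-in data, and the choice k = 8 is arbitrary.

```python
import numpy as np

# Minimal sketch of SVD-based image compression (illustrative, not from the book).
# A synthetic 64x64 array stands in for grayscale pixel intensities.
rng = np.random.default_rng(0)
image = rng.random((64, 64))

# Thin SVD: image = U @ diag(s) @ Vt, with singular values in decreasing order.
U, s, Vt = np.linalg.svd(image, full_matrices=False)

# Keep only the k largest singular values; a smaller k gives more
# compression but blurs more detail.
k = 8
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storage falls from 64*64 numbers to k*(64 + 64 + 1):
# k columns of U, k rows of Vt, and k singular values.
full_count = image.size
compressed_count = k * (U.shape[0] + Vt.shape[1] + 1)
print(full_count, compressed_count)  # 4096 1032
```

On the cover, the same idea is applied to the mandrill image: the bordering panels correspond to different choices of k.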
ABOUT THE AUTHORS

Howard Anton obtained his B.A. from Lehigh University, his M.A. from the University of Illinois, and his Ph.D. from the Polytechnic University of Brooklyn, all in mathematics. In the early 1960s he worked for Burroughs Corporation and Avco Corporation at Cape Canaveral, Florida (now the Kennedy Space Center), on mathematical problems related to the manned space program. In 1968 he joined the Mathematics Department at Drexel University, where he taught full time until 1983. Since then he has been an adjunct professor at Drexel and has devoted the majority of his time to textbook writing and projects for mathematical associations. He was president of the Eastern Pennsylvania and Delaware section of the Mathematical Association of America (MAA), served on the Board of Governors of the MAA, and guided the creation of its Student Chapters. He has published numerous research papers in functional analysis, approximation theory, and topology, as well as various pedagogical papers. He is best known for his textbooks in mathematics, which have been widely used for more than thirty years. There are currently more than 125 versions of his books used throughout the world, including translations into Spanish, Arabic, Portuguese, Italian, Indonesian, French, Japanese, Chinese, Hebrew, and German. In 1994 he was awarded the Textbook Excellence Award by the Textbook Authors Association. For relaxation, Dr. Anton enjoys traveling, photography, and art.

Robert C. Busby obtained his B.S. in physics from Drexel University and his M.A. and Ph.D. in mathematics from the University of Pennsylvania. He taught at Oakland University in Rochester, Michigan, and since 1969 has taught full time at Drexel University, where he currently holds the position of Professor in the Department of Mathematics and Computer Science. He has regularly taught courses in calculus, linear algebra, probability and statistics, and modern analysis. Dr. Busby is the author of numerous research articles in functional analysis, representation theory, and operator algebras, and he has coauthored an undergraduate text in discrete mathematical structures and a workbook on the use of Maple in calculus. His current professional interests include aspects of signal processing and the use of computer technology in undergraduate education. Professor Busby also enjoys contemporary jazz and computer graphic design. He and his wife, Patricia, have two sons, Robert and Scott.

CONTENTS

CHAPTER 1 Vectors
1.1 Vectors and Matrices in Engineering and Mathematics; n-Space
1.2 Dot Product and Orthogonality
1.3 Vector Equations of Lines and Planes

CHAPTER 2 Systems of Linear Equations
2.1 Introduction to Systems of Linear Equations
2.2 Solving Linear Systems by Row Reduction
2.3 Applications of Linear Systems

CHAPTER 3 Matrices and Matrix Algebra
3.1 Operations on Matrices
3.2 Inverses; Algebraic Properties of Matrices
3.3 Elementary Matrices; A Method for Finding A⁻¹
3.4 Subspaces and Linear Independence
3.5 The Geometry of Linear Systems
3.6 Matrices with Special Forms
3.7 Matrix Factorizations; LU-Decomposition
3.8 Partitioned Matrices and Parallel Processing

CHAPTER 4 Determinants
4.1 Determinants; Cofactor Expansion
4.2 Properties of Determinants
4.3 Cramer's Rule; Formula for A⁻¹; Applications of Determinants
4.4 A First Look at Eigenvalues and Eigenvectors

CHAPTER 5 Matrix Models
5.1 Dynamical Systems and Markov Chains
5.2 Leontief Input-Output Models
5.3 Gauss-Seidel and Jacobi Iteration; Sparse Linear Systems
5.4 The Power Method; Application to Internet Search Engines

CHAPTER 6 Linear Transformations
6.1 Matrices as Transformations
6.2 Geometry of Linear Operators
6.3 Kernel and Range
6.4 Composition and Invertibility of Linear Transformations
6.5 Computer Graphics

CHAPTER 7 Dimension and Structure
7.1 Basis and Dimension
7.2 Properties of Bases
7.3 The Fundamental Spaces of a Matrix
7.4 The Dimension Theorem and Its Implications
7.5 The Rank Theorem and Its Implications
7.6 The Pivot Theorem and Its Implications
7.7 The Projection Theorem and Its Implications
7.8 Best Approximation and Least Squares
7.9 Orthonormal Bases and the Gram-Schmidt Process
7.10 QR-Decomposition; Householder Transformations
7.11 Coordinates with Respect to a Basis

CHAPTER 8 Diagonalization
8.1 Matrix Representations of Linear Transformations
8.2 Similarity and Diagonalizability
8.3 Orthogonal Diagonalizability; Functions of a Matrix
8.4 Quadratic Forms
8.5 Application of Quadratic Forms to Optimization
8.6 Singular Value Decomposition
8.7 The Pseudoinverse
8.8 Complex Eigenvalues and Eigenvectors
8.9 Hermitian, Unitary, and Normal Matrices
8.10 Systems of Differential Equations

CHAPTER 9 General Vector Spaces
9.1 Vector Space Axioms
9.2 Inner Product Spaces; Fourier Series
9.3 General Linear Transformations; Isomorphism

APPENDIX A How to Read Theorems
APPENDIX B Complex Numbers
ANSWERS TO ODD-NUMBERED EXERCISES
PHOTO CREDITS
INDEX
GUIDE FOR THE INSTRUCTOR

Number of Lectures
The Syllabus Guide below provides for a 29-lecture core and a 35-lecture core. The 29-lecture core is for schools with time constraints, as with abbreviated summer courses. Both core programs can be supplemented by starred topics as time permits. The omission of starred topics does not affect the readability or continuity of the core topics. By the end of Lecture 15 the following concepts will have been covered in a basic form: linear combination, spanning, subspace, dimension, eigenvalues, and eigenvectors. Thus, even with a relatively slow pace you will have no trouble touching on all of the main ideas in the course.

Pace
The core program is based on covering one section per lecture, but whether you can do this in every instance will depend on your teaching style and the capabilities of your particular students. For longer sections we recommend that you just highlight the main points in class and leave the details for the students to read. Since the reviews of this text have praised the clarity of the exposition, you should find this workable. If, in certain cases, you want to devote more than one lecture to a core topic, you can do so by adjusting the number of starred topics that you cover.

Organization
It is our feeling that the most effective way to teach abstract vector spaces is to place that material at the end (Chapter 9), at which point it occurs as a "natural generalization" of the earlier material, and the student has developed the "linear algebra maturity" to understand its purpose. However, we recognize that not everybody shares that philosophy, so we have designed that chapter so it can be moved forward, if desired.

SYLLABUS GUIDE
The guide pairs each section with a lecture in the 35-lecture and 29-lecture courses; starred topics are optional supplements to the core.

Chapter 1 Vectors: 1.1 Vectors and Matrices in Engineering and Mathematics; n-Space • 1.2 Dot Product and Orthogonality • 1.3 Vector Equations of Lines and Planes
Chapter 2 Systems of Linear Equations: 2.1 Introduction to Systems of Linear Equations • 2.2 Solving Linear Systems by Row Reduction • 2.3 Applications of Linear Systems
Chapter 3 Matrices and Matrix Algebra: 3.1 Operations on Matrices • 3.2 Inverses; Algebraic Properties of Matrices • 3.3 Elementary Matrices; A Method for Finding A⁻¹ • 3.4 Subspaces and Linear Independence • 3.5 The Geometry of Linear Systems • 3.6 Matrices with Special Forms • 3.7 Matrix Factorizations; LU-Decomposition • 3.8 Partitioned Matrices and Parallel Processing
Chapter 4 Determinants: 4.1 Determinants; Cofactor Expansion • 4.2 Properties of Determinants • 4.3 Cramer's Rule; Formula for A⁻¹; Applications of Determinants • 4.4 A First Look at Eigenvalues and Eigenvectors
Chapter 5 Matrix Models: 5.1 Dynamical Systems and Markov Chains • 5.2 Leontief Input-Output Models • 5.3 Gauss-Seidel and Jacobi Iteration; Sparse Linear Systems • 5.4 The Power Method; Application to Internet Search Engines
Chapter 6 Linear Transformations: 6.1 Matrices as Transformations • 6.2 Geometry of Linear Operators • 6.3 Kernel and Range • 6.4 Composition and Invertibility of Linear Transformations • 6.5 Computer Graphics
Chapter 7 Dimension and Structure: 7.1 Basis and Dimension • 7.2 Properties of Bases • 7.3 The Fundamental Spaces of a Matrix • 7.4 The Dimension Theorem and Its Implications • 7.5 The Rank Theorem and Its Implications • 7.6 The Pivot Theorem and Its Implications • 7.7 The Projection Theorem and Its Implications • 7.8 Best Approximation and Least Squares • 7.9 Orthonormal Bases and the Gram-Schmidt Process • 7.10 QR-Decomposition; Householder Transformations • 7.11 Coordinates with Respect to a Basis
Chapter 8 Diagonalization: 8.1 Matrix Representations of Linear Transformations • 8.2 Similarity and Diagonalizability • 8.3 Orthogonal Diagonalizability; Functions of a Matrix • 8.4 Quadratic Forms • 8.5 Application of Quadratic Forms to Optimization • 8.6 Singular Value Decomposition • 8.7 The Pseudoinverse • 8.8 Complex Eigenvalues and Eigenvectors • 8.9 Hermitian, Unitary, and Normal Matrices • 8.10 Systems of Differential Equations
Chapter 9 General Vector Spaces: 9.1 Vector Space Axioms • 9.2 Inner Product Spaces; Fourier Series • 9.3 General Linear Transformations; Isomorphism
Appendices: Appendix A How to Read Theorems • Appendix B Complex Numbers

TOPIC PLANNER
To assist you in planning your course, we have provided below a list of topics that occur in each section. These topics are identified in the text by headings in the margin. You will find additional lecture-planning information on the Web site for this text.

CHAPTER 1 VECTORS
Products as Lin 1.1 Vectors and Matrices in Engineering and Mathe matics; n-Space Scalars and Vectors • Equivalent Vectors • Vector Addition • Vector Subtraction • Scalar Multipli cation • Vectors in Coordinate Systems • Components of a Vector Whose Initial Vectors in Point Is Not at the Origin • R" • Equality of Vectors • Sums of Three or More Vectors • Parallel and Collinear Vectors • Linear Combinations • Application to Computer Color Models • Alternative Notations for Vectors • Matrices 1.2 Dot Product and Orthogonality Norm of a Vector ear Combinations • Transpose of a Matrix • Trace • Inner and Outer Matrix Products 3.2 Inverses; Algebraic Properties of Matrices Properties of Matrix Addition and Scalar Multiplication • Properties of Matrix Multiplication • Zero Matrices • Identity Matrices • Inverse of a Matrix • Properties of Inverses • Powers of a Matrix • Matrix Polynomials • Properties of the Transpose • Properties of the Trace • Transpose and Dot Product 3.3 Elementary Matrices; A Method for Finding A -l Elementary Matrices • Characterizations of InvertibWty • • Unit Vectors • The Standard Unit Vectors • Distance Row Equivalence • An Algorithm for Inverting Matrices of the Dot Product • Angle Between Vectors in R2 and R3 Multiple Linear Systems with a Common Coefficient Matrix Between Points in W • Dot Products • Algebraic Properties • Orthogonality • Orthonormal Sets • Euclidean Geometry in R" 1.3 Vector Equations of Lines and Planes Vector and Parametric Equations of Lines • Lines Through Two Points • Point-Normal Equations of Planes • Vector and Parametric Equations of Planes • Lines and Planes in R" • Comments on Termi.nology CHAPTER • Solving Linear Systems by Matrix Inversion • Solving • Consistency of Linear Systems 3.4 Subspaces and Linear Independence Subspaces of R" • Solution Space of a Linear System • Linear Inde pendence • Linear Independence and Homogeneous Linear Systems • Translated Subspaces • A Unifying Theorem 3.5 The GeoQ'letry of 
Linear Systems The Relationship Ax= b and Ax= • Consistency of a Linear Between System from the Vector Point of View • Hyperplanes • SYSTEMS OF LINEAR EQUATIONS Geometric Interpretations of Solution Spaces 2.1 Introduction to Systems of Linear Equations Linear Systems • Linear Systems with Two and Three Unknowns • Augmented Matrices and Elementary Row Operations 2.2 Solving Linear Systems by Row Reduction Consider ations in Solving Linear Systems • Echelon Forms • General Solutions as Linear Combinations of Column Vectors • 3.6 Matrices with Special Forms Diagonal Matrices • Triangular Matrices • Linear Systems with Triangular Coefficient Matrices • Properties of Triangular Matrices • Symmetric and Skew-Symmetric Matrices • lnvertibility of Symmetric Matrices • Matrices of the Form ArA and A Is Nilpotent • When Systems • The DimensionTheorem for Homogeneous Linear 3.7 Matri.x Factorizations; LU-Decomposition Systems • Stability, Roundoff Error, and Partial Pivoting 2.3 Applications of Linear Systems Global Positioning • Network Analysis • Electrical Circuits • Balancing Chemi cal Equations • Polynomial Interpolation CHAPTER MATRICES AND MATRIX ALGEBRA Linear Systems by Solving Factorization • Finding LU-Decom positions • The Relationship Between Gaussian Elimination and LU-Decomposition • Matrix Inversion by LU -Decom posi6on • WU -Decompositions • Using Permutation Ma trices to Deal with Row Interchanges • Flops and the Cost of Solving a Linear System • Cost Estimates for Solving Large Linear Systems • Considerations in Choosing an Algorithm for Solving a Linear System 3.1 Operations on Matrices Matrix Notation and Ter minology • Operations on Matrices • Row and Column Vectors • The Product Ax • The Product AB • Finding 3.8 Partitioned Matrices and ParaiJel Processing Specific Entries in a Matrix Product • Finding Specific Rows Triangular Matrices xiv A Inverting I- A by Power Series Gauss-Jordan and Gaussian Elimination • Some Facts About Echelon Fonns • Back 
Substitution • Homogeneous Linear AAT • Fixed Points of a Matrix • A Technique for Inverting I - Gen eral Partitioning • Block Diagonal Matrices • Block Upper Topic Planner CHAPTER DETERMINANTS 4.1 Determinants; Cofactor Expansion Detenninants of x and Matrices • Elementary Products • General Determinants • Evaluation Difficulties for Higher-Order Determinants • Determinants of Matrices with Rows or Columns That Have All Zeros • Detenninants of Triangular Matrices • Minors and Cofactors • Cofactor Expansions 3x3 4.2 Properties of Determinants Detenninant of AT • Effect of Elementary Row Operations on a Determi nant • Simplifying Cofactor Expansions • Determinants by Gaussian Elimination • A Detenninant Test for lnvertibility • Determinant of a Product of Matrices • Determinant Evaluation by LU-Decomposition • Detenninant of the Inverse of a Matrix • Determinant of A + B • A Unifying Theorem 4.3 Cramer's Rule; Formula for A -I; Applications of Determinants Adjoint of a Matrix • A Formula for the Inverse of a Matrix • How the Inverse Formula Is Actually Used • Cramer's Rule • Geometric Interpretation of Deter minants • Polynomial Interpolation and the Vandermonde Detenninant • Cross Products 4.4 A First Look at Eigenvalues and Eigenvectors Fixed Points • Eigenvalues and Eigenvectors • Eigenvalues of Triangular Matrices • Eigenvalues of Powers of a Matrix • A Unifying Theorem • Complex Eigenvalues • Algebraic Multiplicity • Eigenvalue Analysis of x Matrices • Eigenvalue Analysis of x Symmetric Matrices • Expres sions for Detenninant and Trace in Terms of Eigenvalues • Eigenvalues by Numerical Methods CHAPTER MATRIX MODELS 5.1 Dynamical Systems and Markov Chains Dynamical Systems • Markov Chains • Markov Chains as Powers of the Transition Matrix • Long-Term Behavior of a Markov Chain 5.2 Leontieflnput-Output Models Inputs and Outputs in an Economy • The Leontief Model of an Open Economy • Productive Open Economies 5.3 Gauss-Seidel and Jacobi Iteration; Sparse Linear 
Iterative Methods • Jacobi Iteration • Gauss Seidel Iteration • Convergence • Speeding Up Convergence Systems xv Properties of Linear Transformations • All Linear Trans formations from R11 to R"' Are Matrix Transformations • Rotations About the Origin • Reflections About Lines Through the Origin • Orthogonal Projections onto Lines Through the Origin • Transformations of the Unit Square • Power Sequences 6.2 Geometry of Linear Operators Norm-Preserving Linear Operators • Orthogonal Operators Preserve Angles and Orthogonality • Orthogonal Matrices • All Orthogonal Linear Operators on R2 are Rotations or Reflections • Contractions and Dilations of R1 • Vertical and Horizontal Compressions and Expansions of R • Shears • Linear Operators on R3 • Reflections About Coordinate Planes • Rotations in R • General Rotations 6.3 Kernel and Range Kernel of a Linear Transformation • Kernel of a Matrix Transformation • Range of a Linear Transformation • Range of a Matrix Transformation • Existence and Uniqueness Issues • One-to-One and Onto from the Viewpoint of Linear Systems • A Unifying Theorem 6.4 Composition and lnvertibility of Linear Transfor mations Compositions of Linear Transformations • Compositions of Three or More Linear Transformations • Factoring Linear Operators into Compositions • Inverse of a Linear Transformation • Invertible Linear Operators • Geometric Properties of Invertible Linear Operators on R2 • Image of the Unit Square Under an Invertible Linear Operator 6.5 Computer Graphics Wireframes • Matrix Represen tations ofWireframes • TransformingWireframes • Transla tion Using Homogeneous Coordinates • Three-Dimensional Graphics CHAPTER DIMENSION AND STRUCTURE 7.1 Basis and Dimension Bases for Subspaces • Dimen sion of a Solution Space • Dimension of a Hyperplane 7.2 Properties of Bases Properties of Bases • Subspaces of Subspaces • Sometimes Spanning Implies Linear Indepen dence and Conversely • A Unifying Theorem Engines The Fun 7.3 The Fundamental Spaces of 
a Matrix damental Spaces of a Matrix • Orthogonal Complements • Properties of Orthogonal Complements • Finding Bases by Row Reduction • DeterminingWhether a Vector Is in a Given Subspace CHAPTER 7.4 The Dimension Theorem and Its Implications The Dimension Theorem for Matrices • Extending a Linearly Independent Set to a Basis • Some Consequences of the Dimension Theorem for Matrices • The Dimension Theorem for Subspaces • A Unifying Theorem • More on Hyper planes • Rank Matrices • Symmetric Rank Matrices 5.4 The Power Method; Application to Internet Search The Power Method • The Power Method with Euclidean Scaling • The Power Method with Maximum Entry Scaling • Rate of Convergence • Stopping Procedures • An Application of the Power Method to Internet Searches • Variations of the Power Method LINEAR TRANSFORMATIONS 6.1 Matrices as Transformations A Review of Functions • Matrix Transformations • Linear Transformations • Some xvi Topic Planner 7.5 The Rank Theorem and Its Implications The Rank Theorem • Relationship Between Consistency and Rank • Overdetermined and Underdeterrnined Linear Systems • Matrices of the form A TA and A A T • Some Unifying Theorems • Relationship Between Algebraic and Geometric Multiplicity Applications of Rank Basis 7.6 The Pivot Theorem and Its Implications Problems Revisited • Bases for the Fundamental Spaces of a Matrix • A Column-Row Factorization • Column-Row Expansion 7.7 The Projection Theorem and Its Implications Or R2 • Orthogonal Projections onto Lines Through the Origin in Rn • Projection Operators on R" • Orthogonal Projections onto General Subspaces • When Does a Matrix Represent an Orthogonal Projection? 
• Strang Diagrams • Full Column • The Double Perp Theorem • Orthogonal Projections onto W.L Rank and Consistency of a Linear System 7.8 Best Approximation and Least Squares Distance Problems Minimum • Least Squares Solutions of Linear • Finding Least Squares Solutions of Linear Systems • Orthogonality Property of Least Squares Error Vectors • Strang Diagrams for Least Squares Problems • Fitting a Curve to Experimental Data • Least Squares Fits by Higher-Degree Polynomials • Theory Versus Practice Systems 7.9 Orthonormal Bases and the Gram-Schmidt Process Orthogonal and Orthonormal Bases • Orthogonal Projections • Trace and Orthogonal Projections • Linear Combinations of Orthonormal Basis Vectors • Finding Orthogonal and Orthonormal Bases • A Property of the Gram-Schmidt Process • Extending Orthonormal Sets to Using Orthogonal Bases Orthonormal Bases A Unifying Theorem on Diagonalizability 8.3 Orthogonal Diagonalizability; Functions of a Matrix Orthogonal Similarity • A Method for Orthogonally Di • Spectral Decomposition QR-Deoomposition • The Role of QR-Decomposition in Least Squares Problems holder Reflections • Powers of a Diagonalizable Matrix • Cayley-Hamilton • Exponential of a Matrix • Diagonalization and Linear Systems • The Nondiagonalizable Case Theorem 8.4 Quadratic Forms Definition of a Quadratic Form • Change of Variable in a Quadratic Form • Quadratic Forms in Geometry • Identifying Conic Sections • Positive Definite Quadratic Forms • Classifying Conic Sections Using Eigenvalues • Identifying Positive Definite Matrices • Cholesky Factorization 8.5 Application of Quadra tic Forms to Optimization Relative Extrema of Functions of Two Variables strained Extremum Problems • Con • Constrained Extrema and Level Curves 8.6 SinguJar Value Decomposition Singular Value De composition of Square Matrices • Singular Value Decom position of Symmetric Matrices • Polar Decomposition • • Si,ngular Value Decomposition of Nonsquare Matrices Singular Value Decomposition 
and the Fundamental Spaces of a Matrix e Reduced Singular Value Decomposition Data Compression and Image Processing • • Singular Value Decomposition from the Transformation Point of View 7.10 QR-Decomposition; Householder Transformations • Other Numerical Issues • House • QR-Decomposition Using Householder • Householder Reflections in Applications 7.11 Coordinates with Respect to a Basis Nonrectangular Coordinate Systems in R2 and R3 to an Orthonormal Basis • Coordinates with Respect • Computing with Coordinates with Respect to an Orthononormal Basis • Change of Basis for R" • lnvertibility ofTransition Matrices • A Good Technique for finding Transition Matrices • Coordinate Maps • Transition Between Orthonormal Bases • Application to Rotation of Coordinate Axes • New Ways to Think About Matrices CHAPTER • agonalizing a Symmetric Matrix thogonal Projections onto Lines Through the Origin in Reflections 8.2 Similarity and DiagonaJizability Similar Matrices • Similarity invariants • Eigenvectors and Eigenvalues of Similar Matrices • Diagonalization • A Method for Diago nalizing a Matrix • Linear Independence of Eigenvectors • DIAGONAUZATION 8.1 Matrix Representations of Linear Transformations Matrix of a Linear Operator with Respect to a Basis 8.7 The Pseudoinverse The Pseudoinverse • Properties of the Pseudoinverse • The Pseudoinverse and Least Squares • Condition Number and Numerical Considerations 8.8 Complex Eigenvalues and Eigenvectors Vectors in en • Algebraic Properties of the Complex Conjugate • The Complex Euclidean Inner Product • Vector Space Concepts in en • Complex Eigenvalues of Real Matrices Acting on Vectors in C" • A Proof That Real Symmetric Matrices Have Real Eigenvalues • A Geometric Interpretation of Complex Eigenvalues of Real Matrices 8.9 Hermitian, Unitary, and Normal Matrices and Unitary Matrices Hermitian Matrices Eigenvalues Hermitian • Unitary Diagonalizability • Skew • Normal Matrices • A Comparison of 8.10 Systems of Differential 
Equations Terminology Changing Bases • • Matrix of a Linear Transformation with Respect to a Pair of Bases • Effect of Changing Bases on Matrices of Linear Transformations • Representing Linear • Fundamental Solutions • Solutions Using Eigenvalues and Eigenvectors • Exponential Form of a Solution • The Case Where A is Not Operators with Two Bases Diagonalizable • Linear Systems of Differential Equations A24 Answers to Odd-Numbered Exercises Exercise Set 7.11 (Page 438) (a) (w)8 = (3, -7), [w)o (�, �), [w], (b) (w), � [-�J = [-:] 11 (a) u = (f, -f), v = ( -'f, !f) 13 vffS, � � �.s 20 19 (a) (e) [w]� � [-! 23 I i2 (b)[w],, - -� m � Ps1-s1 = '! J6 25 (a) Po-.s = (b) (w], � � J (wJ•• � nl HJ I -273 273 I - 273 I 372 pr = PLs [� �] = 2./3 = Ps4 B J2 J2 x' y'-coordinates of J2 J2 [ fi /6 fil _[=:]:] , [ T (b) (- · '32 · -3) T T -:If t fi /6 fi T T Exercise Set 8.1 (Page 453) [T]o = G -�] _l [ i _fJ - [ : - : J [: -�J G -�J [-� � J = -� I 5, [T], � [-: : -1�] [�I �] [-: - = I 11 - : X[-; -; _;] 2 2 _l!J [ *] [� -*] [-fi *] [t t] [� -�] [: _:] [T]o = 25 -il [ T] = => J2 J2 T _9 [ 7' J ' � [[� ' �] -�] J6 - 372 Ps1-+ B• = [w],, � 4 21 (a) [=�J , [ w]o 31 (b) J2, Ji?, (d) [w ]8 = [-fol 6) ' ' ' ' y y x x = - - - -, y = - - - (-t J2 tJ2) ( 72· - 7J � ) (e) (w], � (d) [w]o1 = X 29 (a) (� s) (b) J2 x'y'-coordinates of (5, 2) => xy-coordinates of [� - !] � [-� �] [_;] , [_:] � [�! -�] (b) � [-� -��] = [-4] t (w], � J2 xy-coordinates of (-2, (b) 15 (0, J2), ( I , 0), (-1, J2), (a - b, bJ2) 17 (a) X y X Y , / (a)x = + - , y = - - (4J2, -2J2) ( -2J2 5J2) (6, - , 3) 27 [:] � (w), � (3, -2, 1), [w], � -n 26 il = [� -�] , 15, (o) [T(x)]o (x J, � 26 il 25 -il � I ' ' I -2 [ T]8 = [:l , [-;] [T] = TI 19 il [=! �] TI [ TJo � II il II [-! -!] , I 19 [T]8 ·8 [-* -!4] = -t8 T4 T4 -�] , = - �] =[ 21 (a) [T(v1 ))o (b) T (v t ) [ [T(v2)]o r = /4 21 I.A.I = 2, Â> /3 = [-4 2OJã C = [3 -23] ii m X = , det(A) = :> _ 3• •• [fi " ] 17 "' = 1, A., 1r 23 p 2 [2 1, = -1r A = ] [ ] = = -�� ';!! 
[ lf!i -�jf] = [-o/' 7!] [� �] 15 [-�' 7!] = [� �] 17 [ � ; �], = [-� � �] 00 � 3i 19 [ � � ] -2 - 3i -1 4i 21 29 = [��] = [ ���, ] J [ ml ·�l u [:] [;:] A• = � -;I -jl A- 11 A• 13 P P= P= A A -1 Jj 76 76 Jj D = D "* � D = (a) a,3 ;{: -(a3,) (c) (b) a11 ;{: -Oil and C must commute ., Exercise Set 8.10 (Page 552) -y 5e-31 general solution: Y3 c3e fuod� w �· ·'···"·;··· sol"rioo of;mum "'"' pmbl•m general solution: [�] = c1 [-�] e_, c1 = c2 = for initial conditions: � + c2 [!] es1 21 [ �] m [;:] Answers to Odd-Numbered Exercises A28 ool•tio" '' [- :] ··� •"; � < , , +� foe;";" """d;tioos; 'ã r y,(t) = 1fe-' + Ơe-< • f2>r Yl(l) = -25e- + 65e-