Chapter 7 Homework Solutions (text by David C. Lay)

7.4 • Solutions 439
13. First find an SVD for Aᵀ. The eigenvalues of (Aᵀ)ᵀAᵀ = AAᵀ are 25 and 9, with associated unit eigenvectors v₁ = (1/√2, 1/√2)ᵀ and v₂ = (1/√2, −1/√2)ᵀ. The singular values are σ₁ = √25 = 5 and σ₂ = √9 = 3. Thus the matrix Σ is

Σ = [ 5  0 ]
    [ 0  3 ]
    [ 0  0 ]

Next compute

u₁ = (1/σ₁)Aᵀv₁ = (1/√2, 1/√2, 0)ᵀ,  u₂ = (1/σ₂)Aᵀv₂ = (1/√18, −1/√18, 4/√18)ᵀ.

Since {u₁, u₂} is not a basis for ℝ³, we need a unit vector u₃ that is orthogonal to both u₁ and u₂. The vector u₃ must satisfy the set of equations u₁ᵀx = 0 and u₂ᵀx = 0. These are equivalent to the linear equations

x₁ + x₂ = 0  and  x₁ − x₂ + 4x₃ = 0,

whose unit solutions are x = ±(−2/3, 2/3, 1/3)ᵀ. Therefore let

U = [ 1/√2   1/√18   −2/3 ]
    [ 1/√2  −1/√18    2/3 ]
    [ 0      4/√18    1/3 ]

Thus Aᵀ = UΣVᵀ with V = [v₁ v₂]. An SVD for A is computed by taking transposes: A = VΣᵀUᵀ, where Σᵀ = [ 5 0 0; 0 3 0 ].
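The arithmetic above can be spot-checked numerically. The exercise matrix itself is not printed in this excerpt; the block below assumes A = [3 2 2; 2 3 −2], which is consistent with the eigenvalues 25 and 9 and with the factors found above:

```python
import numpy as np

# Assumed exercise matrix, inferred from the eigenvalues 25 and 9.
A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

# Singular values of A are the square roots of the eigenvalues of A A^T.
sigmas = np.linalg.svd(A, compute_uv=False)

# The hand-built factors from the solution above: A = V Sigma^T U^T.
V = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
SigmaT = np.array([[5.0, 0.0, 0.0], [0.0, 3.0, 0.0]])
U = np.column_stack([
    np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
    np.array([1.0, -1.0, 4.0]) / np.sqrt(18),
    np.array([-2.0, 2.0, 1.0]) / 3.0,
])

reconstructed = V @ SigmaT @ U.T
```

The check confirms both that the singular values are 5 and 3 and that the hand-built factors multiply back to A.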
14. From Exercise 7, A = UΣVᵀ with

V = [ 2/√5  −1/√5 ]
    [ 1/√5   2/√5 ]

Since the first column of V is a unit vector at which the length of Ax is maximized, that column, v₁ = (2/√5, 1/√5)ᵀ, is the desired unit vector.
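This maximization property of v₁ can be sketched numerically. Exercise 7's matrix is not reproduced here; A = [2 −1; 2 2] is assumed, since AᵀA = [8 2; 2 5] then has eigenvalues 9 and 4, matching the V above:

```python
import numpy as np

# Assumed matrix from Exercise 7 (not printed in this excerpt).
A = np.array([[2.0, -1.0],
              [2.0, 2.0]])

# Right singular vectors are the rows of Vt; v1 maximizes ||Ax|| over unit x.
_, s, Vt = np.linalg.svd(A)
v1 = Vt[0]

# Sample many unit vectors; none should beat ||A v1|| = sigma_1.
thetas = np.linspace(0, 2 * np.pi, 1000)
lengths = [np.linalg.norm(A @ np.array([np.cos(t), np.sin(t)])) for t in thetas]
```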
15. a. Since A has 2 nonzero singular values, rank A = 2.
b. By Example 6, {u₁, u₂} is a basis for Col A.
16. a. Since A has 2 nonzero singular values, rank A = 2.

b. By Example 6, {u₁, u₂} is a basis for Col A.
17. Let A = UΣVᵀ = UΣV⁻¹. Since A is square and invertible, rank A = n, and all of the entries on the diagonal of Σ must be nonzero. So A⁻¹ = (UΣV⁻¹)⁻¹ = VΣ⁻¹U⁻¹ = VΣ⁻¹Uᵀ.
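A quick numerical sketch of this inversion formula, using a random matrix in place of A since none is specified:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # generically invertible

U, s, Vt = np.linalg.svd(A)

# Invert by inverting each factor: A^{-1} = V Sigma^{-1} U^T.
A_inv = Vt.T @ np.diag(1.0 / s) @ U.T
```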
18. First note that the determinant of an orthogonal matrix is ±1, because

1 = det I = det(UᵀU) = (det Uᵀ)(det U) = (det U)².
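The ±1 determinant fact is easy to check numerically; here Q is an orthogonal matrix obtained from a QR factorization of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
# QR factorization of a random matrix yields an orthogonal Q.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

det = np.linalg.det(Q)
```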
19. Since U and V are orthogonal matrices,

AᵀA = (UΣVᵀ)ᵀ(UΣVᵀ) = VΣᵀUᵀUΣVᵀ = VΣᵀΣVᵀ = V(ΣᵀΣ)V⁻¹.

If σ₁, …, σₙ are the diagonal entries in Σ, then ΣᵀΣ is a diagonal matrix with diagonal entries σ₁², …, σₙ².
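The consequence of this identity, that the eigenvalues of AᵀA are the squared singular values of A, can be confirmed numerically on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))

s = np.linalg.svd(A, compute_uv=False)            # descending singular values
eig = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # descending eigenvalues
```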
20. If A is positive definite, then A = PDPᵀ, where P is an orthogonal matrix and D is a diagonal matrix. The diagonal entries of D are positive because they are the eigenvalues of a positive definite matrix.
21. Let A = UΣVᵀ. The matrix PU is orthogonal, because P and U are both orthogonal. (See Exercise …)
22. The right singular vector v₁ is an eigenvector for the largest eigenvalue λ₁ of AᵀA. By Theorem 7 …
23. From the proof of Theorem 10, UΣ = [σ₁u₁ ⋯ σᵣuᵣ 0 ⋯ 0]. The column-row expansion of the product (UΣ)Vᵀ shows that

A = (UΣ)Vᵀ = σ₁u₁v₁ᵀ + ⋯ + σᵣuᵣvᵣᵀ.
24. From Exercise 23, Aᵀ = σ₁v₁u₁ᵀ + ⋯ + σᵣvᵣuᵣᵀ. Then since

uᵢᵀuⱼ = 0 for i ≠ j,  uᵢᵀuⱼ = 1 for i = j,

it follows that Aᵀuⱼ = σⱼvⱼ for 1 ≤ j ≤ r.
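Both expansions (Exercises 23 and 24) can be verified numerically on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))

# Sum of rank-one terms sigma_i u_i v_i^T recovers A (Exercise 23).
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(r))

# And A^T u_j = sigma_j v_j for each j <= r (Exercise 24).
checks = [np.allclose(A.T @ U[:, j], s[j] * Vt[j]) for j in range(r)]
```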
25. Consider the SVD for the standard matrix A of T, say A = UΣVᵀ. Let B = {v₁, …, vₙ} and C = {u₁, …, uₘ} be bases for ℝⁿ and ℝᵐ constructed respectively from the columns of V and U. Since the columns of V are orthonormal, Vᵀvⱼ = eⱼ, where eⱼ is the jth column of the n × n identity matrix.
26. [M] Let A be the given 4 × 4 matrix. The first two rows of AᵀA are [528 −392 224 −176] and [−392 1092 −176 536], and the eigenvalues of AᵀA are found to be (in decreasing order) λ₁ = 1600, λ₂ = 400, λ₃ = 100, and λ₄ = 0. Associated unit eigenvectors v₁, v₂, v₃, v₄ may be computed; one choice for V is [v₁ v₂ v₃ v₄]. The singular values of A are σ₁ = 40, σ₂ = 20, and σ₃ = 10, so

Σ = [ 40   0   0   0 ]
    [  0  20   0   0 ]
    [  0   0  10   0 ]
    [  0   0   0   0 ]

Next compute u₁ = (1/σ₁)Av₁, u₂ = (1/σ₂)Av₂, and u₃ = (1/σ₃)Av₃; each of these vectors has entries ±.5. Because Av₄ = 0, only three columns of U have been found so far. The last column of U can be found by extending {u₁, u₂, u₃} to an orthonormal basis for ℝ⁴. The vector u₄ must satisfy the set of equations u₁ᵀx = 0, u₂ᵀx = 0, and u₃ᵀx = 0, a homogeneous system of three linear equations in x₁, x₂, x₃, x₄ whose unit solution supplies u₄. Therefore, let U = [u₁ u₂ u₃ u₄]; every entry of U is ±.5. Thus A = UΣVᵀ.
27. [M] Let

A = [  6  −8  −4   5  −4 ]
    [  2   7  −5  −6   4 ]
    [  0  −1  −8   2   2 ]
    [ −1  −2   4   4  −8 ]

Then

AᵀA = [  41  −32  −38   14   −8 ]
      [ −32  118   −3  −92   74 ]
      [ −38   −3  121   10  −52 ]
      [  14  −92   10   81  −72 ]
      [  −8   74  −52  −72  100 ]

and the eigenvalues of AᵀA are found to be (in decreasing order) λ₁ = 270.87, λ₂ = 147.85, λ₃ = 23.73, λ₄ = 18.55, and λ₅ = 0. Associated unit eigenvectors v₁, …, v₅ may be computed, and one choice for V is the 5 × 5 matrix [v₁ v₂ v₃ v₄ v₅]. The nonzero singular values of A are σ₁ = 16.46, σ₂ = 12.16, σ₃ = 4.87, and σ₄ = 4.31, so

Σ = [ 16.46      0      0      0   0 ]
    [     0  12.16      0      0   0 ]
    [     0      0   4.87      0   0 ]
    [     0      0      0   4.31   0 ]

Next compute uⱼ = (1/σⱼ)Avⱼ for j = 1, …, 4; these four orthonormal vectors form the columns of U. Thus A = UΣVᵀ.
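These eigenvalues and singular values can be reproduced numerically. The rows of A below are reconstructed from the legible entries of A and AᵀA in this copy, so treat the exact matrix as an assumption:

```python
import numpy as np

# Matrix for Exercise 27, reconstructed from the printed rows of A and A^T A.
A = np.array([
    [ 6, -8, -4,  5, -4],
    [ 2,  7, -5, -6,  4],
    [ 0, -1, -8,  2,  2],
    [-1, -2,  4,  4, -8],
], dtype=float)

AtA = A.T @ A
eigs = np.sort(np.linalg.eigvalsh(AtA))[::-1]     # descending eigenvalues
sigmas = np.linalg.svd(A, compute_uv=False)       # descending singular values
```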
28. [M] Let A be the given matrix. The eigenvalues of AᵀA are found to be (in decreasing order) λ₁ = 649.9059, λ₂ = 218.0033, λ₃ = 39.6345, and …

29. [M] Let A be the given matrix; its first two rows are [5 3 1 7 9] and [6 4 2 8 8]. The first two rows of AᵀA are [255 168 90 160 47] and [168 111 60 104 30], and the eigenvalues of AᵀA are found to be (in decreasing order) λ₁ = 672.589, λ₂ = 280.745, λ₃ = 127.503, λ₄ = 1.163, and λ₅ = 1.428 × 10⁻⁷. The singular values of A are thus σ₁ = 25.9343, …
7.5 SOLUTIONS
Notes:
The application presented here has turned out to be of interest to a wide variety of students, including engineers. I cover this in Course Syllabus 3, described in the front matter of the text, but I only have time to mention the idea briefly in my other classes.
1. The matrix of observations is

X = [ 19 22  6  3  2 20 ]
    [ 12  6  9 15 13  5 ]

and the sample mean is

M = (1/6) [ 72 ] = [ 12 ]
          [ 60 ]   [ 10 ]
2. The matrix of observations is

X = [ 1  5  2  6  7  3 ]
    [ 3 11  6  8 15 11 ]

and the sample mean is

M = (1/6) [ 24 ] = [ 4 ]
          [ 54 ]   [ 9 ]

The mean-deviation form B is obtained by subtracting M from each column of X,

B = [ −3  1  −2   2  3  −1 ]
    [ −6  2  −3  −1  6   2 ]
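The mean and mean-deviation computations in Exercises 1 and 2 follow a single recipe, sketched here with the Exercise 2 data:

```python
import numpy as np

# Data from Exercise 2: observations are the columns of X.
X = np.array([[1, 5, 2, 6, 7, 3],
              [3, 11, 6, 8, 15, 11]], dtype=float)

# Sample mean of the observation columns.
M = X.mean(axis=1, keepdims=True)

# Mean-deviation form: subtract M from each column.
B = X - M
```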
3. The principal components of the data are the unit eigenvectors of the sample covariance matrix S. One computes that (in descending order) the eigenvalues of

S = [ 86 27 ]
    [ 27 16 ]

are λ₁ = 95.2041 and λ₂ = 6.7959.
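A numerical check of the eigenvalues of S from Exercise 3:

```python
import numpy as np

# Sample covariance matrix from Exercise 3.
S = np.array([[86.0, 27.0],
              [27.0, 16.0]])

# eigh returns ascending eigenvalues; reverse for descending order.
eigvals, eigvecs = np.linalg.eigh(S)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
```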
4. The principal components of the data are the unit eigenvectors of the sample covariance matrix S. One computes that (in descending order) the eigenvalues of

S = [ 5.6  8 ]
    [ 8   18 ]

are λ₁ = 21.9213 and λ₂ = 1.6787.
5. [M] The largest eigenvalue of

S = [ 164.12   32.73   81.04 ]
    [  32.73  539.44  249.13 ]
    [  81.04  249.13  189.11 ]

is λ₁ = 677.497, and the first principal component of the data is the unit eigenvector corresponding to λ₁, which is …
6. [M] The largest eigenvalue of

S = [ 29.64  18.38   5.00 ]
    [ 18.38  20.82  14.06 ]
    [  5.00  14.06  29.21 ]

is λ₁ = 51.6957, and the first principal …
7. Since the unit eigenvector corresponding to λ₁ = 95.2041 is u₁ = (.946515, .322659)ᵀ, one choice for the new variable is y₁ = .946515x₁ + .322659x₂.
8. Since the unit eigenvector corresponding to λ₁ = 21.9213 is u₁ = (.44013, .897934)ᵀ, one choice for the new variable is y₁ = .44013x₁ + .897934x₂.
9. The largest eigenvalue of

S = [ 5 2 0 ]
    [ 2 6 2 ]
    [ 0 2 7 ]

is λ₁ = 9, and the first principal component of the data is the unit eigenvector corresponding to λ₁, namely u₁ = (1/3, 2/3, 2/3)ᵀ.
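Exercise 9's eigenpair can be confirmed exactly:

```python
import numpy as np

# Covariance matrix from Exercise 9.
S = np.array([[5.0, 2.0, 0.0],
              [2.0, 6.0, 2.0],
              [0.0, 2.0, 7.0]])

eigvals, eigvecs = np.linalg.eigh(S)
u1 = eigvecs[:, -1]          # eigenvector for the largest eigenvalue
u1 = u1 * np.sign(u1[0])     # fix the sign convention
```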
10. [M] The largest eigenvalue of

S = [ 5  4  2 ]
    [ 4 11  4 ]
    [ 2  4  5 ]

is λ₁ = 15, and the first principal component of the data is the unit eigenvector corresponding to λ₁, namely u₁ = (1/√6)(1, 2, 1)ᵀ.
11. a. If w is the vector in ℝᴺ with a 1 in each position, then [X₁ ⋯ X_N]w = X₁ + ⋯ + X_N = 0 since the Xₖ are in mean-deviation form. Then …
b. By part (a), the covariance matrix S_Y of Y₁, …, Y_N is

S_Y = (1/(N − 1)) [Y₁ ⋯ Y_N][Y₁ ⋯ Y_N]ᵀ = (1/(N − 1)) Pᵀ[X₁ ⋯ X_N][X₁ ⋯ X_N]ᵀP = PᵀSP.
12. By Exercise 11, the change of variables X = PY changes the covariance matrix S of X into the covariance matrix PᵀSP of Y. The total variance of the data as described by Y is tr(PᵀSP). Since tr(PᵀSP) = tr(SPPᵀ) = tr(S), the total variance is unchanged by the change of variables.
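The trace argument can be demonstrated with any symmetric S and orthogonal P; random instances are used below since none are specified:

```python
import numpy as np

rng = np.random.default_rng(4)

# A random symmetric "covariance" matrix S and a random orthogonal P.
G = rng.standard_normal((4, 4))
S = G @ G.T
P, _ = np.linalg.qr(rng.standard_normal((4, 4)))

S_Y = P.T @ S @ P   # covariance after the change of variables X = PY
```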
13. Let M be the sample mean for the data, and let X̂ₖ = Xₖ − M. Let B = [X̂₁ ⋯ X̂_N] be the matrix of observations in mean-deviation form. By the row-column expansion of BBᵀ, the sample covariance matrix is

S = (1/(N − 1)) BBᵀ = (1/(N − 1)) (X̂₁X̂₁ᵀ + ⋯ + X̂_N X̂_Nᵀ).
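The formula S = (1/(N − 1))BBᵀ agrees with the usual sample covariance; here it is checked against numpy.cov on random data:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((3, 10))   # 10 observations of 3 variables, as columns

M = X.mean(axis=1, keepdims=True)
B = X - M                          # mean-deviation form
N = X.shape[1]

S = B @ B.T / (N - 1)              # sample covariance via B B^T
```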
Chapter 7 SUPPLEMENTARY EXERCISES
1. a. True. This is just part of Theorem 2 in Section 7.1. The proof appears just before the statement
of the theorem.
b. False. A counterexample is

A = [ 0  1 ]
    [ −1 0 ]
e. False. A counterexample is

P = [ 1   1 ]
    [ 1  −1 ]

The columns here are orthogonal but not orthonormal.
g. False. A counterexample is

A = [ 2   0 ]
    [ 0  −3 ]

and x = (1, 0)ᵀ. Then xᵀAx = 2 > 0, but xᵀAx is an indefinite quadratic form.
h. True. This is basically the Principal Axes Theorem from Section 7.2. Any quadratic form can be written, after a suitable orthogonal change of variable, with no cross-product terms.
l. False. The term “definite eigenvalue” is undefined and therefore meaningless.
m. True. If x = Py, then xᵀAx = (Py)ᵀA(Py) = yᵀPᵀAPy = yᵀ(PᵀAP)y.
p. True. Theorem 10 in Section 7.4 writes the decomposition in the form UΣVᵀ, where U and V are orthogonal matrices. In this case, Vᵀ is also an orthogonal matrix. Proof: Since V is orthogonal, V is invertible and V⁻¹ = Vᵀ. Then (Vᵀ)⁻¹ = (V⁻¹)ᵀ = (Vᵀ)ᵀ, and since Vᵀ is square, Vᵀ is an orthogonal matrix.
2. a. Each term in the expansion of A is symmetric by Exercise 35 in Section 7.1. The fact that (B + C)ᵀ = Bᵀ + Cᵀ implies that any sum of symmetric matrices is symmetric, so A is symmetric.
3. If rank A = r, then dim Nul A = n − r by the Rank Theorem. So 0 is an eigenvalue of A with multiplicity n − r, and of the n terms in the spectral decomposition of A exactly n − r are zero. The …
4. a. By Theorem 3 in Section 6.1, (Col A)⊥ = Nul Aᵀ = Nul A since Aᵀ = A.
5. If Av = λv for some nonzero λ, then v = λ⁻¹Av = A(λ⁻¹v), which shows that v is a linear combination of the columns of A.
6. Because A is symmetric, there is an orthonormal eigenvector basis {u₁, …, uₙ} for ℝⁿ. Let r = rank A. If r = 0, then A = O and the decomposition of Exercise 4(b) is y = 0 + y for each y in ℝⁿ; if r = n then the decomposition is y = y + 0 for each y in ℝⁿ.

Assume that 0 < r < n. Then dim Nul A = n − r by the Rank Theorem, and so 0 is an eigenvalue of A with multiplicity n − r. Hence there are r nonzero eigenvalues, counted according to their multiplicities.
7. If A = RᵀR and R is invertible, then A is positive definite by Exercise 25 in Section 7.2. Conversely, suppose that A is positive definite. Then by Exercise 26 in Section 7.2, A = BᵀB for …
8. Suppose that A is positive definite, and consider a Cholesky factorization A = RᵀR with R upper triangular and having positive entries on its diagonal. Let D be the diagonal matrix whose diagonal …
9. If A is an m × n matrix and x is in ℝⁿ, then xᵀ(AᵀA)x = (Ax)ᵀ(Ax) = ‖Ax‖² ≥ 0. Thus AᵀA is positive semidefinite.
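The positive semidefiniteness of AᵀA is easy to confirm numerically:

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((5, 3))

G = A.T @ A                       # Gram matrix A^T A
eigs = np.linalg.eigvalsh(G)      # all eigenvalues should be >= 0

# x^T (A^T A) x equals ||Ax||^2 for any x.
x = rng.standard_normal(3)
quad = x @ G @ x
```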
10. If rank G = r, then dim Nul G = n − r by the Rank Theorem. Hence 0 is an eigenvalue of G with multiplicity n − r, and the spectral decomposition of G is

G = λ₁u₁u₁ᵀ + ⋯ + λᵣuᵣuᵣᵀ.
11. Let A = UΣVᵀ be a singular value decomposition of A. Since U is orthogonal, UᵀU = I and …
12. a. Because the columns of Vᵣ are orthonormal,

AA⁺y = (UᵣDVᵣᵀ)(VᵣD⁻¹Uᵣᵀ)y = (UᵣDD⁻¹Uᵣᵀ)y = UᵣUᵣᵀy.

b. Because the columns of Uᵣ are orthonormal,

A⁺Ax = (VᵣD⁻¹Uᵣᵀ)(UᵣDVᵣᵀ)x = (VᵣD⁻¹DVᵣᵀ)x = VᵣVᵣᵀx.

c. Using the reduced singular value decomposition, the definition of A⁺, and the associativity of matrix multiplication gives:

AA⁺A = (UᵣDVᵣᵀ)(VᵣD⁻¹Uᵣᵀ)(UᵣDVᵣᵀ) = (UᵣDD⁻¹Uᵣᵀ)(UᵣDVᵣᵀ) = UᵣDVᵣᵀ = A.
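These identities are among the defining (Penrose) properties of the pseudoinverse, so they can be checked directly against numpy.linalg.pinv, with a random A standing in for the reduced SVD factors:

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((5, 3))

A_plus = np.linalg.pinv(A)

proj_col = A @ A_plus        # orthogonal projection onto Col A
proj_row = A_plus @ A        # orthogonal projection onto Row A
```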
13. a. If b = Ax, then x⁺ = A⁺b = A⁺Ax. By Exercise 12(b), x⁺ is the orthogonal projection of x onto Row A.
14. The least-squares solutions of Ax = b are precisely the solutions of Ax = b̂, where b̂ is the orthogonal projection of b onto Col A. From Exercise 13, the minimum length solution of Ax = b̂ is A⁺b̂.
15. [M] The reduced SVD of A is A = UᵣDVᵣᵀ, where

D = [ 9.84443     0      0 ]
    [    0     2.62466   0 ]
    [    0        0      … ]

and Uᵣ and Vᵣ are the matrices of left and right singular vectors. So the pseudoinverse A⁺ = VᵣD⁻¹Uᵣᵀ may be calculated, as well as the solution x̂ = A⁺b for the system Ax = b. Row reducing the augmented matrix for the system Aᵀz = x̂ shows that this system has a solution, so x̂ is in Row A. Any solution of Ax = b has the form x̂ + u, where u = ca₁ + da₂ for a basis {a₁, a₂} of Nul A. One computes that ‖x̂‖² = 131/50, while ‖x̂ + u‖² = 131/50 + 2c² + 2d², so x̂ is the minimum length solution of Ax = b.
16. [M] The reduced SVD of A is A = UᵣDVᵣᵀ, where

D = [ 12.9536     0      0 ]
    [    0     1.44553   0 ]
    [    0        0      … ]

and Uᵣ and Vᵣ are the matrices of left and right singular vectors. So the pseudoinverse A⁺ = VᵣD⁻¹Uᵣᵀ may be calculated, as well as the solution x̂ = A⁺b for the system Ax = b. Row reducing the augmented matrix for the system Aᵀz = x̂ shows that this system has a solution, so x̂ is in Row A and, as in Exercise 15, x̂ is the minimum length solution of Ax = b.