Solutions To Problems of Chapter 19
19.1. Show that the second principal component in PCA is given as the eigenvector corresponding to the second largest eigenvalue.
Solution: As pointed out in the text, the following optimization task is in order,

    \text{maximize}\ v^T \hat{\Sigma} v,    (1)
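As a numerical sanity check, here is a short sketch (numpy, with a randomly generated sample covariance, so all names below are illustrative) showing that maximizing v^T \hat{\Sigma} v over unit vectors orthogonal to the first eigenvector is solved by the eigenvector of the second largest eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random symmetric positive semi-definite "sample covariance" matrix.
X = rng.standard_normal((100, 5))
S = X.T @ X / 100

# Eigendecomposition; numpy returns eigenvalues in ascending order.
vals, vecs = np.linalg.eigh(S)
u1, u2 = vecs[:, -1], vecs[:, -2]  # first and second principal directions

# Maximizing v^T S v over unit v orthogonal to u1 is the top-eigenvector
# problem for the deflated matrix S - lambda_1 u1 u1^T.
S_defl = S - vals[-1] * np.outer(u1, u1)
vals_d, vecs_d = np.linalg.eigh(S_defl)
v_star = vecs_d[:, -1]

# The maximizer coincides with u2 (up to sign) and attains lambda_2.
assert np.isclose(abs(v_star @ u2), 1.0)
assert np.isclose(v_star @ S @ v_star, vals[-2])
```

Deflation works because subtracting \lambda_1 u_1 u_1^T zeroes the top eigenvalue while leaving all other eigenpairs untouched.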
19.2. Show that the pair of directions, associated with CCA, which maximize the respective correlation coefficient, satisfy the following pair of relations,

    \Sigma_{xy} u_y = \lambda \Sigma_{xx} u_x,
    \Sigma_{yx} u_x = \lambda \Sigma_{yy} u_y.
Solution: The corresponding Lagrangian is given by

    L(u_x, u_y, \lambda_x, \lambda_y) = u_x^T \Sigma_{xy} u_y - \frac{\lambda_x}{2}\left(u_x^T \Sigma_{xx} u_x - 1\right) - \frac{\lambda_y}{2}\left(u_y^T \Sigma_{yy} u_y - 1\right),
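The coupled conditions can be decoupled by whitening each view. The sketch below (numpy, with hypothetical randomly generated data) forms B = \Sigma_{xx}^{-1/2}\Sigma_{xy}\Sigma_{yy}^{-1/2}, takes its SVD, maps the singular vectors back to u_x, u_y, and verifies that the pair of relations holds with \lambda equal to the top singular value:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample covariances of two correlated zero-mean views x (3-dim), y (2-dim).
X = rng.standard_normal((500, 3))
Y = X[:, :2] + 0.5 * rng.standard_normal((500, 2))
Sxx, Syy = X.T @ X / 500, Y.T @ Y / 500
Sxy = X.T @ Y / 500
Syx = Sxy.T

def inv_sqrt(S):
    """Symmetric inverse square root of an SPD matrix via eigh."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(w ** -0.5) @ V.T

# Whitening turns the coupled relations into an ordinary SVD problem.
B = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
U, s, Vt = np.linalg.svd(B)
lam = s[0]                      # maximal canonical correlation
ux = inv_sqrt(Sxx) @ U[:, 0]    # map singular vectors back to directions
uy = inv_sqrt(Syy) @ Vt[0]

# Both stationarity conditions of the Lagrangian hold for (ux, uy, lam).
assert np.allclose(Sxy @ uy, lam * Sxx @ ux)
assert np.allclose(Syx @ ux, lam * Syy @ uy)
```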
19.3. Establish the arguments that verify the convergence of the k-SVD.
Solution: Let us first assume that we can perform the sparse coding stage perfectly; that is, we can retrieve the best approximation to x_n, n = 1, 2, \ldots, N.
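A minimal sketch of the argument, assuming exact 1-sparse coding (so the coding stage really is solved optimally, as the solution assumes): each of the two stages, coding with the dictionary fixed and atom-by-atom rank-1 SVD refits, cannot increase the fit error, so the error sequence is non-increasing and, being bounded below by zero, converges. All data and sizes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: 50 signals in R^8, a dictionary of 5 unit-norm atoms.
X = rng.standard_normal((8, 50))
D = rng.standard_normal((8, 5))
D /= np.linalg.norm(D, axis=0)

def sparse_code_1(D, X):
    """Exact 1-sparse coding: for unit-norm atoms the best single-atom
    approximation of x uses the atom with the largest |d_k^T x|."""
    C = D.T @ X
    idx = np.abs(C).argmax(axis=0)
    A = np.zeros((D.shape[1], X.shape[1]))
    A[idx, np.arange(X.shape[1])] = C[idx, np.arange(X.shape[1])]
    return A

errors = []
for _ in range(10):
    A = sparse_code_1(D, X)                # coding stage (optimal)
    for k in range(D.shape[1]):            # dictionary-update stage
        users = np.flatnonzero(A[k] != 0)  # signals using atom k
        if users.size == 0:
            continue
        A[k, users] = 0.0
        E = X[:, users] - D @ A[:, users]  # residual without atom k
        U, s, Vt = np.linalg.svd(E, full_matrices=False)
        D[:, k] = U[:, 0]                  # best rank-1 refit via SVD
        A[k, users] = s[0] * Vt[0]
    errors.append(np.linalg.norm(X - D @ A))

# Neither stage can increase the error, so the recorded sequence is
# non-increasing; bounded below by zero, it must converge.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(errors, errors[1:]))
```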
19.4. Prove that (19.83) and (19.89) are the same.
Proof: We have that

    \hat{z} = \frac{1}{\sigma^2}\Sigma_{z|x}A^T x = \frac{\sigma^2}{\sigma^2}\left(\sigma^2 I + A^T A\right)^{-1}A^T x,

or
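The identity can be checked numerically, assuming the standard PPCA posterior covariance \Sigma_{z|x} = \sigma^2(\sigma^2 I + A^T A)^{-1} (all matrices below are randomly generated placeholders):

```python
import numpy as np

rng = np.random.default_rng(3)

# Random PPCA-style model: A maps a latent R^2 into the observed R^4.
A = rng.standard_normal((4, 2))
x = rng.standard_normal(4)
sigma2 = 0.3

# Posterior covariance: Sigma_z|x = sigma^2 (sigma^2 I + A^T A)^{-1}.
M = sigma2 * np.eye(2) + A.T @ A
Sigma_zx = sigma2 * np.linalg.inv(M)

# The two expressions for the posterior mean coincide.
z1 = (1.0 / sigma2) * Sigma_zx @ A.T @ x
z2 = np.linalg.inv(M) @ A.T @ x
assert np.allclose(z1, z2)
```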
19.5. Show that the ML PPCA tends to PCA as \sigma^2 \to 0.
Solution: Recall that PCA relies on the transformation

    y = \tilde{A}^T x,

\tilde{A}^T.
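The limit can be illustrated numerically. In the sketch below (assuming \tilde{A} has orthonormal columns, obtained here from a QR factorization of a random matrix), the PPCA posterior mean (\sigma^2 I + \tilde{A}^T\tilde{A})^{-1}\tilde{A}^T x approaches the PCA projection \tilde{A}^T x as \sigma^2 \to 0:

```python
import numpy as np

rng = np.random.default_rng(4)

# A_tilde with orthonormal columns, as in the PCA transformation.
A_tilde, _ = np.linalg.qr(rng.standard_normal((5, 2)))
x = rng.standard_normal(5)
y_pca = A_tilde.T @ x

def ppca_mean(sigma2):
    # PPCA posterior mean: (sigma^2 I + A~^T A~)^{-1} A~^T x.
    M = sigma2 * np.eye(2) + A_tilde.T @ A_tilde
    return np.linalg.solve(M, A_tilde.T @ x)

# Since A~^T A~ = I, the mean equals y_pca / (1 + sigma^2), so the gap
# to the PCA projection shrinks monotonically as sigma^2 -> 0.
gaps = [np.linalg.norm(ppca_mean(s2) - y_pca) for s2 in (1.0, 1e-3, 1e-8)]
assert gaps[0] > gaps[1] > gaps[2]
assert np.allclose(ppca_mean(1e-12), y_pca)
```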
19.6. Show Eqs. (19.91)-(19.92).
Solution: Our starting point is the relations given in the text,

    Q(A, \beta; A^{(j)}, \beta^{(j)}) = \sum_{n=1}^{N}\Big(\frac{l}{2}\ln\beta - \frac{1}{2}\|\mu^{(j)}_{z|x}(n)\|^2 - \frac{1}{2}\mathrm{trace}\{\Sigma^{(j)}_{z|x}\} - \frac{\beta}{2}\|x_n - A\mu^{(j)}_{z|x}(n)\|^2 - \frac{\beta}{2}\mathrm{trace}\{A\Sigma^{(j)}_{z|x}A^T\}\Big) + C,    (5)

where C is a constant and

    \mu^{(j)}_{z|x}(n) = \beta^{(j)}\Sigma^{(j)}_{z|x}A^{(j)T}x_n, \quad \Sigma^{(j)}_{z|x} = \left(I + \beta^{(j)}A^{(j)T}A^{(j)}\right)^{-1}.
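The stated posterior moments \mu_{z|x}(n) and \Sigma_{z|x} can be cross-checked against direct Gaussian conditioning in the underlying linear-Gaussian model z \sim N(0, I), x = Az + e, e \sim N(0, \beta^{-1}I); the agreement is a consequence of the matrix-inversion (Woodbury) and push-through identities. A numerical sketch with placeholder dimensions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Linear-Gaussian model: z ~ N(0, I_2), x = A z + e, e ~ N(0, (1/beta) I_4).
A = rng.standard_normal((4, 2))
beta = 2.5
x = rng.standard_normal(4)

# Moments as written in the EM derivation.
Sigma = np.linalg.inv(np.eye(2) + beta * A.T @ A)
mu = beta * Sigma @ A.T @ x

# Direct conditioning on the joint Gaussian of (z, x): cov(z, x) = A^T,
# cov(x) = A A^T + (1/beta) I, hence the textbook conditioning formulas.
Cx = A @ A.T + (1.0 / beta) * np.eye(4)
mu_direct = A.T @ np.linalg.solve(Cx, x)
Sigma_direct = np.eye(2) - A.T @ np.linalg.solve(Cx, A)

assert np.allclose(mu, mu_direct)
assert np.allclose(Sigma, Sigma_direct)
```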
Eq. (5) is rewritten as

    Q(A, \beta; A^{(j)}, \beta^{(j)}) = \frac{lN\ln\beta}{2} - \frac{\beta}{2}\sum_{n=1}^{N}x_n^T x_n + \beta\,\mathrm{trace}\Big(A^T\sum_{n=1}^{N}x_n\mu^T(n)\Big) - \frac{\beta}{2}\,\mathrm{trace}\Big(A\Big(\sum_{n=1}^{N}\mu(n)\mu^T(n)\Big)A^T\Big) - \frac{N\beta}{2}\,\mathrm{trace}\{A\Sigma^{(j)}_{z|x}A^T\} + C,
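The two forms of Q differ only by how the terms that are constant in (A, \beta) are grouped; expanding \|x_n - A\mu(n)\|^2 and summing over n yields the rewritten expression. This can be checked numerically (random placeholder data, with the constant tracked explicitly):

```python
import numpy as np

rng = np.random.default_rng(6)

l, m, N, beta = 4, 2, 30, 1.7
A = rng.standard_normal((l, m))
X = rng.standard_normal((N, l))               # row n is x_n^T
Sz = np.linalg.inv(np.eye(m) + beta * A.T @ A)  # Sigma_z|x
Mu = beta * (X @ A) @ Sz                       # row n is mu(n)^T

# Per-sample form of Q, as summed over n.
q_sum = sum(
    0.5 * l * np.log(beta)
    - 0.5 * (mu @ mu + np.trace(Sz))
    - 0.5 * beta * (np.sum((x - A @ mu) ** 2) + np.trace(A @ Sz @ A.T))
    for x, mu in zip(X, Mu)
)

# Rewritten form: the mu-norm and trace terms that do not depend on
# (A, beta) are collected into the constant.
const = -0.5 * np.sum(Mu ** 2) - 0.5 * N * np.trace(Sz)
q_rw = (
    0.5 * l * N * np.log(beta)
    - 0.5 * beta * np.sum(X ** 2)
    + beta * np.trace(A.T @ (X.T @ Mu))
    - 0.5 * beta * np.trace(A @ (Mu.T @ Mu) @ A.T)
    - 0.5 * N * beta * np.trace(A @ Sz @ A.T)
    + const
)
assert np.isclose(q_sum, q_rw)
```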
19.7. Show equation (19.102).
Solution: Our starting point is the eigenvalue/eigenvector equation

    \Sigma u = \lambda u,    (7)

where

    \sum_{n=1}^{N}
19.8. Show that the number of degrees of freedom of a rank-r matrix is equal to r(l_1 + l_2) - r^2.
Solution: A rank-r matrix can be expressed via its SVD, as in (19.119). The
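The count r(l_1 + l_2) - r^2 can be illustrated numerically: parametrize rank-r matrices as X = AB^T with A of size l_1 \times r and B of size l_2 \times r, giving r(l_1 + l_2) raw parameters minus the r^2-dimensional gauge freedom (A, B) \to (AG, BG^{-T}). The rank of the Jacobian of this parametrization at a generic point equals the degrees of freedom. A sketch with small, hypothetical dimensions:

```python
import numpy as np

rng = np.random.default_rng(7)

l1, l2, r = 5, 4, 2
A = rng.standard_normal((l1, r))
B = rng.standard_normal((l2, r))

# Jacobian of (dA, dB) -> dA B^T + A dB^T, written as a matrix acting
# on the r(l1 + l2) flattened perturbations.
J = np.zeros((l1 * l2, r * (l1 + l2)))
for i in range(l1 * r):
    dA = np.zeros(l1 * r); dA[i] = 1.0
    J[:, i] = (dA.reshape(l1, r) @ B.T).ravel()
for i in range(l2 * r):
    dB = np.zeros(l2 * r); dB[i] = 1.0
    J[:, l1 * r + i] = (A @ dB.reshape(l2, r).T).ravel()

# Its rank is the dimension of the manifold of rank-r matrices.
assert np.linalg.matrix_rank(J) == r * (l1 + l2) - r ** 2
```

The r^2-dimensional kernel of the Jacobian corresponds exactly to the directions (AM, -BM^T), which leave AB^T unchanged to first order.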