Solutions To Problems of Chapter 17
17.1. Let
\mu := E[f(x)] = \int f(x)p(x)\,dx
and q(x) be the proposal distribution. Show that if
w(x) := \frac{p(x)}{q(x)},
and
\hat{\mu} := \frac{1}{N}\sum_{i=1}^{N} w(x_i)f(x_i), \quad x_i \sim q(x),
we have that \hat{\mu} is unbiased, hence
E[\hat{\mu}] = \mu.
Solution: We have
E[\hat{\mu}] = \frac{1}{N}\sum_{i=1}^{N} E[w(x_i)f(x_i)] = \int \frac{p(x)}{q(x)} f(x)\,q(x)\,dx = \int f(x)p(x)\,dx = \mu,
where E[\cdot] is with respect to q. Hence,
E(\hat{\mu}-\mu)^2 = E[\hat{\mu}^2] + \mu^2 - 2\mu E[\hat{\mu}] = E[\hat{\mu}^2] - \mu^2.
However,
E[\hat{\mu}^2] = \frac{1}{N^2}\left( N\int \frac{p^2(x)f^2(x)}{q(x)}\,dx + (N^2-N)\left(\int \frac{p(x)f(x)}{q(x)}\,q(x)\,dx\right)^2 \right).
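The unbiasedness claim is easy to check numerically. Below is a minimal sketch, assuming (hypothetically) a standard normal target p = N(0, 1), a wider normal proposal q = N(0, 4), and f(x) = x^2, so that \mu = E_p[x^2] = 1; none of these choices come from the problem statement.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: target p = N(0, 1), proposal q = N(0, 2^2),
# integrand f(x) = x^2, so mu = E_p[x^2] = 1.
def p(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def q(x):
    return np.exp(-x**2 / (2 * 4)) / np.sqrt(2 * np.pi * 4)

def is_estimate(N):
    x = rng.normal(0.0, 2.0, size=N)     # x_i ~ q
    w = p(x) / q(x)                      # w(x_i) = p(x_i) / q(x_i)
    return np.mean(w * x**2)             # mu_hat = (1/N) sum_i w(x_i) f(x_i)

# Averaging many independent estimates: the grand mean should be close
# to mu = 1, illustrating E[mu_hat] = mu.
estimates = [is_estimate(1000) for _ in range(500)]
print(np.mean(estimates))
```

The spread of the individual estimates around their mean is exactly the variance E[\hat{\mu}^2] - \mu^2 analyzed above.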
17.2. In importance sampling, with weights defined as
w(x) = \frac{\phi(x)}{q(x)},
where \phi(x) is the unnormalized version of the target, p(x) = \phi(x)/Z.
17.3. Show that when resampling is used in importance sampling, then, as the number of particles tends to infinity, the distribution \bar{p} defined by the respective discrete random measure tends to the true (desired) one, p.
Hint: Consider the one-dimensional case.
Solution: Recall from the text that, after resampling, each resampled particle is drawn i.i.d. from the discrete random measure defined by the normalized importance weights.
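The one-dimensional claim can be illustrated with a short simulation. A sketch under assumed choices (target p = N(0, 1), particles drawn from q = N(0, 4), multinomial resampling; all hypothetical): after resampling, the particle cloud should have mean close to 0 and variance close to 1, matching p.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D illustration: particles drawn from q = N(0, 2^2),
# weighted toward p = N(0, 1), then resampled.
N = 200_000
x = rng.normal(0.0, 2.0, size=N)            # particles ~ q
log_w = -x**2 / 2 + x**2 / 8                # log p(x) - log q(x), up to constants
w = np.exp(log_w - log_w.max())
w /= w.sum()                                # normalized weights

# Resampling: draw N i.i.d. particles from the discrete measure {x_i, w_i}.
idx = rng.choice(N, size=N, p=w)
resampled = x[idx]

# The resampled cloud should look like p = N(0, 1).
print(resampled.mean(), resampled.var())
```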
17.4. Show that in sequential importance sampling, the proposal distribution that minimizes the variance of the weight at time n, conditioned on x_{1:n-1}, is given by
q_n^{opt}(x_n|x_{1:n-1}) = p_n(x_n|x_{1:n-1}).
Solution: Note from the text that we have
w(x_{1:n}) = w(x_{1:n-1}) \frac{\phi_n(x_{1:n})}{\phi_{n-1}(x_{1:n-1})\, q_n(x_n|x_{1:n-1})},
from the respective definition. Since E[w(x_{1:n}) \mid x_{1:n-1}] does not depend on q_n, minimizing the conditional variance amounts to minimizing E[w^2(x_{1:n}) \mid x_{1:n-1}], whose q_n-dependent part is \int \frac{\phi_n^2(x_{1:n})}{q_n(x_n|x_{1:n-1})}\,dx_n. Introducing a Lagrange multiplier \lambda for the constraint \int q_n(x_n|x_{1:n-1})\,dx_n = 1 and setting the derivative with respect to q_n equal to zero yields
\frac{\phi_n^2(x_{1:n})}{q_n^2(x_n|x_{1:n-1})} = \lambda \Rightarrow
q_n(x_n|x_{1:n-1}) = \frac{1}{\sqrt{\lambda}}\,\phi_n(x_{1:n}) \propto \phi_n(x_{1:n}),
and normalizing over x_n gives q_n^{opt}(x_n|x_{1:n-1}) = p_n(x_n|x_{1:n-1}).
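A one-step numerical check makes the result concrete. In the sketch below, the incremental (unnormalized) factor is taken to be \phi_n/\phi_{n-1} = \exp(-x_n^2/2), i.e., an unnormalized N(0, 1) increment, and the proposal is N(0, \sigma^2); these modeling choices are assumptions for illustration. With the optimal proposal (\sigma = 1) the incremental weight is constant, so its variance is (numerically) zero; a mismatched proposal gives strictly positive variance.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical one-step check: the unnormalized increment is
# phi_n / phi_{n-1} = exp(-x_n^2 / 2), an (unnormalized) N(0, 1) factor.
def incremental_weight(x, sigma):
    # u = [phi_n / phi_{n-1}] / q_n(x_n), with q_n = N(0, sigma^2)
    q = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    return np.exp(-x**2 / 2) / q

for sigma in (1.0, 2.0):          # sigma = 1 is the optimal proposal here
    x = rng.normal(0.0, sigma, size=100_000)
    print(sigma, incremental_weight(x, sigma).var())
```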
17.5. In a sequential importance sampling task, let
p_n(x_{1:n}) = \prod_{k=1}^{n} N(x_k|0,1),
\phi_n(x_{1:n}) = \prod_{k=1}^{n} \exp\left(-\frac{x_k^2}{2}\right),
and
q_n(x_{1:n}) = \frac{1}{(2\pi)^{n/2}(\sigma^2)^{n/2}} \prod_{k=1}^{n} \exp\left(-\frac{x_k^2}{2\sigma^2}\right).
Show that
var[\hat{Z}_n] = \frac{Z_n^2}{N}\left( \left(\frac{\sigma^4}{2\sigma^2-1}\right)^{n/2} - 1 \right).
Observe that for \sigma^2 > 1/2, which is the range of values for which the above formula makes sense and guarantees a finite value for the variance, the factor \frac{\sigma^4}{2\sigma^2-1} is never smaller than one, with equality only for \sigma^2 = 1; hence the variance grows exponentially with n unless \sigma^2 = 1.
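The variance formula can be verified by simulation. The sketch below takes N = 1 (a single weight \phi_n/q_n per draw) with assumed values n = 3 and \sigma^2 = 1.5, and compares the empirical variance of \hat{Z}_n against the closed-form expression; here Z_n = \int \phi_n\,dx_{1:n} = (2\pi)^{n/2}.

```python
import numpy as np

rng = np.random.default_rng(3)

n, sigma2 = 3, 1.5          # chain length and proposal variance (sigma^2 > 1/2)
S = 400_000                 # number of independent draws of x_{1:n} ~ q_n

x = rng.normal(0.0, np.sqrt(sigma2), size=(S, n))   # x_{1:n} ~ q_n
ss = np.sum(x**2, axis=1)
phi = np.exp(-0.5 * ss)                             # phi_n(x_{1:n})
qn = (2 * np.pi * sigma2) ** (-n / 2) * np.exp(-ss / (2 * sigma2))
u = phi / qn                                        # single-sample estimate of Z_n (N = 1)

Zn = (2 * np.pi) ** (n / 2)                         # Z_n = (2 pi)^{n/2}
predicted = Zn**2 * ((sigma2**2 / (2 * sigma2 - 1)) ** (n / 2) - 1)

# Empirical variance vs. the closed-form prediction: the two should agree.
print(u.var(), predicted)
```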
17.6. Prove that the use of the optimal proposal distribution in particle filtering leads to
w_n(x_{1:n}) = w_{n-1}(x_{1:n-1})\, p(y_n|x_{n-1}).
Solution: Recall that the general update recursion is given by
w_n(x_{1:n}) = w_{n-1}(x_{1:n-1}) \frac{p(y_n|x_n)\, p(x_n|x_{n-1})}{q_n(x_n|x_{n-1}, y_n)}.
Substituting the optimal proposal,
q_n(x_n|x_{n-1}, y_n) = p(x_n|x_{n-1}, y_n) = \frac{p(y_n|x_n)\, p(x_n|x_{n-1})}{p(y_n|x_{n-1})},
the ratio reduces to p(y_n|x_{n-1}), which proves the claim.
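The cancellation can be checked numerically for a scalar linear-Gaussian model, x_n = a x_{n-1} + v_n with v_n ~ N(0, q_v) and y_n = x_n + e_n with e_n ~ N(0, r); the particular values of a, q_v, r, x_{n-1}, y_n below are arbitrary illustrative choices. The ratio p(y_n|x_n)p(x_n|x_{n-1})/p(x_n|x_{n-1}, y_n) should be the same for every x_n and equal p(y_n|x_{n-1}) = N(y_n | a x_{n-1}, q_v + r).

```python
import numpy as np

def gauss(x, m, v):
    # N(x | m, v) density
    return np.exp(-(x - m)**2 / (2 * v)) / np.sqrt(2 * np.pi * v)

# Hypothetical scalar linear-Gaussian model (all parameter values assumed):
# x_n = a x_{n-1} + v_n, v_n ~ N(0, qv);  y_n = x_n + e_n, e_n ~ N(0, r)
a, qv, r = 0.9, 0.5, 0.2
x_prev, y = 1.3, 0.7

# Optimal proposal p(x_n | x_{n-1}, y_n): a product of two Gaussians in x_n
post_var = 1.0 / (1.0 / qv + 1.0 / r)
post_mean = post_var * (a * x_prev / qv + y / r)

xn = np.linspace(-2.0, 2.0, 9)   # several candidate values of x_n
ratio = gauss(y, xn, r) * gauss(xn, a * x_prev, qv) / gauss(xn, post_mean, post_var)

# The incremental weight is the same for every x_n and equals p(y_n | x_{n-1}).
print(ratio)
print(gauss(y, a * x_prev, qv + r))
```

This independence of the weight from the sampled x_n is exactly why the optimal proposal minimizes the conditional weight variance (Problem 17.4).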