Solutions to Assignment 6 - Statistical Signal Processing | ECE 567, Assignments of Electrical and Electronics Engineering

Material Type: Assignment; Class: Statistical Signal Processing; Subject: Electrical and Computer Engr; University: Illinois Institute of Technology; Term: Spring 2008;

Typology: Assignments

Pre 2010

Uploaded on 08/18/2009

koofers-user-r6y
koofers-user-r6y 🇺🇸

10 documents

1 / 4

Toggle sidebar

This page cannot be seen from the preview

Don't miss anything!

ECE 567 STATISTICAL SIGNAL PROCESSING SPRING 2008

Homework Assignment #6

Solutions

1. We have

$$p(\mathbf{x};\boldsymbol\theta) = (2\pi\sigma^2)^{-N/2}\exp\left\{-\frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\left[x(n)-\sum_{k=0}^{p-1}A_k n^k\right]^2\right\}$$

$$\ln p(\mathbf{x};\boldsymbol\theta) = -\frac{N}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\left[x(n)-\sum_{k=0}^{p-1}A_k n^k\right]^2$$

$$\frac{\partial \ln p(\mathbf{x};\boldsymbol\theta)}{\partial A_\ell} = \frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x(n)-\sum_{k=0}^{p-1}A_k n^k\right)n^\ell$$

$$\frac{\partial^2 \ln p(\mathbf{x};\boldsymbol\theta)}{\partial A_m\,\partial A_\ell} = -\frac{1}{\sigma^2}\sum_{n=0}^{N-1} n^m n^\ell$$

$$-E\left[\frac{\partial^2 \ln p(\mathbf{x};\boldsymbol\theta)}{\partial A_m\,\partial A_\ell}\right] = \frac{1}{\sigma^2}\sum_{n=0}^{N-1} n^{m+\ell}$$

so that

$$I(\boldsymbol\theta) = \frac{1}{\sigma^2}\begin{bmatrix} N & \sum_{n=0}^{N-1} n & \cdots & \sum_{n=0}^{N-1} n^{p-1} \\ \sum_{n=0}^{N-1} n & \sum_{n=0}^{N-1} n^2 & \cdots & \sum_{n=0}^{N-1} n^{p} \\ \vdots & & \ddots & \vdots \\ \sum_{n=0}^{N-1} n^{p-1} & \sum_{n=0}^{N-1} n^{p} & \cdots & \sum_{n=0}^{N-1} n^{2p-2} \end{bmatrix}$$
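As a numerical sanity check (not part of the original solution), the Fisher information matrix above has entries $[I(\boldsymbol\theta)]_{m\ell} = \frac{1}{\sigma^2}\sum_n n^{m+\ell}$, which is exactly $\mathbf{H}^T\mathbf{H}/\sigma^2$ for the linear model with $H[n,k] = n^k$. The sizes `N`, `p`, and `sigma2` below are arbitrary illustrative choices:

```python
import numpy as np

# Arbitrary illustrative sizes (assumptions, not from the assignment)
N, p, sigma2 = 10, 3, 2.0

# Element-wise formula: [I(theta)]_{m,l} = (1/sigma^2) * sum_n n^(m+l)
n = np.arange(N)
I_formula = np.array([[n.astype(float) ** (m + l) for l in range(p)]
                      for m in range(p)]).sum(axis=-1) / sigma2

# Equivalent linear-model form: H[n,k] = n^k, so I(theta) = H^T H / sigma^2
H = np.vander(n, p, increasing=True).astype(float)
I_linear = H.T @ H / sigma2

assert np.allclose(I_formula, I_linear)
```

The agreement of the two constructions reflects that this is a linear Gaussian model, for which the Fisher information reduces to $\mathbf{H}^T\mathbf{H}/\sigma^2$.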

2. We have

$$\mathbf{x} = \begin{bmatrix}\mathbf{x}(0)\\ \vdots\\ \mathbf{x}(N-1)\end{bmatrix} \sim \mathcal{N}\!\left(\boldsymbol\mu,\ \underbrace{\begin{bmatrix}C & 0 & \cdots & 0\\ 0 & C & \ddots & \vdots\\ \vdots & \ddots & \ddots & 0\\ 0 & \cdots & 0 & C\end{bmatrix}}_{\mathbf{C}(\rho)}\right),\qquad C = \begin{bmatrix}1 & \rho\\ \rho & 1\end{bmatrix},$$

where each $\mathbf{x}(n)$ is a $2\times 1$ subvector with common $2\times 2$ covariance $C$. Using (3.32) from the text we have

$$I(\rho) = \frac{1}{2}\,\mathrm{tr}\!\left[\left(\mathbf{C}^{-1}(\rho)\,\frac{\partial\mathbf{C}(\rho)}{\partial\rho}\right)^{\!2}\right]$$

since the mean doesn't depend on $\rho$. Then

$$\frac{\partial\mathbf{C}(\rho)}{\partial\rho} = \mathrm{blkdiag}\!\left(\begin{bmatrix}0&1\\1&0\end{bmatrix},\ldots,\begin{bmatrix}0&1\\1&0\end{bmatrix}\right),\qquad \mathbf{C}^{-1}(\rho) = \mathrm{blkdiag}\!\left(\frac{1}{1-\rho^2}\begin{bmatrix}1&-\rho\\-\rho&1\end{bmatrix},\ldots\right)$$

so

$$\mathbf{C}^{-1}(\rho)\,\frac{\partial\mathbf{C}(\rho)}{\partial\rho} = \mathrm{blkdiag}\!\left(\frac{1}{1-\rho^2}\begin{bmatrix}-\rho&1\\1&-\rho\end{bmatrix},\ldots\right)$$

$$\left(\mathbf{C}^{-1}(\rho)\,\frac{\partial\mathbf{C}(\rho)}{\partial\rho}\right)^{\!2} = \mathrm{blkdiag}\!\left(\frac{1}{(1-\rho^2)^2}\begin{bmatrix}1+\rho^2&-2\rho\\-2\rho&1+\rho^2\end{bmatrix},\ldots\right)$$

and then, summing the trace over the $N$ identical blocks,

$$I(\rho) = \frac{1}{2}\,\mathrm{tr}\!\left[\left(\mathbf{C}^{-1}(\rho)\,\frac{\partial\mathbf{C}(\rho)}{\partial\rho}\right)^{\!2}\right] = \frac{1}{2}\cdot\frac{2N(1+\rho^2)}{(1-\rho^2)^2} = \frac{N(1+\rho^2)}{(1-\rho^2)^2}$$

so that finally

$$\mathrm{var}[\hat\rho] \ge \frac{(1-\rho^2)^2}{N(1+\rho^2)}.$$

3. (a) The data model is

$$\underbrace{\begin{bmatrix}x(0)\\ x(1)\\ \vdots\\ x(N-1)\end{bmatrix}}_{\mathbf{x}} = \underbrace{\begin{bmatrix}1 & 1 & \cdots & 1\\ r_1 & r_2 & \cdots & r_p\\ \vdots & & & \vdots\\ r_1^{N-1} & r_2^{N-1} & \cdots & r_p^{N-1}\end{bmatrix}}_{\mathbf{H}}\begin{bmatrix}A_1\\ A_2\\ \vdots\\ A_p\end{bmatrix} + \begin{bmatrix}w(0)\\ w(1)\\ \vdots\\ w(N-1)\end{bmatrix}$$

and then

$$\hat{\boldsymbol\theta} = (\mathbf{H}^T\mathbf{H})^{-1}\mathbf{H}^T\mathbf{x},\qquad \mathbf{C}_{\hat{\boldsymbol\theta}} = \sigma^2(\mathbf{H}^T\mathbf{H})^{-1}.$$
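The least-squares estimator in part (a) can be sketched numerically. The bases `r`, amplitudes `A_true`, and noise level below are hypothetical illustrative values, not the assignment's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustrative parameters (not from the assignment)
N, sigma2 = 50, 0.1
r = np.array([0.9, 0.7, 0.5])        # assumed exponential bases r_i
A_true = np.array([1.0, -2.0, 0.5])  # assumed true amplitudes A_i

# H[n, i] = r_i^n: each column is a sampled exponential, n = 0..N-1
n = np.arange(N)[:, None]
H = r[None, :] ** n

# Simulate x = H theta + w and form the estimator and its covariance
x = H @ A_true + rng.normal(scale=np.sqrt(sigma2), size=N)
theta_hat = np.linalg.solve(H.T @ H, H.T @ x)  # (H^T H)^{-1} H^T x
C_theta = sigma2 * np.linalg.inv(H.T @ H)      # sigma^2 (H^T H)^{-1}
```

Note that the first row of `H` is all ones since $r_i^0 = 1$, matching $x(0) = \sum_i A_i + w(0)$.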

(b) We get

$$\hat{\mathbf{C}}_{\hat{\boldsymbol\theta}} = \hat\sigma^2(\mathbf{H}^T\mathbf{H})^{-1}$$

[numerical matrices not recoverable from the preview]

(c)

$$\hat\theta = 3.0125$$