
CS 1571 Introduction to AI
Lecture 24

Milos Hauskrecht
milos@cs.pitt.edu
5329 Sennott Square

Monte Carlo Inference
Decision making in the presence of uncertainty

Monte Carlo approaches

  • MC approximation:
    • The probability is approximated using sample frequencies
    • Example:

      P̃(B = T, J = T) = N_{B=T, J=T} / N = (# samples with B = T, J = T) / (total # samples)

  • BBN sampling:
    • One sample gives one assignment of values to all variables

[Figure: alarm network with nodes B (Burglary), E (Earthquake), A (Alarm), J (JohnCalls), M (MaryCalls)]

Generate a sample in a top-down manner, following the links.

BBN sampling example

[Figure: alarm network — Burglary and Earthquake are the parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls]

P(B = T) = 0.001        P(E = T) = 0.002

P(A | B, E):
  B E | A=T    A=F
  T T | 0.95   0.05
  T F | 0.94   0.06
  F T | 0.29   0.71
  F F | 0.001  0.999

P(J | A):
  A | J=T   J=F
  T | 0.90  0.10
  F | 0.05  0.95

P(M | A):
  A | M=T   M=F
  T | 0.70  0.30
  F | 0.01  0.99

Sampling proceeds top-down, following the links: first B = F is drawn from P(B), then E = F from P(E), then A = F from P(A | B=F, E=F), and finally J = F from P(J | A=F) and M = F from P(M | A=F).

Sample: (B = F, E = F, A = F, J = F, M = F)
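To make the procedure concrete, here is a minimal Python sketch of top-down (ancestral) sampling for this network, together with the frequency estimate of P(B = T, J = T) from the formula above. The CPT values are the ones in the tables; the root priors P(B) = 0.001 and P(E) = 0.002 are the standard values for this textbook network, and the function names are illustrative, not part of the lecture.

```python
import random

# Alarm-network CPTs from the tables above
P_B, P_E = 0.001, 0.002                             # P(B=T), P(E=T) -- assumed standard values
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(A=T | B, E)
P_J = {True: 0.90, False: 0.05}                     # P(J=T | A)
P_M = {True: 0.70, False: 0.01}                     # P(M=T | A)

def sample_network():
    """Draw one sample top-down, following the links: B, E, then A, then J and M."""
    b = random.random() < P_B
    e = random.random() < P_E
    a = random.random() < P_A[(b, e)]
    j = random.random() < P_J[a]
    m = random.random() < P_M[a]
    return b, e, a, j, m

def mc_estimate(n=100_000):
    """Approximate P(B=T, J=T) as (# samples with B=T, J=T) / (total # samples)."""
    hits = 0
    for _ in range(n):
        b, _, _, j, _ = sample_network()
        if b and j:
            hits += 1
    return hits / n

print(mc_estimate())
```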

Monte Carlo approaches

  • MC approximation of conditional probabilities:
    • The probability is approximated using sample frequencies
    • Example:
  • Rejection sampling:
    • Generate samples from the full joint by sampling the BBN
    • Use only samples that agree with the condition; the remaining samples are rejected
  • Problem: many samples can be rejected

P̃(B = T | J = T) = N_{B=T, J=T} / N_{J=T} = (# samples with B = T, J = T) / (# samples with J = T)
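A matching sketch of rejection sampling for P(B = T | J = T); the network definition is repeated (same assumed CPTs as above) so the snippet runs on its own. With these priors P(J = T) ≈ 0.05, so roughly 95% of the samples are rejected, which illustrates the problem noted above.

```python
import random

# Alarm-network CPTs (same as in the sampling example above)
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def sample_network():
    """One top-down sample of (B, E, A, J, M) from the full joint."""
    b = random.random() < P_B
    e = random.random() < P_E
    a = random.random() < P_A[(b, e)]
    return b, e, a, random.random() < P_J[a], random.random() < P_M[a]

def rejection_estimate(n=500_000):
    """Approximate P(B=T | J=T): keep only samples with J=T, count B=T among them."""
    kept = hits = 0
    for _ in range(n):
        b, _, _, j, _ = sample_network()
        if not j:
            continue          # sample disagrees with the condition: reject it
        kept += 1
        if b:
            hits += 1
    return hits / kept if kept else float("nan")

print(rejection_estimate())
```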

BBN likelihood weighting example

[Figure: the same alarm network and CPTs as in the sampling example above]

Evidence: J = T (set!), M = F (set!)

  • The evidence variables are clamped to their observed values; only the remaining variables B, E, A are sampled, top-down, following the links.
  • First sample: B, E, and A are drawn from their CPTs exactly as in forward sampling, with J = T and M = F held fixed throughout.
  • Second sample: B = F, E = F, A = F are drawn, again with J = T and M = F fixed. The sample is weighted by the probability of the evidence J = T, M = F in combination with B = F, E = F, A = F:

    weight = P(J = T | A = F) · P(M = F | A = F) = 0.05 × 0.99 = 0.0495

Likelihood weighting

  • Assume we have generated the following M samples, each written as (B, E, A, J, M) with J = T and M = F fixed by the evidence:

    (F, F, F, T, F)
    (F, F, F, T, F)
    (T, F, F, T, F)
    (F, F, F, T, F)

  • How to make the samples consistent? Weight each sample by the probability with which it agrees with the conditioning evidence P(e):

    (F, F, F, T, F): weight = P(J = T | A = F) · P(M = F | A = F) = 0.05 × 0.99 = 0.0495
    (T, F, F, T, F): weight = P(J = T | A = F) · P(M = F | A = F) = 0.05 × 0.99 = 0.0495

    Because the weight depends only on the sampled value of A, both sample patterns receive the same weight here.
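A minimal sketch of the likelihood-weighting estimator for P(B = T | J = T, M = F), under the same assumed CPTs as above. For any sample with A = F the weight comes out to 0.05 × 0.99 = 0.0495, matching the example; with A = T it would be 0.90 × 0.30 = 0.27.

```python
import random

# Alarm-network CPTs (same as above)
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}   # P(J=T | A)
P_M = {True: 0.70, False: 0.01}   # P(M=T | A)

def weighted_sample():
    """Sample only the non-evidence variables B, E, A; clamp J=T and M=F.
    The weight is the probability of the clamped evidence given its parents."""
    b = random.random() < P_B
    e = random.random() < P_E
    a = random.random() < P_A[(b, e)]
    w = P_J[a] * (1.0 - P_M[a])    # P(J=T | a) * P(M=F | a)
    return b, w

def lw_estimate(n=100_000):
    """Approximate P(B=T | J=T, M=F) as a weight-normalized sample frequency."""
    num = den = 0.0
    for _ in range(n):
        b, w = weighted_sample()
        den += w
        if b:
            num += w
    return num / den

print(lw_estimate())
```

Unlike rejection sampling, every generated sample contributes to the estimate; the weights do the work of enforcing consistency with the evidence.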

Decision-making in the presence of uncertainty

  • Computing the probability of some event may not be our ultimate goal
  • Instead, we are often interested in making decisions about our future actions so that our goals are satisfied
  • Example: medicine
    • Diagnosis is typically only the first step
    • The ultimate goal is to manage the patient in the best possible way. Typically, many options are available:
      • surgery, medication, collecting new information (lab tests)
    • There is uncertainty in the outcomes of these procedures: the patient can improve, get worse, or even die as a result of different management choices.

Decision making example.

Assume we want to invest $100 for 6 months.

  • We have 4 choices:

1. Invest in Stock 1
2. Invest in Stock 2
3. Put the money in the bank
4. Keep the money at home

[Figure: decision tree with branches Stock 1, Stock 2, Bank, and Home, showing the monetary outcomes for the up and down states]

Stock 1 value can go up or down:
  Up: with probability 0.6
  Down: with probability 0.4


Decision making example.

We need to make a choice: invest in Stock 1 or Stock 2, put the money into the bank, or keep it at home. But how?

[Figure: the same decision tree, with monetary outcomes for the different scenarios]

Decision making example.

Assume the simplified problem with the Bank and Home choices only. The result is guaranteed: the outcome is deterministic.

  Bank: $101
  Home: $100

What is the rational choice, assuming our goal is to make money?

Decision making. Stochastic outcome

[Figure: Stock 1 with stochastic (up)/(down) outcomes; Bank with a deterministic outcome of 101]

  • How to quantify the goodness of the stochastic outcome? We want to compare it to deterministic and other stochastic outcomes.

Idea: Use the expected value of the outcome.

Expected value

  • Let X be a random variable representing the monetary outcome, with a discrete set of values Ω_X.
  • The expected value of X is:

    E(X) = Σ_{x ∈ Ω_X} x · P(X = x)

  • Intuition: the expected value summarizes all stochastic outcomes in a single quantity.
  • Example: the expected value of the outcome of the Stock 1 option is:

    0.6 × 110 + 0.4 × 90 = 66 + 36 = 102
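As a quick sanity check, the same computation in a few lines of Python; the helper name expected_value is ours, not from the lecture.

```python
# E(X) = sum over x in Omega_X of x * P(X = x)
def expected_value(dist):
    """dist maps each outcome x to its probability P(X = x)."""
    return sum(x * p for x, p in dist.items())

stock1 = {110: 0.6, 90: 0.4}     # up and down outcomes of the Stock 1 option
print(expected_value(stock1))    # 0.6*110 + 0.4*90 = 102.0
```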

Expected values

Investing $100 for 6 months

[Figure: the decision tree with an expected value (probability × outcome, summed over the up and down states) attached to each option: Stock 1, Stock 2, Bank, and Home]
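To close the example, a sketch that ranks the four options by expected value. The Stock 1, Bank, and Home numbers come from the slides; the Stock 2 outcomes and probabilities are hypothetical placeholders, used only for illustration.

```python
def expected_value(dist):
    """Expected value of a discrete outcome distribution {outcome: probability}."""
    return sum(x * p for x, p in dist.items())

options = {
    "Stock 1": {110: 0.6, 90: 0.4},   # from the slides: E = 102
    "Stock 2": {140: 0.4, 80: 0.6},   # hypothetical values, for illustration only
    "Bank":    {101: 1.0},            # deterministic: $101
    "Home":    {100: 1.0},            # deterministic: $100
}

for name, dist in options.items():
    print(f"{name}: E = {expected_value(dist):.1f}")

best = max(options, key=lambda name: expected_value(options[name]))
print("Highest expected value:", best)
```

The rational choice under the expected-value criterion is simply the option whose expected monetary outcome is largest.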