Lecture 6: Expected Value, Variance, and Covariance of Random Variables, Study notes of Mathematical Statistics

Lecture notes from the Sta230/Mth230 course taught by Colin Rundel. The lecture covers random variables, expected value, variance, and covariance, including definitions and properties for discrete and continuous random variables, with examples such as the Bernoulli, Binomial, Hypergeometric, and Poisson distributions.


Lecture 6: E(X), Var(X), & Cov(X, Y)

Sta230/Mth230

Colin Rundel

February 5, 2014

Chapter 3.1-3.3 Random Variables

Random Variables

We have been using them for a while now in a variety of forms, but it is good to explicitly define what we mean.

Random Variable
A real-valued function on the sample space Ω

Example: If Ω is the 36-element space resulting from rolling two fair six-sided dice (r and g), then the following are all random variables:

X(r, g) = r
Y(r, g) = |r − g|
Z(r, g) = r + g


Random Variables, cont.

Random variables are in essence a fancy way of describing an event.

Previous example:

Ω = {(r, g) : 1 ≤ r, g ≤ 6}
Y(r, g) = |r − g|

What is the event for P(Y = 1) in terms of ω ∈ Ω?
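To make this concrete, here is a small enumeration sketch (not part of the original notes) that lists the outcomes making up the event {ω ∈ Ω : Y(ω) = 1} and computes its probability:

# Enumerate the 36-outcome sample space for two fair dice (r = red, g = green)
from fractions import Fraction

omega = [(r, g) for r in range(1, 7) for g in range(1, 7)]

# The event {Y = 1} consists of the outcomes with |r - g| = 1
event = [(r, g) for (r, g) in omega if abs(r - g) == 1]

print(event)                              # (1, 2), (2, 1), (2, 3), (3, 2), ... : 10 outcomes
print(Fraction(len(event), len(omega)))   # P(Y = 1) = 10/36 = 5/18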


Random Variables - Vocabulary

Range of a random variable: the set of all possible values.

Distribution of a random variable: a specification of P(X ∈ A) for every set A. If X has a countable range then we can define f(x) = P(X = x), so that P(X ∈ A) = ∑_{x ∈ A} f(x).

Expected Value

The expected value of a random variable is defined as follows:

Discrete random variable:

E[X] = ∑_{all x} x · P(X = x)

Continuous random variable:

E[X] = ∫_{all x} x · f(x) dx, where f is the density of X

This is a natural generalization of what we do when deciding if a casino game is fair.
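As an illustration (not from the original slides), the discrete formula applied to a single fair die, plus a hypothetical casino-style check: pay $1 to play and win $4 whenever the die shows a six.

from fractions import Fraction

# E[X] = sum over all x of x * P(X = x), for a fair six-sided die
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
EX = sum(x * p for x, p in pmf.items())
print(EX)        # 7/2, i.e. 3.5

# Fairness check for the hypothetical game: net winnings are 4 - 1 on a six, -1 otherwise
net = {x: (4 - 1 if x == 6 else -1) for x in range(1, 7)}
print(sum(net[x] * pmf[x] for x in range(1, 7)))   # -1/3, so the game is not fair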


Properties of Expected Value

Constants - E(c) = c if c is constant

Indicators - E(IA) = P(A) where IA is an indicator function

Functions - E[g(X)] = ∑_{all x} g(x) P(X = x) if X is discrete, and E[g(X)] = ∫_{all x} g(x) f(x) dx if X is continuous (where f is the density)

Constant Factors - E(cX) = cE(X)

Addition - E(X + Y) = E(X) + E(Y)

Multiplication - E(XY) = E(X)E(Y) if X and Y are independent.
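A quick Monte Carlo sanity check of the addition and multiplication properties (an illustrative sketch, not from the notes), using a fair die and an independent Uniform(0, 1) draw:

import random

random.seed(1)
n = 200_000
X = [random.randint(1, 6) for _ in range(n)]   # fair die
Y = [random.random() for _ in range(n)]        # Uniform(0, 1), generated independently of X

def mean(values):
    return sum(values) / len(values)

# Addition holds for any X and Y: E(X + Y) = E(X) + E(Y)
print(mean([x + y for x, y in zip(X, Y)]), mean(X) + mean(Y))

# Multiplication needs independence: E(XY) = E(X)E(Y)
print(mean([x * y for x, y in zip(X, Y)]), mean(X) * mean(Y))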


Variance

Another common property of random variables we are interested in is the variance, which measures the expected squared deviation from the mean.

Var(X) = E[(X − E(X))^2] = E[(X − μ)^2]

One common simplification:

Var(X) = E[(X − μ)^2]
       = E(X^2 − 2μX + μ^2)
       = E(X^2) − 2μE(X) + μ^2
       = E(X^2) − μ^2

Standard Deviation: SD(X) = √Var(X)
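A small numerical check (not in the original notes) that the definition and the shortcut E(X^2) − μ^2 agree, again for a fair die:

from fractions import Fraction
import math

pmf = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())                          # E(X) = 7/2

var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())         # E[(X - mu)^2]
var_short = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2    # E(X^2) - mu^2

print(var_def, var_short)     # both 35/12
print(math.sqrt(var_def))     # SD(X) is about 1.708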


Properties of Variance

What is Var(aX + b) when a and b are constants?

Which gives us:
Var(aX) = a^2 Var(X)
Var(X + c) = Var(X)
Var(c) = 0
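These follow from one short calculation (filled in here for reference), writing μ = E(X) so that E(aX + b) = aμ + b:

Var(aX + b) = E[(aX + b − (aμ + b))^2] = E[a^2 (X − μ)^2] = a^2 E[(X − μ)^2] = a^2 Var(X)

Taking b = 0 gives Var(aX) = a^2 Var(X); taking a = 1, b = c gives Var(X + c) = Var(X); and taking a = 0, b = c gives Var(c) = 0.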

Properties of Variance, cont.

The completely general formula for the variance of a linear combination of n random variables is:

Var(∑_{i=1}^n ci Xi) = ∑_{i=1}^n ∑_{j=1}^n Cov(ci Xi, cj Xj)
                     = ∑_{i=1}^n ci^2 Var(Xi) + ∑_{i=1}^n ∑_{j=1, j≠i}^n ci cj Cov(Xi, Xj)

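As a sanity check on the two-variable special case Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y), here is a simulation sketch (not part of the notes) with deliberately correlated draws:

import random

random.seed(2)
n = 200_000
X = [random.gauss(0, 1) for _ in range(n)]
Y = [x + random.gauss(0, 0.5) for x in X]      # Y correlated with X by construction
a, b = 2.0, -3.0

def mean(values):
    return sum(values) / len(values)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

Z = [a * x + b * y for x, y in zip(X, Y)]
print(cov(Z, Z))                                                    # Var(aX + bY)
print(a**2 * cov(X, X) + b**2 * cov(Y, Y) + 2 * a * b * cov(X, Y))  # nearly identical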

Bernoulli Random Variable

Let X ∼ Bern(p), what are E(X) and Var(X)?

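The slides leave this as an in-class exercise; for reference, the standard answer using the shortcut formula above:

E(X) = 1 · p + 0 · (1 − p) = p

E(X^2) = 1^2 · p + 0^2 · (1 − p) = p, so Var(X) = E(X^2) − E(X)^2 = p − p^2 = p(1 − p)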

Binomial Random Variable

Let X ∼ Binom(n, p), what are E(X) and Var(X)?

We can redefine X = ∑_{i=1}^n Yi where Y1, . . . , Yn ∼ Bern(p), and since we are sampling with replacement all Yi and Yj are independent.
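For reference, this decomposition gives the answer directly:

E(X) = ∑_{i=1}^n E(Yi) = np by the addition property, and because the Yi are independent (so all the covariance terms vanish), Var(X) = ∑_{i=1}^n Var(Yi) = np(1 − p).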


Hypergeometric Random Variable - E(X)

Let's consider a simple case where we have an urn with m black marbles and N − m white marbles. Let Bi be an indicator variable for the ith draw being black.

Bi = 1 if the ith draw is black, and 0 otherwise.

In the case where N = 2 and m = 1, what is P(Bi = 1) for all i?

Ω = {BW, WB}

P(B1) = 1/2, P(B2) = 1/2

P(W1) = 1/2, P(W2) = 1/2

Hypergeometric Random Variable - E(X) - cont.

What about when N = 3 and m = 1?

Ω = {BW1W2, BW2W1, W1BW2, W2BW1, W1W2B, W2W1B}

P(B1) = 1/3, P(B2) = 1/3, P(B3) = 1/3

P(W1) = 2/3, P(W2) = 2/3, P(W3) = 2/3

Proposition

P(Bi = 1) = m/N for all i


Hypergeometric Random Variable - E(X) - cont.

Let X ∼ Hypergeo(N, m, n); then X = B1 + B2 + · · · + Bn.
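Combining this with the proposition P(Bi = 1) = m/N and the additivity of expectation (which does not require independence) gives the standard answer:

E(X) = E(B1) + E(B2) + · · · + E(Bn) = n · m/N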


Hypergeometric Random Variable - E(X) - 2nd way

Let X ∼ Hypergeo(N, m, n), what is E(X)?
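A direct numerical check (an illustrative sketch, not from the notes) that summing x · P(X = x) over the hypergeometric pmf reproduces nm/N, here for an arbitrarily chosen urn:

from math import comb
from fractions import Fraction

N, m, n = 10, 4, 3        # urn of N = 10 marbles, m = 4 black, draw n = 3 without replacement

def pmf(x):
    # P(X = x) = C(m, x) C(N - m, n - x) / C(N, n)
    return Fraction(comb(m, x) * comb(N - m, n - x), comb(N, n))

# x ranges over 0, ..., min(m, n) for this choice of N, m, n
EX = sum(x * pmf(x) for x in range(0, min(m, n) + 1))
print(EX, Fraction(n * m, N))      # both 6/5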


Hypergeometric Random Variable - Var(X)

Let X ∼ Hypergeo(N, m, n), what is Var(X)?
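For reference, the standard result (the derivation, via the covariance formula for a sum of indicators, is left to the lecture): writing p = m/N,

Var(X) = n p (1 − p) · (N − n)/(N − 1)

which is the binomial variance np(1 − p) multiplied by the finite population correction (N − n)/(N − 1), reflecting the negative covariance between draws made without replacement.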


St. Petersburg Lottery

We start with $1 on the table and a coin.

At each step: Toss the coin; if it shows Heads, take the money. If it shows Tails, I double the money on the table.

Let X be the amount you win; what is E(X)?

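For reference, the classical answer: if the first Heads appears after k Tails (probability (1/2)^{k+1}), the table holds 2^k dollars, so

E(X) = ∑_{k=0}^∞ 2^k · (1/2)^{k+1} = ∑_{k=0}^∞ 1/2 = ∞

The expected winnings are infinite even though every individual play pays a finite amount, which is the classical St. Petersburg paradox.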