








MAT 477 - Probability Theory Practice Final Exam with solutions
Problem 1A: You randomly choose a password. The password must contain exactly 8 letters from the set {A, B, C, D}.
P(a letter appears ≥ 3 times) = 1 − P(every letter appears ≤ twice) = 1 − P(every letter appears exactly twice) = 1 − 8!/((2!)^4 · 4^8),

since with 8 letters drawn from a 4-letter alphabet, "every letter at most twice" forces every letter to appear exactly twice.

X + Y ∼ Binom(8, 1/2), so Var(X + Y) = 8 · (1/2) · (1/2) = 2. Since Var(X) = Var(Y) = 8 · (1/4) · (3/4) = 3/2,

Cov(X, Y) = [Var(X + Y) − Var(X) − Var(Y)]/2 = (2 − 3/2 − 3/2)/2 = −1/2.

The total number of possible passwords is 8!/(1! · 2! · 2! · 3!) = 1680. There are 5!/(1! · 2! · 2!) = 30 ways to arrange the 5 A's, B's and C's. There are then 6 gaps for the 3 D's, and so C(6, 3) = 20 ways to place the D's with no two adjacent. Therefore the required probability is (30 · 20)/1680 = 5/14 ≈ 0.357. There are 4! ways to order the four blocks A, B, C, D, and so the probability that identical letters appear together is 4!/1680 = 1/70. The exact probability is given by pX(4) for X ∼ Bin(100, 1/70). We use the Poisson approximation: taking λ = np = 100/70 = 10/7 ≈ 1.428, the probability is ≈ (1.428^4/4!) · e^{−1.428}.
Problem 1B: You randomly choose a password. The password must contain exactly 8 letters from the set {A, B, C, D}.
P(a letter appears ≥ 3 times) = 1 − P(every letter appears ≤ twice) = 1 − P(every letter appears exactly twice) = 1 − 8!/((2!)^4 · 4^8).

X + Y ∼ Binom(8, 1/2), so Var(X + Y) = 8 · (1/2) · (1/2) = 2. Since Var(X) = Var(Y) = 8 · (1/4) · (3/4) = 3/2,

Cov(X, Y) = [Var(X + Y) − Var(X) − Var(Y)]/2 = (2 − 3/2 − 3/2)/2 = −1/2.

The total number of possible passwords is 8!/(1! · 3! · 1! · 3!) = 1120. There are 5!/(1! · 3! · 1!) = 20 ways to arrange the 5 A's, D's and C's. There are then 6 gaps for the 3 B's, and so C(6, 3) = 20 ways to place the B's with no two adjacent. Therefore the required probability is (20 · 20)/1120 = 5/14 ≈ 0.357. There are 4! ways to order the four blocks A, B, C, D, and so the probability that identical letters appear together is 4!/1120 = 3/140 ≈ 0.021. The exact probability is given by pX(3) for X ∼ Bin(100, 3/140). We use the Poisson approximation: taking λ = np = 300/140 = 15/7 ≈ 2.142, the probability is ≈ (2.142^3/3!) · e^{−2.142}.
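The quality of the Poisson approximation used above can be checked directly against the exact binomial pmf (a quick sketch, not part of the original solution):

```python
from math import comb, exp, factorial

# Problem 1B: X ~ Bin(100, 3/140); exact p_X(3) vs Poisson(lambda = 15/7) at k = 3
n, p, k = 100, 3 / 140, 3
exact = comb(n, k) * p**k * (1 - p) ** (n - k)
lam = n * p
approx = lam**k / factorial(k) * exp(-lam)
print(round(exact, 4), round(approx, 4))  # both ≈ 0.19
```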
Problem 2B: Alice and Bob perform a sequence of independent trials. In each trial, each of them randomly chooses one number from {1, 2, 3, 4, 5}, and the outcome of each trial is the absolute difference between Alice's number and Bob's number; for example, if Alice picks 1 and Bob picks 5 then the outcome is |1 − 5| = 4.

(a) The probability that the first outcome is 2 is ____. (b) The probability that the first outcome is 0 is ____. (c) The probability that the first outcome is neither 0 nor 2 is ____.
The outcome is 2 when the chosen numbers are (1, 3), (2, 4), (3, 5), (3, 1), (4, 2), (5, 3). There are 25 equally likely pairs of picks, therefore P(outcome = 2) = 6/25. Similarly P(outcome = 0) = 5/25, so P(outcome is neither 0 nor 2) = 14/25, and the expected value is 4/(5/25) = 20.

P(E) = P(E | F1) P(F1) + P(E | F2) P(F2) + P(E | F3) P(F3) = 1 · (5/25) + 0 · (6/25) + P(E) · (14/25),

where P(E | F3) = P(E) because E and F3 are independent. Solving for P(E) we get P(E) = (5/25)/(1 − 14/25) = 5/11.
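Reading F1, F2, F3 as "the first outcome is 0", "is 2", "is neither" (my interpretation of the unspecified events, so that E is "0 occurs before 2"), the answers 6/25, 5/25 and 5/11 can be checked by simulation:

```python
import random

def outcome():
    # one trial: absolute difference of two independent picks from {1,...,5}
    return abs(random.randint(1, 5) - random.randint(1, 5))

random.seed(0)
n = 100_000
draws = [outcome() for _ in range(n)]
print(draws.count(2) / n, draws.count(0) / n)  # ≈ 6/25 = 0.24, 5/25 = 0.20

# P(an outcome of 0 occurs before an outcome of 2); expect ≈ 5/11 ≈ 0.4545
wins = 0
for _ in range(n):
    while True:
        d = outcome()
        if d in (0, 2):
            wins += (d == 0)
            break
print(wins / n)
```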
Problem 2C: Alice and Bob perform a sequence of independent trials. In each trial, each of them randomly chooses one number from {1, 2, 3, 4, 5}, and the outcome of each trial is the absolute difference between Alice's number and Bob's number; for example, if Alice picks 1 and Bob picks 5 then the outcome is |1 − 5| = 4.

(a) The probability that the first outcome is 2 is ____. (b) The probability that the first outcome is 4 is ____. (c) The probability that the first outcome is neither 2 nor 4 is ____.
The outcome is 2 when the chosen numbers are (1, 3), (2, 4), (3, 5), (3, 1), (4, 2), (5, 3). There are 25 equally likely pairs of picks, therefore P(outcome = 2) = 6/25. Similarly P(outcome = 4) = 2/25, so P(outcome is neither 2 nor 4) = 17/25, and the expected value is 4/(2/25) = 50.

P(E) = P(E | F1) P(F1) + P(E | F2) P(F2) + P(E | F3) P(F3) = 1 · (6/25) + 0 · (2/25) + P(E) · (17/25),

where P(E | F3) = P(E) because E and F3 are independent. Solving for P(E) we get P(E) = (6/25)/(1 − 17/25) = 6/8 = 3/4.
Problem 3B: Let X and Y be continuous random variables with joint density
fX,Y(x, y) = (1/2) · (3y + x) if y ≤ 2 − x, y ≤ x and y ≥ 0, and 0 otherwise.

(a) fY(y) = ____ for ____ ≤ y ≤ ____ and zero otherwise. (b) E(Y) = ____. (c) Var(Y) = ____.

(a) The approximating random variable is ____ (choose between Bernoulli, binomial, Poisson, geometric, uniform, exponential, normal). (b) The approximation is ____.
The support of the density is bounded by the lines y = 0, y = x and y = 2 − x, and is thus a triangle. For 0 ≤ y ≤ 1 we have

fY(y) = (1/2) ∫_{x=y}^{2−y} (3y + x) dx = 1 + 2y − 3y².

Then

E(Y) = ∫₀¹ y (1 + 2y − 3y²) dy = 5/12,
E(Y²) = ∫₀¹ y² (1 + 2y − 3y²) dy = 7/30,

and Var(Y) = 7/30 − 25/144 = 43/720 ≈ 0.0597. By the CLT we can use a normal distribution to approximate the required probability:

≈ 1 − Φ(1.05) ≈ 0.1469.
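The marginal's moments can be double-checked by exact polynomial integration over [0, 1] (a stdlib-only sketch using Fraction arithmetic):

```python
from fractions import Fraction as F

def poly_int01(coeffs):
    # exact integral over [0, 1] of sum_k coeffs[k] * y**k
    return sum(F(c) / (k + 1) for k, c in enumerate(coeffs))

# f_Y(y) = 1 + 2y - 3y^2 on [0, 1], from Problem 3B
fY = [1, 2, -3]
print(poly_int01(fY))            # 1  -> a valid density
EY = poly_int01([0] + fY)        # multiplying by y shifts the coefficients
EY2 = poly_int01([0, 0] + fY)    # multiplying by y^2
print(EY, EY2 - EY * EY)         # 5/12 43/720
```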
Problem 3C: Let X and Y be continuous random variables with joint density
fX,Y(x, y) = (3/10) · (y + 3x) if y ≤ 2 − x, y ≤ x and y ≥ 0, and 0 otherwise.

(a) fY(y) = ____ for ____ ≤ y ≤ ____ and zero otherwise. (b) E(Y) = ____. (c) Var(Y) = ____.

(a) The approximating random variable is ____ (choose between Bernoulli, binomial, Poisson, geometric, uniform, exponential, normal). (b) The approximation is ____.
The support of the density is bounded by the lines y = 0, y = x and y = 2 − x, and is thus a triangle. For 0 ≤ y ≤ 1 we have

fY(y) = (3/10) ∫_{x=y}^{2−y} (y + 3x) dx = (3/5)(3 − 2y − y²).

Then

E(Y) = (3/5) ∫₀¹ y (3 − 2y − y²) dy = 7/20,
E(Y²) = (3/5) ∫₀¹ y² (3 − 2y − y²) dy = 9/50,

and Var(Y) = 9/50 − 49/400 = 23/400 = 0.0575. By the CLT we can use a normal distribution to approximate the required probability:

≈ 1 − Φ(1.32) ≈ 0.0934.
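The two normal-tail values quoted in Problems 3B and 3C, 1 − Φ(1.05) and 1 − Φ(1.32), follow from the standard-normal CDF, which can be written with math.erf:

```python
from math import erf, sqrt

def Phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

print(round(1 - Phi(1.05), 4))  # 0.1469  (Problem 3B)
print(round(1 - Phi(1.32), 4))  # 0.0934  (Problem 3C)
```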
Problem 4B: A tree has a 2 meter long branch. An apple grows on a random point X on the branch, and independently, a bird lands on a random point Y on the branch. The joint density function of X and Y is given by
fX,Y(x, y) = c · (xy + 2) if 0 ≤ x, y ≤ 2, and 0 otherwise.
To compute the constant c we solve

1 = c ∫₀² ∫₀² (xy + 2) dx dy = c ∫₀² (2y + 4) dy = 12c,

so c = 1/12. We solve

c ∫₀² ∫₀ˣ (xy + 2) dy dx = c ∫₀² (x³/2 + 2x) dx = (1/12) · 6 = 1/2.

Next,

fY|X=1(y) = fY|X(y | 1) = fX,Y(1, y)/fX(1) = c(y + 2) / (c ∫₀² (y + 2) dy) = (y + 2)/6

for 0 ≤ y ≤ 2, and

∫₀¹ (y + 2)/6 dy = 5/12.

Finally, the expected distance is

E|X − Y| = ∫₀² ∫₀² |x − y| · (xy + 2)/12 dx dy.
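The conditional density (y + 2)/6 derived above can be verified by inverse-transform sampling (a sketch; the closed-form inverse below is my own computation):

```python
import random
from math import sqrt

# Y | X = 1 has density (y + 2)/6 on [0, 2], with CDF F(y) = (y^2/2 + 2y)/6.
# Solving F(y) = u for u in (0, 1) gives y = -2 + sqrt(4 + 12u).
random.seed(0)
n = 200_000
ys = [-2 + sqrt(4 + 12 * random.random()) for _ in range(n)]
print(round(sum(v <= 1 for v in ys) / n, 3))  # ≈ 5/12 ≈ 0.417
```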
Problem 4C: A tree has a 2 meter long branch. An apple grows on a random point X on the branch, and independently, a bird lands on a random point Y on the branch. The joint density function of X and Y is given by
fX,Y(x, y) = c · (xy + 3) if 0 ≤ x, y ≤ 2, and 0 otherwise.
To compute the constant c we solve

1 = c ∫₀² ∫₀² (xy + 3) dx dy = c ∫₀² (2y + 6) dy = 16c,

so c = 1/16. We solve

c ∫₀² ∫₀ˣ (xy + 3) dy dx = c ∫₀² (x³/2 + 3x) dx = (1/16) · 8 = 1/2.

Next,

fY|X=1(y) = fY|X(y | 1) = fX,Y(1, y)/fX(1) = c(y + 3) / (c ∫₀² (y + 3) dy) = (y + 3)/8

for 0 ≤ y ≤ 2, and

∫₀¹ (y + 3)/8 dy = 7/16.

Finally, the expected distance is

E|X − Y| = ∫₀² ∫₀² |x − y| · (xy + 3)/16 dx dy.
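The final integral can be estimated by rejection sampling (a sketch; the reference value 19/30 ≈ 0.633 is my own exact evaluation of the integral by splitting the square at x = y, not a value from the source):

```python
import random

# Rejection sampling from f(x, y) = (xy + 3)/16 on [0, 2]^2 (Problem 4C);
# the density is maximised at (2, 2), where it equals 7/16.
random.seed(0)
pts = []
while len(pts) < 50_000:
    x, y = random.uniform(0, 2), random.uniform(0, 2)
    if random.uniform(0, 7 / 16) < (x * y + 3) / 16:
        pts.append((x, y))
print(round(sum(abs(x - y) for x, y in pts) / len(pts), 3))  # ≈ 0.633
```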
Problem 5B: Let X > 0 be a positive random variable, with E (X) = 8 and Var (X) = 2.
(a) According to Markov P(X ≥ 16) ≤ ____. (b) According to Chebyshev P(6 < X < 10) ≥ ____.

By Markov, P(X ≥ 16) ≤ E(X)/16 = 8/16 = 1/2.

By Chebyshev, P(6 < X < 10) = 1 − P(|X − 8| ≥ 2) ≥ 1 − Var(X)/2² = 1 − 2/4 = 1/2.

Now let Y be a random variable with Y | X = x ∼ Exp(1/x). We want to use the law of total expectation to find Cov(X, Y).

(a) E(Y|X) = ____. (b) E(Y) = ____. (c) E(XY) = ____. (d) Cov(X, Y) = ____.

Since Y | X = x ∼ Exp(1/x), we have E(Y | X = x) = 1/(1/x) = x. Thus E(Y|X) = X. By LTE,

E(Y) = E(E(Y | X)) = E(X) = 8.

To find Cov(X, Y) we first compute E(XY) using the same method:

E(XY) = E(E(XY | X)) = E(X · E(Y | X)) = E(X²) = Var(X) + (E(X))² = 2 + 8²,

and so Cov(X, Y) = E(XY) − E(X) E(Y) = 2 + 8² − 8² = 2.
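The identity Cov(X, Y) = Var(X) holds for any positive X with these moments; a simulation sketch, where X ∼ Gamma(32, 1/4) is an arbitrary choice made only to match E(X) = 8 and Var(X) = 2:

```python
import random

# X ~ Gamma(shape=32, scale=1/4): E(X) = 8, Var(X) = 2 (an assumed choice);
# Y | X = x ~ Exp(rate 1/x), so E(Y | X = x) = x.
random.seed(0)
n = 300_000
xs = [random.gammavariate(32, 0.25) for _ in range(n)]
ys = [random.expovariate(1 / x) for x in xs]
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
print(round(mx, 2), round(my, 2), round(cov, 2))  # expect ≈ 8, 8, 2
```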
Problem 5C: Let X > 0 be a positive random variable, with E (X) = 12 and Var (X) = 3.
(a) According to Markov P(X ≥ 16) ≤ ____. (b) According to Chebyshev P(9 < X < 15) ≥ ____.

By Markov, P(X ≥ 16) ≤ E(X)/16 = 12/16 = 3/4.

By Chebyshev, P(9 < X < 15) = 1 − P(|X − 12| ≥ 3) ≥ 1 − Var(X)/3² = 1 − 3/9 = 2/3.

Now let Y be a random variable with Y | X = x ∼ Exp(1/x). We want to use the law of total expectation to find Cov(X, Y).

(a) E(Y|X) = ____. (b) E(Y) = ____. (c) E(XY) = ____. (d) Cov(X, Y) = ____.

Since Y | X = x ∼ Exp(1/x), we have E(Y | X = x) = 1/(1/x) = x. Thus E(Y|X) = X. By LTE,

E(Y) = E(E(Y | X)) = E(X) = 12.

To find Cov(X, Y) we first compute E(XY) using the same method:

E(XY) = E(E(XY | X)) = E(X · E(Y | X)) = E(X²) = Var(X) + (E(X))² = 3 + 12²,

and so Cov(X, Y) = E(XY) − E(X) E(Y) = 3 + 12² − 12² = 3.