MAT 477 - Probability Theory Final Exam, Exams of Mathematics

MAT 477 - Probability Theory Practice Final Exam with solutions

Typology: Exams

2021/2022

Uploaded on 09/28/2022 by rafaela-sofia-1


Final Exam [Fall 2020 section 03] Solutions

Problem 1A: You randomly choose a password. The password must contain exactly 8 letters from the set {A, B, C, D}.

  1. The probability that at least 1 of the letters appears at least 3 times in the password is ______.

P(a letter appears ≥ 3 times) = 1 − P(every letter appears at most twice). Since the password has 8 positions and 4 possible letters, "every letter appears at most twice" forces every letter to appear exactly twice, so this equals

1 − P(every letter appears exactly twice) = 1 − 8!/((2!)⁴ · 4⁸) = 7877/8192 ≈ 0.961.
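Since there are only 4⁸ = 65536 equally likely passwords, the count can be verified by exhaustive enumeration (a quick Python sanity check, not part of the original exam):

```python
from fractions import Fraction
from itertools import product
from collections import Counter
from math import factorial

total = 4 ** 8                              # equally likely passwords
all_twice = factorial(8) // (2 ** 4)        # 8!/(2!)^4 arrangements of AABBCCDD
p_formula = 1 - Fraction(all_twice, total)  # 1 - 2520/65536 = 7877/8192

# brute force: count passwords where some letter appears at least 3 times
hits = sum(1 for w in product("ABCD", repeat=8)
           if max(Counter(w).values()) >= 3)
p_brute = Fraction(hits, total)

print(p_formula, float(p_formula))  # 7877/8192 ≈ 0.9615
assert p_brute == p_formula
```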

  2. Let X be the number of times the letter A appears in the password, and Y the number of times the letter B appears in the password. Find
     (a) The distribution of X + Y is ______ (choose between Bernoulli, binomial, Poisson, geometric, uniform, exponential, normal) with parameter(s) ______.
     (b) Cov (X, Y ) = ______.

X + Y ∼ Binom(8, 1/2), so Var(X + Y) = 8 · (1/2) · (1/2) = 2. Since Var(X) = Var(Y) = 8 · (1/4) · (3/4) = 3/2,

Cov(X, Y) = [Var(X + Y) − Var(X) − Var(Y)]/2 = (2 − 3/2 − 3/2)/2 = −1/2.

  3. Assume from now on that the password contains exactly 1 A, 2 B's, 2 C's and 3 D's.
     (a) The total number of possible passwords is ______.
     (b) The probability that no two D's appear next to each other is ______.
     (c) We say that all same letters appear together if all letters of a certain type appear first, then all letters of another type, and so on. For example, in CCBBDDDA same letters appear together.
        i. The probability that all same letters appear together is ______.
        ii. We find an approximation for the probability that in 100 independent attempts of picking a password with exactly 1 A, 2 B's, 2 C's and 3 D's, exactly 3 of the passwords will have all same letters appearing together, assuming that the event that such a password is randomly chosen is considered a "rare" event.
           A. The appropriate approximating random variable is ______ (choose between Bernoulli, binomial, Poisson, geometric, uniform, exponential, normal) with parameter(s) ______.
           B. The approximated probability is ______.

The total number of possible passwords is the multinomial coefficient 8!/(1! 2! 2! 3!) = 1680.

There are 5!/(1! 2! 2!) = 30 ways to arrange the 1 A, 2 B's and 2 C's. These 5 letters leave 6 spaces (before, between and after them) for the 3 D's, and so C(6, 3) = 20 ways to place the D's with no two adjacent. Therefore the required probability is (30 · 20)/1680 = 5/14 ≈ 0.357.

There are 4! ways to order the blocks of A's, B's, C's and D's, and so the probability that all same letters appear together is 4!/1680 = 1/70.

The exact probability in part ii is pX(3) for X ∼ Bin(100, 1/70). Since this is a rare event we use the Poisson approximation: take λ = np = 100/70 = 10/7 ≈ 1.428; the probability is ≈ (1.428³/3!) · e^(−1.428) ≈ 0.116.
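All three answers in part 3 can be confirmed by brute force over the 1680 distinct arrangements (a Python check, not part of the exam):

```python
from fractions import Fraction
from itertools import permutations

# all 1680 distinct arrangements of 1 A, 2 B's, 2 C's, 3 D's
words = set(permutations("ABBCCDDD"))
assert len(words) == 1680

def no_two_Ds_adjacent(w):
    return all(not (a == "D" and b == "D") for a, b in zip(w, w[1:]))

def same_letters_together(w):
    # 4 distinct letters appear; they are "together" iff the word
    # consists of exactly 4 maximal runs of equal letters
    runs = 1 + sum(a != b for a, b in zip(w, w[1:]))
    return runs == 4

p_no_dd = Fraction(sum(map(no_two_Ds_adjacent, words)), 1680)
p_tog = Fraction(sum(map(same_letters_together, words)), 1680)
print(p_no_dd, p_tog)  # 5/14 and 1/70
```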

Problem 1B: You randomly choose a password. The password must contain exactly 8 letters from the set {A, B, C, D}.

  1. The probability that at least 1 of the letters appears at least 3 times in the password is ______.

P(a letter appears ≥ 3 times) = 1 − P(every letter appears at most twice). Since the password has 8 positions and 4 possible letters, "every letter appears at most twice" forces every letter to appear exactly twice, so this equals

1 − P(every letter appears exactly twice) = 1 − 8!/((2!)⁴ · 4⁸) = 7877/8192 ≈ 0.961.

  2. Let X be the number of times the letter C appears in the password, and Y the number of times the letter D appears in the password. Find
     (a) The distribution of X + Y is ______ (choose between Bernoulli, binomial, Poisson, geometric, uniform, exponential, normal) with parameter(s) ______.
     (b) Cov (X, Y ) = ______.

X + Y ∼ Binom(8, 1/2), so Var(X + Y) = 8 · (1/2) · (1/2) = 2. Since Var(X) = Var(Y) = 8 · (1/4) · (3/4) = 3/2,

Cov(X, Y) = [Var(X + Y) − Var(X) − Var(Y)]/2 = (2 − 3/2 − 3/2)/2 = −1/2.

  3. Assume from now on that the password contains exactly 1 A, 3 B's, 1 C and 3 D's.
     (a) The total number of possible passwords is ______.
     (b) The probability that no two B's appear next to each other is ______.
     (c) We say that same letters appear together if all letters of a certain type appear first, then all letters of another type, and so on. For example, in CCCBDDDA same letters appear together.
        i. The probability that same letters appear together is ______.
        ii. We find an approximation for the probability that in 100 independent attempts of picking a password with exactly 1 A, 3 B's, 1 C and 3 D's, exactly 3 of the passwords will have same letters appearing together, assuming that the event that such a password is randomly chosen is considered a "rare" event.
           A. The appropriate approximating random variable is ______ (choose between Bernoulli, binomial, Poisson, geometric, uniform, exponential, normal) with parameter(s) ______.
           B. The approximated probability is ______.

The total number of possible passwords is the multinomial coefficient 8!/(1! 3! 1! 3!) = 1120.

There are 5!/(1! 1! 3!) = 20 ways to arrange the 1 A, 1 C and 3 D's. These 5 letters leave 6 spaces (before, between and after them) for the 3 B's, and so C(6, 3) = 20 ways to place the B's with no two adjacent. Therefore the required probability is (20 · 20)/1120 = 5/14 ≈ 0.357.

There are 4! ways to order the blocks of A's, B's, C's and D's, and so the probability that same letters appear together is 4!/1120 = 3/140 ≈ 0.021.

The exact probability is pX(3) for X ∼ Bin(100, 3/140). We use the Poisson approximation: take λ = np = 15/7 ≈ 2.142; the probability is ≈ (2.142³/3!) · e^(−2.142) ≈ 0.192.

Problem 2B: Alice and Bob perform a sequence of independent trials. In each trial, each of them randomly chooses one number from {1, 2, 3, 4, 5}, and the outcome of each trial is the absolute difference between Alice's number and Bob's number; that is, if Alice picks 1 and Bob picks 5 then the outcome is |1 − 5| = 4.

  1. Compute the following:

(a) The probability that the first outcome is 2 is ______.
(b) The probability that the first outcome is 0 is ______.
(c) The probability that the first outcome is neither 0 nor 2 is ______.

The outcome is 2 when the chosen numbers are (1, 3), (2, 4), (3, 5), (3, 1), (4, 2), (5, 3). There are 25 equally likely pairs of picks, therefore P(outcome = 2) = 6/25. Similarly P(outcome = 0) = 5/25, and so P(neither 0 nor 2) = 14/25.

  2. The expected value of the first outcome is ______.

Let X be the corresponding r.v.; then pX (0) = 5/25, pX (1) = 8/25, pX (2) = 6/25, pX (3) = 4/25, pX (4) = 2/25, and E (X) = 0 · 5/25 + 1 · 8/25 + 2 · 6/25 + 3 · 4/25 + 4 · 2/25 = 40/25 = 1.6.

  3. The expected value of the trial number in which the outcome is 0 for the 4th time is ______. This is a r.v. ∼ NB(4, 5/25), and so the expected value is 4/(5/25) = 20.

  4. The probability that an outcome of 0 appears in the sequence before an outcome of 2 is ______.

Let E be the event that 0 appears before 2, F1 the event that the first outcome is 0, F2 that it is 2, and F3 that it is neither. By LTP,

P (E) = P (E | F1) P (F1) + P (E | F2) P (F2) + P (E | F3) P (F3) = 1 · 5/25 + 0 · 6/25 + P(E) · 14/25,

where P (E | F3) = P(E) because after an outcome that is neither 0 nor 2 the sequence restarts independently. Solving for P(E) we get P(E) = 5/11.
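The computations for Problem 2B can be verified with exact rational arithmetic; the race probability also follows from the shortcut P(0 before 2) = p₀/(p₀ + p₂), which agrees with the LTP answer (a Python check, not part of the exam):

```python
from fractions import Fraction
from itertools import product

# distribution of |A - B| for A, B independent uniform on {1,...,5}
pmf = {}
for a, b in product(range(1, 6), repeat=2):
    d = abs(a - b)
    pmf[d] = pmf.get(d, 0) + Fraction(1, 25)

ev = sum(d * q for d, q in pmf.items())   # expected first outcome
nb_mean = 4 / pmf[0]                      # trial of the 4th outcome 0
race = pmf[0] / (pmf[0] + pmf[2])         # P(0 before 2)
print(pmf[2], pmf[0], ev, nb_mean, race)  # 6/25, 1/5, 8/5, 20, 5/11
```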

Problem 2C: Alice and Bob perform a sequence of independent trials. In each trial, each of them randomly chooses one number from {1, 2, 3, 4, 5}, and the outcome of each trial is the absolute difference between Alice's number and Bob's number; that is, if Alice picks 1 and Bob picks 5 then the outcome is |1 − 5| = 4.

  1. Compute the following:

(a) The probability that the first outcome is 2 is ______.
(b) The probability that the first outcome is 4 is ______.
(c) The probability that the first outcome is neither 2 nor 4 is ______.

The outcome is 2 when the chosen numbers are (1, 3), (2, 4), (3, 5), (3, 1), (4, 2), (5, 3). There are 25 equally likely pairs of picks, therefore P(outcome = 2) = 6/25. Similarly P(outcome = 4) = 2/25, and so P(neither 2 nor 4) = 17/25.

  2. The expected value of the first outcome is ______.

Let X be the corresponding r.v.; then pX (0) = 5/25, pX (1) = 8/25, pX (2) = 6/25, pX (3) = 4/25, pX (4) = 2/25, and E (X) = 0 · 5/25 + 1 · 8/25 + 2 · 6/25 + 3 · 4/25 + 4 · 2/25 = 40/25 = 1.6.

  3. The expected value of the trial number in which the outcome is 4 for the 4th time is ______. This is a r.v. ∼ NB(4, 2/25), and so the expected value is 4/(2/25) = 50.

  4. The probability that an outcome of 2 appears in the sequence before an outcome of 4 is ______.

Let E be the event that 2 appears before 4, F1 the event that the first outcome is 2, F2 that it is 4, and F3 that it is neither. By LTP,

P (E) = P (E | F1) P (F1) + P (E | F2) P (F2) + P (E | F3) P (F3) = 1 · 6/25 + 0 · 2/25 + P(E) · 17/25,

where P (E | F3) = P(E) because after an outcome that is neither 2 nor 4 the sequence restarts independently. Solving for P(E) we get P(E) = 6/8 = 3/4.

Problem 3B: Let X and Y be continuous random variables with joint density

fX,Y (x, y) = (1/2) · (3y + x)   if y ≤ 2 − x, y ≤ x and y ≥ 0,
              0                  otherwise.

  1. The geometric shape of the domain of the density is ______ (choose between rectangle, trapezoid, triangle, parabola, other).
  2. Compute

(a) fY (y) = ______ for ______ ≤ y ≤ ______ and zero otherwise.
(b) E (Y ) = ______.
(c) Var (Y ) = ______.

  3. Let S = Y1 + · · · + Y60 be a sum of independent identically distributed random variables Yi, each distributed exactly like the random variable Y found above. We use a famous theorem to approximate P (S > 27).

(a) The approximating random variable is ______ (choose between Bernoulli, binomial, Poisson, geometric, uniform, exponential, normal).
(b) The approximation is ______.

The domain of the density is bounded by the lines y = 0, y = x and y = 2 − x, and is thus a triangle (with vertices (0, 0), (2, 0) and (1, 1)). For 0 ≤ y ≤ 1 we have

fY (y) = (1/2) ∫_{x=y}^{2−y} (3y + x) dx = 1 + 2y − 3y².

E (Y ) = ∫₀¹ y (1 + 2y − 3y²) dy = 1/2 + 2/3 − 3/4 = 5/12,

E (Y²) = ∫₀¹ y² (1 + 2y − 3y²) dy = 1/3 + 1/2 − 3/5 = 7/30,

and Var (Y ) = 7/30 − 25/144 = 43/720 ≈ 0.0597. By the CLT we can use a normal distribution to approximate the required probability, with E (S) = 60 · 5/12 = 25 and Var (S) = 60 · 43/720 = 43/12:

P (S > 27) = P (Y1 + · · · + Y60 > 27) = P ((S − 25)/√(43/12) > 2/√(43/12)) ≈ 1 − Φ(1.05) ≈ 0.1469.
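The moments of Y and the CLT estimate can be checked numerically with a midpoint-rule integration of the marginal 1 + 2y − 3y² (Python, for illustration; carrying more digits in z gives ≈ 0.145 rather than the rounded 1 − Φ(1.05) ≈ 0.1469):

```python
from math import erf, sqrt

def Phi(z):  # standard normal CDF via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

# midpoint-rule integration of fY(y) = 1 + 2y - 3y^2 on [0, 1]
n = 100_000
h = 1.0 / n
mass = ey = ey2 = 0.0
for i in range(n):
    y = (i + 0.5) * h
    fy = 1 + 2 * y - 3 * y * y
    mass += fy * h
    ey += y * fy * h
    ey2 += y * y * fy * h
var = ey2 - ey * ey
print(mass, ey, var)   # ≈ 1, 5/12 ≈ 0.4167, 43/720 ≈ 0.0597

# CLT: P(S > 27) with E(S) = 60*E(Y), Var(S) = 60*Var(Y)
z = (27 - 60 * ey) / sqrt(60 * var)
print(1 - Phi(z))      # ≈ 0.145
```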

Problem 3C: Let X and Y be continuous random variables with joint density

fX,Y (x, y) = (3/10) · (y + 3x)   if y ≤ 2 − x, y ≤ x and y ≥ 0,
              0                   otherwise.

  1. The geometric shape of the domain of the density is ______ (choose between rectangle, trapezoid, triangle, parabola, other).
  2. Compute

(a) fY (y) = ______ for ______ ≤ y ≤ ______ and zero otherwise.
(b) E (Y ) = ______.
(c) Var (Y ) = ______.

  3. Let S = Y1 + · · · + Y40 be a sum of independent identically distributed random variables Yi, each distributed exactly like the random variable Y found above. We use a famous theorem to approximate P (S > 16).

(a) The approximating random variable is ______ (choose between Bernoulli, binomial, Poisson, geometric, uniform, exponential, normal).
(b) The approximation is ______.

The domain of the density is bounded by the lines y = 0, y = x and y = 2 − x, and is thus a triangle. For 0 ≤ y ≤ 1 we have

fY (y) = (3/10) ∫_{x=y}^{2−y} (y + 3x) dx = (3/5)(3 − 2y − y²).

E (Y ) = (3/5) ∫₀¹ y (3 − 2y − y²) dy = (3/5)(3/2 − 2/3 − 1/4) = 7/20,

E (Y²) = (3/5) ∫₀¹ y² (3 − 2y − y²) dy = (3/5)(1 − 1/2 − 1/5) = 9/50,

and Var (Y ) = 9/50 − 49/400 = 23/400 = 0.0575. By the CLT we can use a normal distribution to approximate the required probability, with E (S) = 40 · 7/20 = 14 and Var (S) = 40 · 23/400 = 23/10:

P (S > 16) = P (Y1 + · · · + Y40 > 16) = P ((S − 14)/√(23/10) > 2/√(23/10)) ≈ 1 − Φ(1.32) ≈ 0.0934.

Problem 4B: A tree has a 2 meter long branch. An apple grows on a random point X on the branch, and independently, a bird lands on a random point Y on the branch. The joint density function of X and Y is given by

fX,Y (x, y) = c · (xy + 2)   if 0 ≤ x, y ≤ 2,
              0              otherwise.

  1. Compute the following:
     (a) c = ______.
     (b) P (Y < X) = ______.
     (c) fY|X=1 (y) = ______ for ______ ≤ y ≤ ______ and zero otherwise.
     (d) P (Y > 1 | X = 1) = ______.

To compute the constant c we solve

1 = c ∫₀² ∫₀² (xy + 2) dx dy = c ∫₀² (2y + 4) dy = 12c,

so c = 1/12. We solve

P (Y < X) = (1/12) ∫₀² ∫₀ˣ (xy + 2) dy dx = (1/12) ∫₀² (x³/2 + 2x) dx = (1/12)(2 + 4) = 1/2.

fY|X=1 (y) = fX,Y (1, y) / fX (1) = c (y + 2) / (c ∫₀² (y + 2) dy) = (y + 2)/6   for 0 ≤ y ≤ 2.

P (Y > 1 | X = 1) = ∫₁² (y + 2)/6 dy = 7/12.

  2. In order to compute the expected value of the distance between the bird and the apple we set up a double integral, according to the following information:
     (a) The function inside this double integral (in terms of x and y) is ______.
     (b) If dy is the interior integral and dx the exterior integral, the integration limits are
        i. x from ______ to ______.
        ii. y from ______ to ______.

E (|X − Y |) = ∫₀² ∫₀² |x − y| · (xy + 2)/12 dy dx,

with x from 0 to 2 and y from 0 to 2.
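The constant, the probabilities and the final double integral can all be checked with a midpoint-rule grid over the square (Python, for illustration; the grid size is an arbitrary choice):

```python
# midpoint-rule check of Problem 4B on the square [0, 2] x [0, 2]
n = 400
h = 2.0 / n
mass = p_y_lt_x = e_abs = 0.0
for i in range(n):
    for j in range(n):
        x = (i + 0.5) * h
        y = (j + 0.5) * h
        w = (x * y + 2) / 12 * h * h     # f(x, y) dA with c = 1/12
        mass += w
        if y < x:
            p_y_lt_x += w
        e_abs += abs(x - y) * w          # integrand for E|X - Y|

# conditional density (y + 2)/6 on [0, 2]: P(Y > 1 | X = 1)
p_cond = sum(((j + 0.5) * h + 2) / 6 * h for j in range(n) if (j + 0.5) * h > 1)

print(mass, p_y_lt_x, p_cond, e_abs)  # ≈ 1, 1/2, 7/12, 0.62
```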

Problem 4C: A tree has a 2 meter long branch. An apple grows on a random point X on the branch, and independently, a bird lands on a random point Y on the branch. The joint density function of X and Y is given by

fX,Y (x, y) = c · (xy + 3)   if 0 ≤ x, y ≤ 2,
              0              otherwise.

  1. Compute the following:
     (a) c = ______.
     (b) P (Y < X) = ______.
     (c) fY|X=1 (y) = ______ for ______ ≤ y ≤ ______ and zero otherwise.
     (d) P (Y > 1 | X = 1) = ______.

To compute the constant c we solve

1 = c ∫₀² ∫₀² (xy + 3) dx dy = c ∫₀² (2y + 6) dy = 16c,

so c = 1/16. We solve

P (Y < X) = (1/16) ∫₀² ∫₀ˣ (xy + 3) dy dx = (1/16) ∫₀² (x³/2 + 3x) dx = (1/16)(2 + 6) = 1/2.

fY|X=1 (y) = fX,Y (1, y) / fX (1) = c (y + 3) / (c ∫₀² (y + 3) dy) = (y + 3)/8   for 0 ≤ y ≤ 2.

P (Y > 1 | X = 1) = ∫₁² (y + 3)/8 dy = 9/16.

  2. In order to compute the expected value of the distance between the bird and the apple we set up a double integral, according to the following information:
     (a) The function inside this double integral (in terms of x and y) is ______.
     (b) If dy is the interior integral and dx the exterior integral, the integration limits are
        i. x from ______ to ______.
        ii. y from ______ to ______.

E (|X − Y |) = ∫₀² ∫₀² |x − y| · (xy + 3)/16 dy dx,

with x from 0 to 2 and y from 0 to 2.

Problem 5B: Let X > 0 be a positive random variable, with E (X) = 8 and Var (X) = 2.

  1. Find the following:

(a) According to Markov P (X ≥ 16) ≤ ______.
(b) According to Chebyshev P (6 < X < 10) ≥ ______.

By Markov, P (X ≥ 16) ≤ E (X)/16 = 8/16 = 1/2.

By Chebyshev, P (6 < X < 10) = 1 − P (|X − 8| ≥ 2) ≥ 1 − Var (X)/2² = 1 − 2/4 = 1/2.

  2. Assume that Y is a random variable such that Y | X ∼ Exp(1/X), i.e. exponential with rate 1/X and mean X. We want to use the law of total expectation to find Cov (X, Y ).

(a) E (Y | X) = ______.
(b) E (Y ) = ______.
(c) E (XY ) = ______.
(d) Cov (X, Y ) = ______.

Since Y | X = x ∼ Exp(1/x), we have E (Y | X = x) = 1/(1/x) = x. Thus E (Y | X) = X. By LTE,

E (Y ) = E (E (Y | X)) = E (X) = 8.

To find Cov (X, Y ) we first compute E (XY ) using the same method:

E (XY ) = E (E (XY | X)) = E (X E (Y | X)) = E (X²) = Var (X) + (E (X))² = 2 + 8² = 66,

and so Cov (X, Y ) = E (XY ) − E (X) E (Y ) = 66 − 8 · 8 = 2.
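The identity Cov(X, Y) = Var(X) can be sanity-checked by simulation. The exam fixes only E(X) = 8 and Var(X) = 2, so the Gamma(32, 1/4) distribution below is a hypothetical stand-in chosen to match those two moments; any other choice with the same moments would do:

```python
import random

random.seed(1)
N = 200_000
sx = sy = sxy = 0.0
for _ in range(N):
    x = random.gammavariate(32, 0.25)  # mean 32*0.25 = 8, var 32*0.25^2 = 2
    y = random.expovariate(1 / x)      # Y | X = x ~ Exp(rate 1/x), mean x
    sx += x; sy += y; sxy += x * y
ex, ey, exy = sx / N, sy / N, sxy / N
cov = exy - ex * ey
print(ex, ey, exy, cov)                # ≈ 8, 8, 66, 2
```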

Problem 5C: Let X > 0 be a positive random variable, with E (X) = 12 and Var (X) = 3.

  1. Find the following:

(a) According to Markov P (X ≥ 16) ≤ ______.
(b) According to Chebyshev P (9 < X < 15) ≥ ______.

By Markov, P (X ≥ 16) ≤ E (X)/16 = 12/16 = 3/4.

By Chebyshev, P (9 < X < 15) = 1 − P (|X − 12| ≥ 3) ≥ 1 − Var (X)/3² = 1 − 3/9 = 2/3.

  2. Assume that Y is a random variable such that Y | X ∼ Exp(1/X), i.e. exponential with rate 1/X and mean X. We want to use the law of total expectation to find Cov (X, Y ).

(a) E (Y | X) = ______.
(b) E (Y ) = ______.
(c) E (XY ) = ______.
(d) Cov (X, Y ) = ______.

Since Y | X = x ∼ Exp(1/x), we have E (Y | X = x) = 1/(1/x) = x. Thus E (Y | X) = X. By LTE,

E (Y ) = E (E (Y | X)) = E (X) = 12.

To find Cov (X, Y ) we first compute E (XY ) using the same method:

E (XY ) = E (E (XY | X)) = E (X E (Y | X)) = E (X²) = Var (X) + (E (X))² = 3 + 12² = 147,

and so Cov (X, Y ) = E (XY ) − E (X) E (Y ) = 147 − 12 · 12 = 3.