MATH-UA 233/ MA-UY 3014
Theory of Probability/ Applied Probability
Summer 2024
Final: Sample Problems
The final will likely have 6-8 questions. The questions marked with ∗ are considered more difficult than the rest.
1. Find n, m ∈ N for which
2. Find n ∈ N, n ≥ 119, for which
3. For n ∈ N and x, y, z ∈ R, z > 0, find a closed formula for
Hint: Show the product of the first two factors is a binomial coefficient.
4. Suppose x, y > 0, with
Find x, y assuming that x − y = 6.
5. Suppose X is a random variable with E[X^2] < ∞. Find a ∈ R minimizing
E[(X − 2a)^2 + (1 − aX)^2].
That is, find a such that for all b ∈ R,
E[(X − 2b)^2 + (1 − bX)^2] ≥ E[(X − 2a)^2 + (1 − aX)^2].
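A quick numerical sanity check of Problem 5 (a sketch only): the law of X is left unspecified, so the snippet below assumes, purely for illustration, that X is uniform on {1, 2, 3} and locates the minimizing a by a grid search over Monte Carlo estimates of the objective.

```python
import numpy as np

# Numerical sanity check for Problem 5 under an ASSUMED law for X
# (X uniform on {1, 2, 3}); the problem itself leaves X general.
rng = np.random.default_rng(0)
X = rng.integers(1, 4, size=100_000).astype(float)

def objective(a):
    # Monte Carlo estimate of E[(X - 2a)^2 + (1 - aX)^2]
    return np.mean((X - 2 * a) ** 2 + (1 - a * X) ** 2)

grid = np.linspace(-2.0, 2.0, 801)
a_hat = grid[np.argmin([objective(a) for a in grid])]
print("approximate minimizer a:", a_hat)
```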
6. Find r ∈ R for which there exists a probability P on the sample space S = Z satisfying
P({n}) = 10 · r^{3n}, P({−n}) = 2^{−n−4},
for all n ∈ N.
7. Independent events A, B, C satisfy P(A) = 1/2, P(A ∪ B ∪ C) = 13/18, P(B) + 2P(C) = 29/108. Find P(A ∩ B ∩ C^c).
8. A roulette wheel is spun 300 times; the outcomes (the spins are independent) are integers between 1 and 400 (inclusive) and are denoted by S_k, 1 ≤ k ≤ 300. Let E be the event that S_{2k−1} + S_{2k} = 100 + k for 1 ≤ k ≤ 100, and F the event that S_{3k−1} − S_{3k} = 100 + k for 1 ≤ k ≤ 100. Which event has larger probability, E or F?
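For Problem 8, each value of k constrains a disjoint pair of spins, so both probabilities factor into products over k. The sketch below compares log P(E) and log P(F) exactly, under the additional assumption (not stated in the problem) that each spin is uniform on {1, ..., 400}.

```python
import math

# Exact comparison for Problem 8, ASSUMING each spin is uniform on
# {1, ..., 400}; spins are independent, and each k constrains a disjoint
# pair of spins, so the probabilities multiply over k.
N = 400

def log_prob(count_for_k):
    return sum(math.log(count_for_k(k)) - 2 * math.log(N) for k in range(1, 101))

# E: S_{2k-1} + S_{2k} = 100 + k  ->  pairs (a, b) in {1,...,N}^2 with a + b = 100 + k
count_sum = lambda k: sum(1 for a in range(1, N + 1) if 1 <= (100 + k) - a <= N)
# F: S_{3k-1} - S_{3k} = 100 + k  ->  pairs (a, b) with a - b = 100 + k
count_diff = lambda k: sum(1 for a in range(1, N + 1) if 1 <= a - (100 + k) <= N)

print("log P(E) =", log_prob(count_sum))
print("log P(F) =", log_prob(count_diff))
```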
9(∗). Suppose X, Y are independent random variables with X ≥ 0, X ∼ Y, and E[X] = 1. Show
E[e^{XY}] ≥ E[e^X].
10. Suppose X has moment generating function
Find the moment generating function of X^2 (a power series decomposition suffices).
11. A random variable X is called symmetric if X ∼ −X.
(a) Suppose X is discrete: find the necessary and sufficient conditions for X to be symmetric in terms of its probability mass distribution.
(b) Find all discrete random variables X taking values in Z for which both X and X^2 − 9 are symmetric.
(c) Show there is no random variable X (not necessarily discrete) such that X^2 − 4X + 5 is symmetric.
12. Suppose Y = X^3 + 4, and X is an exponential random variable with parameter 2; i.e.,
p(x) = 2e^{−2x}, x > 0,
is the pdf of X. Compute the joint cumulative distribution function of X and Y.
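For Problem 12, the joint CDF can be checked pointwise by simulation. The sketch below estimates P(X ≤ x, Y ≤ y) at a few arbitrarily chosen points; compare the output with the formula you derive.

```python
import numpy as np

# Monte Carlo illustration for Problem 12: estimate the joint CDF
# P(X <= x, Y <= y) at a few (arbitrarily chosen) points, with
# X ~ Exp(rate 2) and Y = X**3 + 4.
rng = np.random.default_rng(1)
X = rng.exponential(scale=0.5, size=500_000)   # rate 2  <=>  scale 1/2
Y = X ** 3 + 4

for x, y in [(0.5, 4.5), (1.0, 5.0), (2.0, 20.0)]:
    print(f"P(X <= {x}, Y <= {y}) ~ {np.mean((X <= x) & (Y <= y)):.4f}")
```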
13(∗). Suppose X and Y = X^3 − 4X are independent with E[X] = 0, E[X^4] < ∞. Show E[X^2] ≤ 4, and E[X^4] ≤ 16.
14. Suppose X > 0, Y = log X, and
Find Var(X) and Corr(X, X^2).
15(∗). Suppose X takes finitely many values, and
Find the probability mass distribution of X.
16. Suppose X > 0, and
Show P(X ≤ 100 + ϵ) = 1 for any fixed ϵ > 0, while P(X ≥ 12) < 1.
17. Find an, bn ∈ R such that
18. Give an example of a sequence Xn with Var(Xn) = 1, E[Xn] = 0, |Corr(Xi, Xj)| ≤ 1/2 for all i ≠ j, for which
does not hold.
Hint: Consider Xk = αkX + Yk for carefully chosen X, αk, Yk.
19(∗). Let Xn be a sequence of independent random variables with P(Xn = 1) = P(Xn = n) = 1/2. Prove
does not hold.
Hint: For a given N, compute P(Xk < k, ∀k ≥ N).
20. Suppose X, Y have joint probability density
Show X and Y are not independent.
21. Suppose f, g : R → R are nondecreasing functions, and X, Y are independent real-valued random variables with the same cumulative distribution function.
(a) Show
(f(X) − f(Y ))(g(X) − g(Y )) ≥ 0.
(b) Show
Cov(f(X), g(X)) ≥ 0.
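An empirical illustration of Problem 21 (not a proof): with the assumed choices f(x) = x^3, g(x) = arctan(x) and X, Y i.i.d. standard normal (none of which are fixed by the problem), the product in (a) is never negative and the sample covariance in (b) is nonnegative.

```python
import numpy as np

# Illustration of Problem 21 with ASSUMED nondecreasing f, g and an
# ASSUMED common distribution for X, Y (i.i.d. standard normal).
rng = np.random.default_rng(2)
X = rng.standard_normal(200_000)
Y = rng.standard_normal(200_000)

f = lambda t: t ** 3    # nondecreasing
g = np.arctan           # nondecreasing

prod = (f(X) - f(Y)) * (g(X) - g(Y))
print("min over samples of (f(X)-f(Y))(g(X)-g(Y)):", prod.min())   # part (a): >= 0
print("sample Cov(f(X), g(X)):", np.cov(f(X), g(X))[0, 1])         # part (b): >= 0
```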
22. Suppose X, Y, Z are random variables and
Var(X) = Var(Y) = Var(Z) = 1.
(a) By using Var(aX + bY + cZ) ≥ 0, show
6Cov(X, Y) + 15Cov(Y, Z) + 10Cov(Z, X) ≥ −19.
(b) Assume equality occurs in (a). Compute
Corr(6X + 3Y + 5Z, 3X).
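A randomized check of the inequality in 22(a): any correlation structure of three unit-variance random variables is the Gram matrix of three unit vectors in R^3, so evaluating the left-hand side on random unit vectors should never produce a value below −19.

```python
import numpy as np

# Randomized check of Problem 22(a): correlations of three unit-variance
# variables are dot products of three unit vectors in R^3.
rng = np.random.default_rng(3)
best = np.inf
for _ in range(100_000):
    u, v, w = rng.standard_normal((3, 3))
    u, v, w = (t / np.linalg.norm(t) for t in (u, v, w))
    best = min(best, 6 * u @ v + 15 * v @ w + 10 * w @ u)
print("smallest value found:", best, "(the bound says >= -19)")
```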
23. Suppose X is a nonnegative random variable with
E[X^k] ≤ 2^k k^3
for all k ∈ N, and fix ϵ > 0.
(a) Show that for any increasing function h : [0, ∞) → R and a > 0,
P(X ≥ a) = P(h(X) ≥ h(a)).
(b) Using Markov’s inequality and part (a), show
P(X < 2 + ϵ) = 1.
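For 23(b), the quantity 2^k k^3 / (2 + ϵ)^k that arises from combining part (a) with Markov's inequality tends to 0 as k → ∞; the short sketch below (with an assumed ϵ = 0.1) makes the decay concrete.

```python
# Decay of the bound appearing in Problem 23(b), for an ASSUMED eps = 0.1.
eps = 0.1
for k in (10, 100, 500, 1000, 2000):
    print(f"k = {k:4d}:  2^k k^3 / (2+eps)^k = {(2 / (2 + eps)) ** k * k ** 3:.3e}")
```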
24. Suppose X1, X2, ... , Xn are random variables with
Cov(Xi, Xj + Xi) = 3i + j, 1 ≤ i ≤ j ≤ n.
(a) Find Cov(Xi, Xj) for all 1 ≤ i ≤ j ≤ n.
(b) Suppose X1, X2, ... , Xn have mean 1. Show that for any a > 0,
25. The joint density of X and Y is given by
(a) Compute the joint moment generating function of X and Y.
(b) Compute the individual moment generating functions of X and Y.
26. The moment generating functions of X and Y are
Suppose X, Y are independent. Find
E[(X + Y)^2].
27. Suppose the moment generating function of X is
MX(t) = e^{(t+1)^3 − 2t + 8 − a}.
(a) Find a.
(b) Suppose Y is independent from X, and X + Y is a normal random variable with mean 0 and variance 2. Find the moment generating function of Y.
28. Suppose X1, X2, ... , Xn are i.i.d. random variables with probability density function
Find some a, b ∈ R for which
29. Suppose X1, X2, ... , Xn, ... are random variables with mean 0, and
for all i, j ≥ 1. Prove
30. A fair die is rolled 100 times and Ri is the value of the top face of the i-th roll. Compute an approximation of
P(R1R2...R100 ≥ a^100)
for a > 1 using the central limit theorem.
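A Monte Carlo reference value for Problem 30, computed for the assumed choice a = 3 (any a > 1 works); compare it with your CLT-based approximation.

```python
import numpy as np

# Monte Carlo reference for Problem 30 with an ASSUMED value a = 3; the
# product of 100 die rolls fits comfortably in double precision (max 6^100).
rng = np.random.default_rng(4)
a = 3.0
rolls = rng.integers(1, 7, size=(50_000, 100)).astype(float)
products = rolls.prod(axis=1)
print("MC estimate of P(R1...R100 >= a^100):", np.mean(products >= a ** 100))
```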
31. Let Ri be the value of the top face of the i-th roll of a fair die. For n ≥ 100, let
Xn = R1R2...R100 + R2R3...R101 + ... + R_{n−99}R_{n−98}...R_n.
(a) Find E[Xn].
(b) Show
Hint: Chebyshev’s inequality.
32. Find a, b ∈ R for which there is a random variable X with probability density function f given by
f(x) = ax + b, x ∈ [0, 2],
and
33. Find a, b ∈ R for which there exist probability density functions f, g given by
f(x) = ax + b, x ∈ [0, 2],
g(x) = 2bx + 2a, x ∈ [0, 1].
34. Suppose X has probability density function
Find a ∈ R with
35. Suppose X, Y are continuous random variables with probability densities f, g, and
Find
E[3Y + 8].
36. Suppose X, Y have joint probability density function
f(x, y) = x + y, 0 ≤ x, y ≤ 1.
Find E[XY ].
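A numerical cross-check for Problem 36, useful for verifying the value of E[XY] you compute by hand.

```python
from scipy.integrate import dblquad

# Numerical cross-check for Problem 36: E[XY] is the integral of x*y*(x+y)
# over [0, 1]^2.  dblquad integrates func(y, x), with x as the outer variable.
val, _ = dblquad(lambda y, x: x * y * (x + y), 0, 1, 0, 1)
print("E[XY] ~", val)
```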
37. Suppose X, Y have joint probability density function
f(x, y) = cye^{−x} + e^{−2x}, 0 ≤ y ≤ 1, x > 0.
Find fX, fY, the marginal probability densities of X, Y.
38(∗). Suppose X is a continuous random variable with probability density
f(x) = e^{−a(x−t)^2}, x ∈ R.
Find t ∈ R that minimizes Var((t + 1)X + 1) + Var(tX); i.e., compute t0 ∈ R such that
Var((s + 1)Y + 1) + Var(sY) ≥ Var((t0 + 1)X + 1) + Var(t0X)
for all s ∈ R, with Y having probability density
g(x) = e^{−a(x−s)^2}, x ∈ R.
39. Suppose X, Y have joint probability density function
Show X, Y are not independent.
40. Suppose X, Y are independent random variables with variance 1. Find a ∈ R such that
41. Show there are no independent random variables X, Y, Z with
E[XY] = 10, E[YZ^2] + Cov(X, 2Z + 3) = −2, E[Z^2X] + Cov(X + 2Y, 2Z + 3) = 10.
42. Suppose X, Y are jointly continuous random variables with joint probability density function
f(x, y) = e^{−x−2y} + e^{−y/4−8x}, x, y > 0.
Find a with f_{X|Y=a}(0) = 2.
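A consistency check for Problem 42 (it does not touch the question itself): the stated joint density should integrate to 1 over x, y > 0.

```python
import numpy as np
from scipy.integrate import dblquad

# Consistency check for Problem 42: the joint density should integrate to 1
# over x, y > 0.  dblquad integrates func(y, x), x being the outer variable.
def f(y, x):
    return np.exp(-x - 2 * y) + np.exp(-y / 4 - 8 * x)

total, _ = dblquad(f, 0, np.inf, 0, np.inf)
print("integral of f over x, y > 0:", total)
```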
43. Suppose X, Y are independent, and so are X and X + Y. Show E[X^2] = (E[X])^2, and find
Corr(X, X^2 + 4X − 2).
44. Suppose X, Y have Cov(X, 2Y) = 10, Var(X) = 2, Var(X + Y) = 18. Find Corr(X − 10, 2Y + 4).
45. Find a ∈ R such that the random variable X with moment generating function
has E[X^2] = 5.
46. Suppose Xn are i.i.d. with X1 ∼ X, E[X] = 1, E[X^2] = 2. Show
47. Suppose X, Y are random variables with
Show X, Y are normal random variables, and find their means and variances.
48. Suppose Xn are i.i.d. with X1 ∼ N(0, 1), and let an be a sequence of real numbers with ∑_{n≥1} a_n^2 = 6. Find c ∈ R with
c(a1X1 + ... + anXn) ⇒ N(0, 54).
Hint: The sums above are normal random variables.
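A simulation supporting the hint in Problem 48: for any fixed n, a1X1 + ... + anXn is normal with variance a1^2 + ... + an^2. The sequence an below is one assumed choice whose squares sum to (essentially) 6; the problem fixes only that sum.

```python
import numpy as np
from scipy.stats import norm

# Simulation for the hint in Problem 48: a1*X1 + ... + an*Xn is normal with
# variance a1^2 + ... + an^2.  The sequence a below is an ASSUMED choice
# with sum of squares (essentially) equal to 6.
rng = np.random.default_rng(6)
n = 60
a = np.sqrt(3.0) * 0.5 ** ((np.arange(1, n + 1) - 1) / 2)   # sum of a_k^2 ~= 6
samples = rng.standard_normal((100_000, n)) @ a

target_var = np.sum(a ** 2)
print("sample variance:", samples.var(), " target:", target_var)
for q in (0.1, 0.5, 0.9):
    print(f"quantile {q}: sample {np.quantile(samples, q):.3f}, "
          f"N(0, {target_var:.3f}) gives {norm.ppf(q, scale=np.sqrt(target_var)):.3f}")
```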
49. Suppose X, Y are random variables with
Find c ∈ R such that
for Xn i.i.d. and Yn i.i.d. with X1 ∼ X, Y1 ∼ Y.
50. Suppose Xn are i.i.d. with X1 ∼ X, Var(X) = 2, E[X^4] = 100. Find a > 0 such that
51. Suppose X is a standard normal random variable and Y = a + bX + cX^2. Show
52. Suppose X, Y are independent uniform random variables on [0, 1] and [3, 4], respectively.
(a) Find
Cov(X + Y, X − 5).
(b) Find a, b ∈ R satisfying