Project 1.1: Probability
Due 2/7 at 11:59pm. Submit your assignment as a .pdf or .txt file. As always, you may submit one assignment per person, though we recommend that everyone work through the assignment with others and then write up solutions on their own. You may also hand in your HW in section.
Exercises from the book, 2 points each (6 points): 13.8, 14.3, 14.4
Question 1 (2 points).
Consider the following 3-sided dice with the given side values. Assume the dice are all fair
and all rolls are independent.
A: 2, 2, 5
B: 1, 4, 4
C: 3, 3, 3
- What is the expected value of each die?
- Consider the comparison function better(X, Y), which has value 1 if X > Y and
value -1 if X < Y. What are the expected values of better(A, B), better(B, C), and better(C, A)? Why
are these sometimes called non-transitive dice?
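Answers like these can be sanity-checked by brute-force enumeration, since every roll pair is equally likely. The sketch below (helper names are my own) computes each die's expected value and E[better(X, Y)] exactly using fractions; ties never occur between these particular dice, so the tie case is simply ignored.

```python
from fractions import Fraction
from itertools import product

# The three dice from the problem statement.
DICE = {"A": [2, 2, 5], "B": [1, 4, 4], "C": [3, 3, 3]}

def expected_value(die):
    """Expected value of a fair die: the average of its side values."""
    return Fraction(sum(die), len(die))

def expected_better(x, y):
    """E[better(X, Y)] over all equally likely independent roll pairs.
    (No ties occur for these dice, so the tie case is ignored.)"""
    total = sum((1 if rx > ry else -1) for rx, ry in product(x, y))
    return Fraction(total, len(x) * len(y))

for name, die in DICE.items():
    print(name, expected_value(die))

print(expected_better(DICE["A"], DICE["B"]))
print(expected_better(DICE["B"], DICE["C"]))
print(expected_better(DICE["C"], DICE["A"]))
```

If all three expectations of better come out positive, each die tends to beat the next one around the cycle, which is exactly the non-transitivity the question asks about.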
Question 2 (2 points). Assume that a joint distribution
over two variables, X = {x, ¬x} and Y = {y, ¬y}, is known to have the marginal distributions
P(x) = P(¬x) = P(y) = P(¬y) (i.e., all equal 0.5). Give a joint distribution satisfying these marginals for
each of these conditions:
- X and Y are independent
- Observing Y=y increases the belief in X=x, i.e. P(x | y) > P(x)
- Observing Y=y decreases the belief in X=x, i.e. P(x | y) < P(x)
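One way to check your candidate joints is a small verification script: given a joint over the four outcomes, confirm the uniform marginals and compare P(x | y) against P(x). The sketch below (the helper name `check` and the `"~x"` notation are my own) demonstrates it on the independent case, where every cell is 1/4:

```python
# A candidate joint is a dict {(x_value, y_value): probability} over the
# four outcomes of X = {x, ~x} and Y = {y, ~y}.
def check(joint):
    px = joint[("x", "y")] + joint[("x", "~y")]
    py = joint[("x", "y")] + joint[("~x", "y")]
    # Required marginals: P(x) = P(~x) and P(y) = P(~y), i.e. both equal 0.5.
    assert abs(px - 0.5) < 1e-12 and abs(py - 0.5) < 1e-12
    p_x_given_y = joint[("x", "y")] / py
    # > 0: observing y raises belief in x; < 0: lowers it; 0: independent.
    return p_x_given_y - px

# The independent case: every cell is 1/4.
independent = {("x", "y"): 0.25, ("x", "~y"): 0.25,
               ("~x", "y"): 0.25, ("~x", "~y"): 0.25}
print(check(independent))  # 0.0
```

For the other two cases, shift probability mass between the diagonal cells (x, y)/(¬x, ¬y) and the off-diagonal cells while keeping row and column sums at 0.5.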
Question 3 (2 points). On a day when an assignment
is due (A=a), the newsgroup tends to be busy (B=b), and the computer lab tends
to be full (C=c). Consider the following conditional probability tables
for the domain, where A = {a, ¬a}, B = {b, ¬b}, C = {c, ¬c}.
P(A)

P(B|A)
  B     A     P
  b     a     0.90
  ¬b    a     0.10
  b     ¬a    0.40
  ¬b    ¬a    0.60

P(C|A)
  C     A     P
  c     a     0.70
  ¬c    a     0.30
  c     ¬a    0.50
  ¬c    ¬a    0.50
- Construct the joint distribution from these conditional probability
tables, assuming B and C are independent given A.
- What is the marginal distribution P(B,C)? Are these two variables
absolutely independent in this model? Justify your answer using the
actual probabilities, not your intuitions.
- What is the posterior distribution over A given that B=b, P(A |
B=b)? What is the posterior distribution over A given that C=c, P(A |
C=c)? What about P(A | B=b, C=c)? Explain the pattern among
these posteriors and why it holds.
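The mechanics of these computations can be sketched in code. The P(A) table's values are not reproduced above, so the prior p_a is left as a free parameter below; the helper names are my own, and the demo value p_a = 0.5 is an assumed placeholder, not a number from the handout.

```python
# CPTs from the tables above, with "~" marking negation.
P_B_GIVEN_A = {("b", "a"): 0.90, ("~b", "a"): 0.10,
               ("b", "~a"): 0.40, ("~b", "~a"): 0.60}
P_C_GIVEN_A = {("c", "a"): 0.70, ("~c", "a"): 0.30,
               ("c", "~a"): 0.50, ("~c", "~a"): 0.50}

def joint(p_a):
    """Joint P(A,B,C) = P(A) P(B|A) P(C|A), using B _|_ C | A."""
    pa = {"a": p_a, "~a": 1 - p_a}
    return {(A, B, C): pa[A] * P_B_GIVEN_A[(B, A)] * P_C_GIVEN_A[(C, A)]
            for A in ("a", "~a") for B in ("b", "~b") for C in ("c", "~c")}

def posterior_a(p_a, B=None, C=None):
    """P(A=a | evidence), summing the joint over unobserved variables."""
    j = joint(p_a)
    match = lambda A: sum(p for (a, b, c), p in j.items()
                          if a == A and (B is None or b == B)
                          and (C is None or c == C))
    return match("a") / (match("a") + match("~a"))

# Placeholder prior for illustration only.
print(posterior_a(0.5, B="b"))
print(posterior_a(0.5, B="b", C="c"))
```

Comparing the posteriors with one piece of evidence versus two should make the pattern the question asks about visible directly.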
Question 4 (2 points). Sometimes, there is traffic
(cars) on the freeway (C=c). This could either be because of a ball game (B=b) or
because of an accident (A=a). Consider the following joint probability
table for the domain, where A = {a, ¬a}, B = {b, ¬b}, C = {c, ¬c}.
P(A, B, C)
  A     B     C     P
  a     b     c     0.018
  a     b     ¬c    0.002
  a     ¬b    c     0.126
  a     ¬b    ¬c    0.054
  ¬a    b     c     0.064
  ¬a    b     ¬c    0.016
  ¬a    ¬b    c     0.072
  ¬a    ¬b    ¬c    0.648
- What is the distribution P(A,B)? Are A and B independent in this
model given no evidence? Justify your answer using the actual
probabilities, not your intuitions.
- What is the marginal distribution over A given no evidence?
- How does this change if we observe that C=c; what is the posterior
distribution P(A | C=c)? Does this change intuitively make
sense? Why or why not?
- What is the conditional distribution over A if we then learn there is a
ball game, P(A | B=b, C=c)? Does it make sense that observing B should
cause this update to A (called explaining-away)? Why or why not?
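All of these queries come down to summing and renormalizing rows of the joint table. A minimal sketch (helper names and the `"~"` negation notation are my own):

```python
# The joint table from the problem, keyed (A, B, C); "~" marks negation.
JOINT = {("a", "b", "c"): 0.018, ("a", "b", "~c"): 0.002,
         ("a", "~b", "c"): 0.126, ("a", "~b", "~c"): 0.054,
         ("~a", "b", "c"): 0.064, ("~a", "b", "~c"): 0.016,
         ("~a", "~b", "c"): 0.072, ("~a", "~b", "~c"): 0.648}

def prob(**ev):
    """Marginal probability of the evidence, summing out unobserved variables."""
    keys = ("A", "B", "C")
    return sum(p for vals, p in JOINT.items()
               if all(ev.get(k, v) == v for k, v in zip(keys, vals)))

p_a = prob(A="a")                                          # prior P(a)
p_a_given_c = prob(A="a", C="c") / prob(C="c")             # P(a | c)
p_a_given_bc = prob(A="a", B="b", C="c") / prob(B="b", C="c")  # P(a | b, c)
print(p_a, p_a_given_c, p_a_given_bc)
```

Comparing the three printed values shows the explaining-away pattern the question asks about: evidence of traffic raises the belief in an accident, and then learning about the ball game pulls it back down.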
Question 5 (2 points). Often we need to carry out
reasoning over some pair of variables X, Y conditioned on the value of a third
variable E.
- Using the definition of conditional probability, prove the
conditionalized version of the product rule: P(x, y | e) = P(x | y, e) P(y |
e)
- Prove the conditionalized version of Bayes' rule: P(y | x, e) = P(x | y,
e) P(y | e) / P(x | e)
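A numeric sanity check is not a substitute for the proof, but it can catch a mistaken derivation. The sketch below (helper names are my own) generates a random joint distribution over three binary variables and confirms that both conditionalized identities hold up to floating-point error:

```python
import random

# Random normalized joint over binary x, y, e.
random.seed(0)
outcomes = [(x, y, e) for x in (0, 1) for y in (0, 1) for e in (0, 1)]
w = [random.random() for _ in outcomes]
P = {o: wi / sum(w) for o, wi in zip(outcomes, w)}

def p(**ev):
    """Marginal of the evidence, e.g. p(x=1, e=0), summing out the rest."""
    return sum(q for (x, y, e), q in P.items()
               if all(ev.get(k, v) == v for k, v in zip("xye", (x, y, e))))

for x in (0, 1):
    for y in (0, 1):
        for e in (0, 1):
            pxy_e = p(x=x, y=y, e=e) / p(e=e)       # P(x, y | e)
            px_ye = p(x=x, y=y, e=e) / p(y=y, e=e)  # P(x | y, e)
            py_e = p(y=y, e=e) / p(e=e)             # P(y | e)
            px_e = p(x=x, e=e) / p(e=e)             # P(x | e)
            py_xe = p(x=x, y=y, e=e) / p(x=x, e=e)  # P(y | x, e)
            assert abs(pxy_e - px_ye * py_e) < 1e-12          # product rule
            assert abs(py_xe - px_ye * py_e / px_e) < 1e-12   # Bayes' rule
print("both identities hold on this random joint")
```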
Question 6 (2 points). Suppose we wish to calculate
P(C=c | A=a, B=b).
- If we have no conditional independence information, which of the following
sets of tables are sufficient to calculate P(C=c | A=a, B=b)?
  (i) P(A, B), P(C), P(A | C), P(B | C)
  (ii) P(A, B), P(C), P(A, B | C)
  (iii) P(A, B, C)
  (iv) P(C), P(A | C), P(B | C)
  (v) P(C | A, B), P(A)
- Which are sufficient if we know that A and B are conditionally independent
given C?
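One relevant computation worth being able to reproduce: when A and B are conditionally independent given C, Bayes' rule gives P(c | a, b) proportional to P(a | c) P(b | c) P(c). The sketch below cross-checks that formula against the full joint built from the same CPTs; all of the numbers are made up for illustration, not taken from the assignment.

```python
# Hypothetical CPTs for a model in which A and B are independent given C.
P_C = {"c": 0.3, "~c": 0.7}
P_A_GIVEN_C = {("a", "c"): 0.8, ("a", "~c"): 0.2}   # P(~a|.) = 1 - P(a|.)
P_B_GIVEN_C = {("b", "c"): 0.6, ("b", "~c"): 0.1}

def p_c_given_ab():
    """P(c | a, b) via Bayes' rule, normalizing P(a|C) P(b|C) P(C) over C."""
    num = P_A_GIVEN_C[("a", "c")] * P_B_GIVEN_C[("b", "c")] * P_C["c"]
    den = num + (P_A_GIVEN_C[("a", "~c")] * P_B_GIVEN_C[("b", "~c")]
                 * P_C["~c"])
    return num / den

# Cross-check against the joint P(a, b, C) built from the same assumptions.
joint_abc = P_A_GIVEN_C[("a", "c")] * P_B_GIVEN_C[("b", "c")] * P_C["c"]
joint_ab = joint_abc + (P_A_GIVEN_C[("a", "~c")] * P_B_GIVEN_C[("b", "~c")]
                        * P_C["~c"])
assert abs(p_c_given_ab() - joint_abc / joint_ab) < 1e-12
print(p_c_given_ab())
```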