Conditional probability for 3 events

Screen Link:

I’ve mostly followed the Probability course up to this point, but can’t wrap my head around screens 7 and 8 of Conditional Probability: Intermediate.

I’m fine up until it says “However, multiplying P(A), P(B), P(C) together gives us a different result:”
P(A) ⋅ P(B) ⋅ P(C) = 1/2 ⋅ 1/2 ⋅ 1/2 = 1/8

P(A ∩ B ∩ C) = 1/4 ≠ P(A) ⋅ P(B) ⋅ P(C)

The math makes sense (I think), but I don’t understand how this is the case, or how these events are dependent. Maybe I don’t fully understand the events and how they interact. The example seems a little convoluted to me. Can someone explain?

Hi Patrick,

This example is meant to illustrate this principle: a collection of events being pairwise independent does not imply that it is an independent collection of events.

You’ve said you’re OK with the calculations themselves, so I’ll focus on the intuition here. The intuition behind independence is that two events are independent when knowing that one has (or hasn’t) happened conveys no information about whether the other has happened.

  • Events A and B are independent because if we know that A has occurred (and we know nothing else), this gives us no useful information about whether B has occurred. (Recall that A indicates that the first coin was heads, and B indicates that the second coin was heads, so this should make sense.)
  • Events A and C are independent as well. If we know that the first coin is heads, this gives us no information about whether C has occurred (i.e. that the two coins match).
  • Similarly, B and C are independent; if you tell me that the second coin is heads, I still don’t know whether the two coins match.

However, the group is not an independent collection of events. If you tell me that both A and B are true – that is, that the first coin is heads, and so is the second – then that does give me information about whether C is true, which it would necessarily be in this example.

What it would mean to have an independent collection of events A, B, C is that you could give me any information about one or two of those events, and I still would have no usable knowledge about the other(s). The canonical example of an independent collection of three events would be flipping three coins and letting A, B, and C keep track of whether the first, second, and third coins were heads (respectively). Note that it would be a bit more tedious to write down these events explicitly (meaning, write them as subsets of the sample space), since the sample space is larger in this example.
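To make this concrete, here is a short Python sketch (my own illustration, not from the course) that enumerates the four equally likely outcomes of the two-coin experiment and checks both the pairwise products and the triple product:

```python
from fractions import Fraction

# Sample space: two fair coin flips, each outcome equally likely.
outcomes = ["HH", "HT", "TH", "TT"]

A = {o for o in outcomes if o[0] == "H"}   # first coin is heads
B = {o for o in outcomes if o[1] == "H"}   # second coin is heads
C = {o for o in outcomes if o[0] == o[1]}  # the two coins match

def P(event):
    """Probability of an event under the uniform distribution."""
    return Fraction(len(event), len(outcomes))

# Pairwise independence holds: P(X ∩ Y) == P(X) * P(Y) for every pair.
assert P(A & B) == P(A) * P(B)  # 1/4 == 1/2 * 1/2
assert P(A & C) == P(A) * P(C)  # 1/4 == 1/2 * 1/2
assert P(B & C) == P(B) * P(C)  # 1/4 == 1/2 * 1/2

# But mutual independence fails: A ∩ B ∩ C = {HH}, so its
# probability is 1/4, not the product 1/8.
print(P(A & B & C))        # 1/4
print(P(A) * P(B) * P(C))  # 1/8
```

Note that A ∩ B ∩ C is just {HH}, the same single outcome as each pairwise intersection; that coincidence is exactly why the triple product comes out wrong.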

Thanks, @amontgom . That was the missing piece I needed. I wasn’t considering that if you knew 2 it gave you information about the 3rd. CLICK!


Hi and thanks for your answer @amontgom. Could you please tell me where I’m wrong ?

Events A and B are independent because if we know that A has occurred (and we know nothing else), this gives us no useful information about whether B has occurred. (Recall that A indicates that the first coin was heads, and B indicates that the second coin was heads, so this should make sense.)

If we know that event A has occurred, then we know that the result of the experiment is in {HH, HT}.
B’s event set is {HH, TH}.

Then if we know that A has occurred, we can remove TH from B’s event set, since it’s not in A, right?
So P(B|A) is not the same as P(B) in terms of “outcome space”?

Is it because P(B) is 2/4 in the original space and P(B|A) is 1/2 (because we reduce the space to A’s outcomes), and 1/2 = 2/4? Do we only consider the resulting probabilities to define independence?

thx

Exactly right. While you’re correct that the sample space changes when conditioning on A, the important part is whether the probabilities change. Specifically, the definition of independence of A and B is (or rather, is equivalent to) the equation \mathbb P(B \mid A) = \mathbb P(B). The probabilities are indeed the only thing that matter for independence.
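For what it’s worth, that check can be written out numerically as well (a sketch of my own, using the same four-outcome sample space):

```python
from fractions import Fraction

outcomes = ["HH", "HT", "TH", "TT"]
A = {"HH", "HT"}  # first coin heads
B = {"HH", "TH"}  # second coin heads

P_B = Fraction(len(B), len(outcomes))       # 2/4 = 1/2 in the original space
# Conditioning on A shrinks the space to A's two outcomes,
# only one of which (HH) is in B.
P_B_given_A = Fraction(len(A & B), len(A))  # 1/2

# Independence: conditioning on A leaves B's probability unchanged.
assert P_B_given_A == P_B
```

The outcome spaces differ, but the ratio is the same, and that ratio is all the definition of independence cares about.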
