Download Chapter 2 Homework (chapter2-homework.docx)
Download Chapter 2 Homework Set 1 Solution (chapter2-homework-solution1.html)
Download Chapter 2 Homework Set 2 Solution (chapter2-homework-solution1.html)
The union A ∪ B means A occurs, or B occurs, or both occur.
The intersection A ∩ B means both A and B occur together.
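These two set operations can be sketched with Python's built-in sets; the die-roll sample space and events below are illustrative assumptions, not from the homework:

```python
# Hypothetical sample space for one roll of a fair die
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # event "roll is even"
B = {4, 5, 6}   # event "roll is greater than 3"

union = A | B          # A ∪ B: A occurs, or B occurs, or both
intersection = A & B   # A ∩ B: both A and B occur

print(union)         # {2, 4, 5, 6}
print(intersection)  # {4, 6}
```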
Always 1, because some outcome must happen in every trial.
It is the set of all outcomes in the sample space that are not in A, written as A′.
If A and B are independent, then P(A ∩ B) = P(A) × P(B).
The probability of A given that B has occurred: P(A|B) = P(A ∩ B) / P(B).
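A minimal numeric sketch of this formula, using exact fractions; the die-roll events and numbers are illustrative assumptions:

```python
from fractions import Fraction

# Hypothetical die-roll events: A = "even", B = "greater than 3"
p_B = Fraction(3, 6)        # B = {4, 5, 6}
p_A_and_B = Fraction(2, 6)  # A ∩ B = {4, 6}

# Conditional probability: P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)  # 2/3
```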
It allows us to update the probability of a cause (event) when new evidence is observed.
A permutation counts ordered selections; a combination counts unordered selections.
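The distinction is easy to see with Python's standard-library counting functions; choosing 3 items out of 5 here is an illustrative example, not a homework problem:

```python
from math import perm, comb

# Ordered selections of 3 items from 5: 5 * 4 * 3
print(perm(5, 3))  # 60

# Unordered selections of 3 items from 5: 60 / 3!
print(comb(5, 3))  # 10
```

Each unordered selection of 3 items can be arranged in 3! = 6 orders, which is why the permutation count is 6 times the combination count.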
P(at least one head in two tosses) = 1 − P(no heads) = 1 − (1/2 × 1/2) = 3/4.
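The complement calculation can be checked by enumerating all four equally likely outcomes directly (a quick sketch, not part of the solution set):

```python
from itertools import product

# All 4 equally likely outcomes of two fair coin tosses
outcomes = list(product("HT", repeat=2))

# Count outcomes with at least one head
at_least_one_head = [o for o in outcomes if "H" in o]
print(len(at_least_one_head) / len(outcomes))  # 0.75
```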
They help model uncertainty in real systems, reliability analysis, and risk management.
Independent events: P(A∩B) = P(A)P(B) (equivalently, P(A|B) = P(A) and P(B|A) = P(B)). Independent events often do overlap.
Mutually exclusive events: P(A∩B) = 0. If P(A) > 0 and P(B) > 0, they are automatically not independent, because P(A)P(B) > 0 ≠ 0.
Examples: One coin toss, A: “Heads”, B: “Tails” ⇒ mutually exclusive (and dependent). Two coins, A: “1st is Heads”, B: “2nd is Heads” ⇒ independent.
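Both coin examples above can be verified by enumeration; this sketch uses exact fractions to avoid floating-point noise:

```python
from fractions import Fraction
from itertools import product

# Two fair coins: A = "1st is Heads", B = "2nd is Heads"
outcomes = list(product("HT", repeat=2))

def prob(event):
    """Probability of an event over equally likely outcomes."""
    hits = sum(1 for o in outcomes if event(o))
    return Fraction(hits, len(outcomes))

p_A = prob(lambda o: o[0] == "H")       # 1/2
p_B = prob(lambda o: o[1] == "H")       # 1/2
p_AB = prob(lambda o: o == ("H", "H"))  # 1/4
# Independent (product rule holds), yet they overlap: p_AB > 0
print(p_AB == p_A * p_B)  # True

# One coin toss: A = "Heads" and B = "Tails" can never occur together
one_toss = ["H", "T"]
p_both = Fraction(sum(1 for o in one_toss if o == "H" and o == "T"),
                  len(one_toss))
print(p_both)  # 0  -> mutually exclusive
```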
What it does: Bayes’ theorem converts likelihoods into an updated belief (posterior): P(A|B) = [P(B|A)·P(A)] / P(B). It “turns around” conditional probability to answer “How likely is the cause given what I saw?”
Why important: It’s the engine behind diagnostic testing, spam filtering, reliability and fault attribution, sensor fusion, and many ML workflows where you update prior knowledge with new evidence.
Quick example (classroom pepper): Let A = “Foreign”, B = “Likes pepper”. Assume P(A) = 0.20, P(B|A) = 0.80, P(B|A′) = 0.30. By total probability, P(B) = P(B|A)·P(A) + P(B|A′)·P(A′) = 0.80·0.20 + 0.30·0.80 = 0.40. Thus P(A|B) = (0.80·0.20)/0.40 = 0.40. Interpretation: among pepper-lovers, 40% are Foreign and 60% are USA.
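The pepper calculation above translates directly into a few lines of Python, using the same numbers from the text:

```python
# Bayes' theorem with the classroom-pepper numbers from the example
p_A = 0.20           # P(Foreign)
p_B_given_A = 0.80   # P(Likes pepper | Foreign)
p_B_given_notA = 0.30  # P(Likes pepper | USA)

# Total probability: P(B) = P(B|A)*P(A) + P(B|A')*P(A')
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Posterior: P(A|B) = P(B|A)*P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B

print(round(p_B, 4))          # 0.4
print(round(p_A_given_B, 4))  # 0.4
```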