5.2 Probability
Detailed Theory: Probability
1. Basic Concepts and Terminology
1.1 What is Probability?
Probability is a branch of mathematics that deals with calculating the likelihood of events occurring. It quantifies uncertainty.
Key Idea: Probability is a number between 0 and 1 that represents how likely an event is to occur.
0 means the event is impossible
1 means the event is certain
0.5 means the event is equally likely to occur or not occur
1.2 Important Terms
a) Experiment
An action or process that leads to one of several possible outcomes.
Examples:
Tossing a coin
Rolling a die
Drawing a card from a deck
b) Sample Space (S)
The set of all possible outcomes of an experiment.
Examples:
Tossing a coin: S={H,T} (H = Head, T = Tail)
Rolling a die: S={1,2,3,4,5,6}
Tossing two coins: S={HH,HT,TH,TT}
c) Event (E)
A subset of the sample space (one or more outcomes).
Examples for die roll:
Event A: Getting an even number = {2,4,6}
Event B: Getting a number > 4 = {5,6}
d) Types of Events
Simple Event: Single outcome (e.g., getting a 3 on die)
Compound Event: Multiple outcomes (e.g., getting even number)
Impossible Event: Event that cannot occur (probability = 0)
Sure/Certain Event: Event that always occurs (probability = 1)
Complementary Event (E′ or Eᶜ): All outcomes NOT in E
2. Different Approaches to Probability
2.1 Classical (Theoretical) Probability
Used when all outcomes are equally likely.
Formula:
P(E) = Number of favorable outcomes / Total number of outcomes = n(E)/n(S)
Example: Probability of getting an even number when rolling a die:
S={1,2,3,4,5,6}⇒n(S)=6
E={2,4,6}⇒n(E)=3
P(E) = 3/6 = 1/2 = 0.5
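The counting definition above can be checked directly in code. A minimal Python sketch (the variable names are our own, for illustration) that builds the sample space, counts favorable outcomes, and uses exact fractions:

```python
# Classical probability by counting: P(even) on a fair six-sided die.
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}
event = {x for x in sample_space if x % 2 == 0}  # favorable outcomes {2, 4, 6}

p_even = Fraction(len(event), len(sample_space))  # n(E) / n(S)
print(p_even)  # 1/2
```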
2.2 Empirical (Experimental) Probability
Based on actual experiments or observations.
Formula:
P(E) = Number of times event occurred / Total number of trials
Example: In 100 coin tosses, heads appeared 47 times.
Empirical probability of heads = 47/100 = 0.47
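Empirical probability is easy to explore by simulation. A hedged sketch (the seed and trial count are arbitrary choices, not from the notes) showing that the observed frequency of heads hovers near the theoretical 0.5:

```python
# Empirical probability: simulate coin tosses and compare the observed
# frequency of heads with the theoretical value 0.5.
import random

random.seed(42)  # fixed seed so the run is reproducible
trials = 10_000
heads = sum(1 for _ in range(trials) if random.random() < 0.5)

p_heads_empirical = heads / trials
print(round(p_heads_empirical, 3))  # prints a value near 0.5
```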
2.3 Axiomatic Probability
Based on three axioms (rules):
Axiom 1: 0≤P(E)≤1 for any event E
Axiom 2: P(S)=1 (probability of sample space is 1)
Axiom 3: For mutually exclusive events A and B: P(A∪B)=P(A)+P(B)
3. Basic Probability Rules
3.1 Complement Rule
The probability that an event does NOT occur.
P(E′)=1−P(E)
Example: Probability of NOT getting a 6 on a die roll:
P(not 6) = 1 − P(getting 6) = 1 − 1/6 = 5/6
3.2 Addition Rule
For any two events A and B:
P(A∪B)=P(A)+P(B)−P(A∩B)
Special Case: Mutually Exclusive Events
Events that cannot occur together (no common outcomes).
For mutually exclusive events: P(A∩B)=0
So: P(A∪B)=P(A)+P(B)
Example: Probability of getting 2 OR 5 on a die roll:
These are mutually exclusive: P(2∪5) = P(2) + P(5) = 1/6 + 1/6 = 2/6 = 1/3
3.3 Multiplication Rule
For independent events (occurrence of one doesn't affect the other):
P(A∩B)=P(A)×P(B)
Example: Probability of getting heads on two consecutive coin tosses:
P(H and H) = P(H) × P(H) = 1/2 × 1/2 = 1/4
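The multiplication rule can be verified by brute-force enumeration of the two-coin sample space {HH, HT, TH, TT}. A small illustrative sketch:

```python
# Verify the multiplication rule for independent events by enumerating
# the sample space of two coin tosses.
from itertools import product

sample_space = list(product("HT", repeat=2))  # [('H','H'), ('H','T'), ...]
both_heads = [s for s in sample_space if s == ("H", "H")]

p_enumerated = len(both_heads) / len(sample_space)  # 1/4 by counting
p_by_rule = 0.5 * 0.5                               # P(H) x P(H)
print(p_enumerated, p_by_rule)  # 0.25 0.25
```

Both routes give 1/4, which is exactly what independence guarantees.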
4. Conditional Probability
4.1 Definition
Probability of event A occurring given that event B has already occurred.
Notation: P(A∣B) (read as "probability of A given B")
4.2 Formula
P(A∣B) = P(A∩B) / P(B), provided P(B) ≠ 0
This can be rearranged as: P(A∩B)=P(B)×P(A∣B)
4.3 Understanding with Example
In a class of 30 students: 18 girls, 12 boys. 6 girls and 4 boys wear glasses.
Let: A = student wears glasses, B = student is a girl
Find probability that a student wears glasses given they're a girl:
P(A∣B) = P(A∩B) / P(B) = (6/30) / (18/30) = 6/18 = 1/3
Direct method: Among the 18 girls, 6 wear glasses, so the probability = 6/18 = 1/3
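The same classroom example in code, computing P(A∣B) = P(A∩B)/P(B) from the raw counts (an illustrative sketch; the variable names are our own):

```python
# Conditional probability from counts: 30 students, 18 girls,
# 6 of the girls wear glasses.
total = 30
girls = 18
girls_with_glasses = 6

p_b = girls / total                      # P(B): student is a girl
p_a_and_b = girls_with_glasses / total   # P(A ∩ B): girl who wears glasses
p_a_given_b = p_a_and_b / p_b            # P(A | B) = P(A ∩ B) / P(B)
print(round(p_a_given_b, 4))  # 0.3333
```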
4.4 Independent Events using Conditional Probability
Events A and B are independent if:
P(A∣B)=P(A) or P(B∣A)=P(B)
This means knowing B occurred doesn't change probability of A.
Test for independence: A and B are independent if and only if: P(A∩B)=P(A)×P(B)
5. Bayes' Theorem
5.1 Statement
For events A and B with P(B) ≠ 0:
P(A∣B) = P(B∣A) × P(A) / P(B)
5.2 Extended Form
If B1,B2,…,Bn form a partition of sample space, then:
P(Bi∣A) = P(A∣Bi)×P(Bi) / Σ(j=1 to n) P(A∣Bj)×P(Bj)
5.3 Application Example
Medical Test Problem:
Disease affects 1% of population (P(D) = 0.01)
Test is 95% accurate: P(positive|D) = 0.95, P(negative|no D) = 0.95
Find P(D|positive)
Solution using Bayes':
Let D = has disease, T+ = test positive
We want: P(D∣T+) = P(T+∣D) × P(D) / P(T+)
P(T+∣D)=0.95, P(D)=0.01
P(T+) = P(T+∣D)P(D) + P(T+∣no D)P(no D) = 0.95×0.01 + 0.05×0.99 = 0.0095 + 0.0495 = 0.059
So: P(D∣T+) = (0.95×0.01) / 0.059 = 0.0095/0.059 ≈ 0.161
Only 16.1% chance of having disease given positive test!
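The medical-test calculation above translates line-for-line into code (a sketch of the computation only, using the probabilities given in the example):

```python
# Bayes' theorem for the medical-test example: P(D) = 0.01,
# P(T+|D) = 0.95, P(T-|no D) = 0.95.
p_d = 0.01
p_pos_given_d = 0.95
p_pos_given_not_d = 0.05  # 1 - specificity (false-positive rate)

# Law of total probability: overall chance of a positive test
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' theorem
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))  # 0.161
```

The surprisingly low result comes from the disease's low base rate: most positive tests are false positives from the healthy 99%.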
6. Random Variables
6.1 Definition
A variable whose values are determined by outcomes of a random experiment.
Notation: Capital letters (X, Y, Z) for random variables, lowercase (x, y, z) for specific values.
6.2 Types of Random Variables
a) Discrete Random Variable
Takes countable number of distinct values.
Examples:
Number of heads in 3 coin tosses (0, 1, 2, 3)
Number of students absent in a class
b) Continuous Random Variable
Takes infinitely many values in an interval (measurable).
Examples:
Height of students
Time taken to complete a task
Temperature
6.3 Probability Distribution
For a discrete random variable X, lists all possible values and their probabilities.
Requirements:
0≤P(X=xi)≤1 for all i
∑P(X=xi)=1
Example: Probability distribution of number of heads in 2 coin tosses:
X = 0: P(X=0) = 1/4
X = 1: P(X=1) = 1/2
X = 2: P(X=2) = 1/4
Total: 1
7. Expectation (Mean) and Variance
7.1 Expected Value (Mean)
Average value we expect if experiment is repeated many times.
For discrete random variable X:
E(X)=μ=∑xi⋅P(X=xi)
Example: Expected number of heads in 2 coin tosses:
E(X) = 0×(1/4) + 1×(1/2) + 2×(1/4) = 0 + 0.5 + 0.5 = 1
We expect 1 head on average.
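Computing the same expectation from the distribution table (illustrative sketch; the dictionary layout is our own choice):

```python
# Expected value of X = number of heads in 2 coin tosses,
# using the distribution worked out above.
distribution = {0: 0.25, 1: 0.5, 2: 0.25}  # value -> probability

expected = sum(x * p for x, p in distribution.items())
print(expected)  # 1.0
```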
7.2 Properties of Expectation
E(c)=c where c is constant
E(cX)=cE(X)
E(X+Y)=E(X)+E(Y)
E(X−Y)=E(X)−E(Y)
For independent X, Y: E(XY)=E(X)E(Y)
7.3 Variance
Measures how spread out the values are from the mean.
Formula: Var(X) = E[(X−μ)²] = E(X²) − [E(X)]²
Calculation formula: Var(X) = Σ xi²·P(X=xi) − [Σ xi·P(X=xi)]²
7.4 Standard Deviation
Square root of variance: σ = √Var(X)
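Variance and standard deviation for the same two-coin distribution, using the shortcut Var(X) = E(X²) − [E(X)]² (illustrative sketch):

```python
# Variance via Var(X) = E(X^2) - [E(X)]^2, plus standard deviation,
# for X = number of heads in 2 coin tosses.
import math

distribution = {0: 0.25, 1: 0.5, 2: 0.25}

mean = sum(x * p for x, p in distribution.items())        # E(X)   = 1.0
mean_sq = sum(x**2 * p for x, p in distribution.items())  # E(X^2) = 1.5
variance = mean_sq - mean**2                              # 0.5
std_dev = math.sqrt(variance)
print(variance, round(std_dev, 4))  # 0.5 0.7071
```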
7.5 Properties of Variance
Var(c)=0 where c is constant
Var(cX)=c²Var(X)
Var(X+c)=Var(X)
For independent X, Y: Var(X+Y)=Var(X)+Var(Y)
8. Important Probability Distributions
8.1 Binomial Distribution
a) Conditions for Binomial Experiment
Fixed number of trials (n)
Each trial has exactly two outcomes: success or failure
Probability of success (p) is constant for each trial
Trials are independent
b) Probability Mass Function
For random variable X = number of successes in n trials:
P(X=r) = C(n,r) p^r (1−p)^(n−r), r = 0, 1, 2, …, n
where C(n,r) = n!/(r!(n−r)!) (binomial coefficient)
c) Mean and Variance
Mean: μ=np
Variance: σ² = np(1−p)
Standard Deviation: σ = √(np(1−p))
d) Example
Probability of getting exactly 3 heads in 5 coin tosses:
n=5, p=0.5, r=3
P(X=3) = C(5,3)(0.5)^3(0.5)^2 = 10 × 0.125 × 0.25 = 0.3125
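The binomial formula maps directly onto Python's standard library, since `math.comb` computes the binomial coefficient. A sketch reproducing the worked example:

```python
# Binomial probability with math.comb: P(X = 3) for n = 5 fair coin tosses.
import math

def binomial_pmf(r, n, p):
    """P(X = r) = C(n, r) * p^r * (1 - p)^(n - r)."""
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

print(binomial_pmf(3, 5, 0.5))  # 0.3125
```

As a sanity check, the probabilities for r = 0..n sum to 1, as any probability distribution must.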
8.2 Poisson Distribution
a) When to Use
Models number of events occurring in a fixed interval of time/space, when events occur independently at constant average rate.
Examples:
Number of calls at a call center per hour
Number of accidents at an intersection per day
b) Probability Mass Function
P(X=r) = e^(−λ) λ^r / r!, r = 0, 1, 2, …
where λ = average number of events in the interval
c) Mean and Variance
Mean = Variance = λ
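The Poisson mass function is equally short in code. A sketch (λ = 3 is an arbitrary example rate, not from the notes) that also checks the probabilities sum to 1:

```python
# Poisson probability mass function: P(X = r) = e^(-lam) * lam^r / r!
import math

def poisson_pmf(r, lam):
    return math.exp(-lam) * lam**r / math.factorial(r)

# Probabilities over r = 0, 1, 2, ... sum to 1; check a long partial sum.
partial = sum(poisson_pmf(r, 3) for r in range(50))
print(round(poisson_pmf(2, 3), 4), round(partial, 6))  # 0.224 1.0
```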
8.3 Normal Distribution (Gaussian Distribution)
a) Characteristics
Bell-shaped curve
Symmetric about mean
Mean = median = mode
Total area under curve = 1
b) Probability Density Function
f(x) = (1/(σ√(2π))) e^(−(1/2)((x−μ)/σ)²), −∞ < x < ∞
where μ = mean, σ = standard deviation
c) Standard Normal Distribution
Special case with μ=0, σ=1
Z-score: z = (x−μ)/σ
Converts any normal distribution to standard normal.
d) Empirical Rule (68-95-99.7 Rule)
For normal distribution:
About 68% of data falls within 1σ of mean
About 95% of data falls within 2σ of mean
About 99.7% of data falls within 3σ of mean
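The empirical rule can be checked by sampling. A seeded simulation sketch (sample size and seed are arbitrary choices, and this is an illustration, not a proof):

```python
# Check the 68-95-99.7 rule by sampling from a standard normal (mu=0, sigma=1).
import random

random.seed(0)
n = 100_000
samples = [random.gauss(0, 1) for _ in range(n)]

within_1 = sum(abs(x) <= 1 for x in samples) / n
within_2 = sum(abs(x) <= 2 for x in samples) / n
within_3 = sum(abs(x) <= 3 for x in samples) / n
print(round(within_1, 2), round(within_2, 2), round(within_3, 3))
# roughly 0.68 0.95 0.997
```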
9. Permutations and Combinations in Probability
9.1 Fundamental Counting Principle
If event A can occur in m ways, and event B can occur in n ways, then:
A AND B can occur in m × n ways
A OR B can occur in m + n ways (when A and B cannot both occur)
9.2 Permutations
Arrangements where order matters.
Number of permutations of n distinct objects taken r at a time:
P(n,r) = n!/(n−r)!
Special case: All n objects: P(n,n)=n!
9.3 Combinations
Selections where order doesn't matter.
Number of combinations of n distinct objects taken r at a time:
C(n,r) = n!/(r!(n−r)!)
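Python's standard library exposes both counts directly (`math.perm` and `math.comb`, available since Python 3.8). A sketch with n = 5, r = 3 as an arbitrary example:

```python
# Permutations vs combinations with the standard library.
import math

n, r = 5, 3
permutations = math.perm(n, r)   # n!/(n-r)!      -> ordered arrangements
combinations = math.comb(n, r)   # n!/(r!(n-r)!)  -> unordered selections

print(permutations, combinations)  # 60 10
# Each combination corresponds to r! orderings:
print(permutations == combinations * math.factorial(r))  # True
```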
9.4 Application in Probability
Example: Probability of getting a specific poker hand.
10. Solved Examples
Example 1: Basic Probability
A bag contains 5 red, 3 blue, and 2 green balls. One ball is drawn at random. Find: a) P(red) b) P(not blue) c) P(red or green)
Solution: Total balls = 5 + 3 + 2 = 10
a) P(red) = 5/10 = 0.5
b) P(not blue) = 1 − P(blue) = 1 − 3/10 = 7/10 = 0.7
c) P(red or green) = P(red) + P(green) = 5/10 + 2/10 = 7/10 = 0.7
Example 2: Conditional Probability
In a class, 60% are girls. 40% of girls and 30% of boys wear glasses. If a student wears glasses, what's the probability they're a girl?
Solution: Let: G = girl, B = boy, W = wears glasses
Given: P(G)=0.6, P(B)=0.4, P(W∣G)=0.4, P(W∣B)=0.3
We want: P(G∣W)
By Bayes': P(G∣W) = P(W∣G)P(G) / P(W)
P(W) = P(W∣G)P(G) + P(W∣B)P(B) = 0.4×0.6 + 0.3×0.4 = 0.24 + 0.12 = 0.36
So: P(G∣W) = (0.4×0.6)/0.36 = 0.24/0.36 = 2/3 ≈ 0.667
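Example 2 in code, combining the law of total probability with Bayes' theorem (a sketch of the arithmetic only, using the probabilities stated in the problem):

```python
# Example 2: P(girl) = 0.6, P(glasses|girl) = 0.4, P(glasses|boy) = 0.3.
p_g, p_b = 0.6, 0.4
p_w_given_g, p_w_given_b = 0.4, 0.3

# Law of total probability: overall chance a student wears glasses
p_w = p_w_given_g * p_g + p_w_given_b * p_b   # 0.36

# Bayes' theorem
p_g_given_w = p_w_given_g * p_g / p_w         # 0.24 / 0.36
print(round(p_g_given_w, 3))  # 0.667
```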
Example 3: Binomial Distribution
A test has 10 multiple-choice questions, each with 4 options. If a student guesses randomly, find probability of: a) Exactly 5 correct b) At least 8 correct
Solution: n = 10, p = 1/4 = 0.25 (probability of a correct guess)
a) P(X=5) = C(10,5)(0.25)^5(0.75)^5 = 252 × 0.0009766 × 0.2373 ≈ 0.0584
b) P(X≥8) = P(X=8) + P(X=9) + P(X=10)
P(X=8) = C(10,8)(0.25)^8(0.75)^2 = 45 × 0.00001526 × 0.5625 ≈ 0.000386
P(X=9) = C(10,9)(0.25)^9(0.75)^1 = 10 × 0.000003815 × 0.75 ≈ 0.0000286
P(X=10) = C(10,10)(0.25)^10(0.75)^0 = 1 × 0.000000954 ≈ 0.000000954
Sum ≈ 0.000386 + 0.0000286 + 0.000000954 ≈ 0.000416
Example 4: Expected Value
A game: Roll a die. If you get 6, win $10. If you get 4 or 5, win $5. Otherwise, lose $3. What's expected value?
Solution: Let X = winnings
Outcome 6: winnings $10, probability 1/6
Outcome 4 or 5: winnings $5, probability 2/6
Outcome 1, 2, or 3: winnings −$3, probability 3/6
E(X) = 10×(1/6) + 5×(2/6) + (−3)×(3/6) = 10/6 + 10/6 − 9/6 = 11/6 ≈ $1.83
Positive expected value means game is favorable on average.
11. Important Formulas Summary
11.1 Basic Probability
P(E) = n(E)/n(S)
P(E′)=1−P(E)
P(A∪B)=P(A)+P(B)−P(A∩B)
Mutually exclusive: P(A∩B)=0
11.2 Conditional Probability
P(A∣B) = P(A∩B)/P(B)
Independent events: P(A∩B)=P(A)P(B)
11.3 Bayes' Theorem
P(A∣B) = P(B∣A)P(A)/P(B)
11.4 Random Variables
Expected value: E(X)=∑xiP(X=xi)
Variance: Var(X) = E(X²) − [E(X)]²
11.5 Binomial Distribution
P(X=r) = C(n,r) p^r (1−p)^(n−r)
Mean: np
Variance: np(1−p)
11.6 Poisson Distribution
P(X=r) = e^(−λ) λ^r / r!
Mean = Variance = λ
11.7 Counting
Permutations: P(n,r) = n!/(n−r)!
Combinations: C(n,r) = n!/(r!(n−r)!)
12. Exam Tips and Common Mistakes
12.1 Common Mistakes to Avoid
Confusing "and" vs "or":
"A and B" means both occur (intersection)
"A or B" means at least one occurs (union)
Misapplying addition rule: Remember to subtract intersection unless events are mutually exclusive
Confusing independent vs mutually exclusive:
Independent: P(A∩B)=P(A)P(B)
Mutually exclusive: P(A∩B)=0
Forgetting to check conditions for binomial/Poisson distributions
Probability > 1 or < 0: Impossible! Check calculations if this happens
12.2 Problem-Solving Strategy
Define events clearly: Write what each event represents
Identify what's asked: "Given that", "and", "or", etc.
Choose correct formula: Based on conditions
Check independence/mutual exclusivity
Draw Venn diagrams or tree diagrams for visualization
12.3 Quick Checks
Total probability always sums to 1
Conditional probability: 0≤P(A∣B)≤1
Complement: P(A)+P(A′)=1
Expected value interpretation: Long-run average
Variance: Always non-negative
13. Real-World Applications
13.1 Everyday Life
Weather forecasting: Probability of rain
Games of chance: Cards, dice, lotteries
Insurance: Calculating premiums based on risk
13.2 Science and Engineering
Quality control: Probability of defective items
Reliability engineering: Probability of system failure
Medical testing: Sensitivity and specificity
13.3 Finance
Stock market: Probability of price movements
Risk assessment: Probability of loan default
Portfolio management: Expected returns
This comprehensive theory covers all aspects of probability with detailed explanations and examples, making it easy to understand while being thorough enough for exam preparation.