5.2 Probability

Detailed Theory: Probability

1. Basic Concepts and Terminology

1.1 What is Probability?

Probability is a branch of mathematics that deals with calculating the likelihood of events occurring. It quantifies uncertainty.

Key Idea: Probability is a number between 0 and 1 that represents how likely an event is to occur.

  • 0 means the event is impossible

  • 1 means the event is certain

  • 0.5 means the event is equally likely to occur or not occur

1.2 Important Terms

a) Experiment

An action or process that leads to one of several possible outcomes.

Examples:

  • Tossing a coin

  • Rolling a die

  • Drawing a card from a deck

b) Sample Space (S)

The set of all possible outcomes of an experiment.

Examples:

  1. Tossing a coin: S = \{H, T\} (H = Head, T = Tail)

  2. Rolling a die: S = \{1, 2, 3, 4, 5, 6\}

  3. Tossing two coins: S = \{HH, HT, TH, TT\}

c) Event (E)

A subset of the sample space (one or more outcomes).

Examples for die roll:

  • Event A: Getting an even number = \{2, 4, 6\}

  • Event B: Getting a number > 4 = \{5, 6\}

d) Types of Events

  1. Simple Event: Single outcome (e.g., getting a 3 on die)

  2. Compound Event: Multiple outcomes (e.g., getting even number)

  3. Impossible Event: Event that cannot occur (probability = 0)

  4. Sure/Certain Event: Event that always occurs (probability = 1)

  5. Complementary Event (E' or Eᶜ): All outcomes NOT in E


2. Different Approaches to Probability

2.1 Classical (Theoretical) Probability

Used when all outcomes are equally likely.

Formula:

P(E) = \frac{\text{Number of favorable outcomes}}{\text{Total number of outcomes}} = \frac{n(E)}{n(S)}

Example: Probability of getting an even number when rolling a die:

S = \{1, 2, 3, 4, 5, 6\} \Rightarrow n(S) = 6

E = \{2, 4, 6\} \Rightarrow n(E) = 3

P(E) = \frac{3}{6} = \frac{1}{2} = 0.5
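
The same count-and-divide logic can be checked in code. Below is a minimal Python sketch (not part of the original notes; names like `sample_space` and `event` are illustrative) that enumerates the die outcomes and computes the classical probability exactly with fractions.

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}                 # all outcomes of one die roll
event = {x for x in sample_space if x % 2 == 0}   # favorable outcomes: even numbers

p_event = Fraction(len(event), len(sample_space))  # P(E) = n(E) / n(S)
print(p_event, float(p_event))                     # 1/2 0.5
```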

2.2 Empirical (Experimental) Probability

Based on actual experiments or observations.

Formula:

P(E) = \frac{\text{Number of times event occurred}}{\text{Total number of trials}}

Example: In 100 coin tosses, heads appeared 47 times.

Empirical probability of heads = \frac{47}{100} = 0.47
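
A quick simulation makes the idea concrete: repeat the experiment many times and divide the event count by the number of trials. This is a hedged Python sketch with an assumed seed of 42; the count it prints will generally differ from the 47/100 above.

```python
import random

random.seed(42)                       # assumed seed, only so the run is reproducible
trials = 100
heads = sum(random.choice("HT") == "H" for _ in range(trials))

print(heads, heads / trials)          # empirical P(heads); approaches 0.5 as trials grow
```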

2.3 Axiomatic Probability

Based on three axioms (rules):

Axiom 1: 0 \leq P(E) \leq 1 for any event E

Axiom 2: P(S) = 1 (the probability of the sample space is 1)

Axiom 3: For mutually exclusive events A and B: P(A \cup B) = P(A) + P(B)


3. Basic Probability Rules

3.1 Complement Rule

The probability that an event does NOT occur.

P(E') = 1 - P(E)

Example: Probability of NOT getting a 6 on a die roll:

P(\text{not 6}) = 1 - P(\text{getting 6}) = 1 - \frac{1}{6} = \frac{5}{6}

3.2 Addition Rule

For any two events A and B:

P(A \cup B) = P(A) + P(B) - P(A \cap B)

Special Case: Mutually Exclusive Events

Events that cannot occur together (no common outcomes).

For mutually exclusive events: P(A \cap B) = 0

So: P(A \cup B) = P(A) + P(B)

Example: Probability of getting 2 OR 5 on a die roll:

These are mutually exclusive: P(2 \cup 5) = P(2) + P(5) = \frac{1}{6} + \frac{1}{6} = \frac{2}{6} = \frac{1}{3}

3.3 Multiplication Rule

For independent events (occurrence of one doesn't affect the other):

P(A \cap B) = P(A) \times P(B)

Example: Probability of getting heads on two consecutive coin tosses:

P(H \text{ and } H) = P(H) \times P(H) = \frac{1}{2} \times \frac{1}{2} = \frac{1}{4}
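
The three rules above (complement, addition for mutually exclusive events, multiplication for independent events) can all be verified by exact enumeration. The following Python sketch is illustrative only and uses fractions so there is no rounding.

```python
from fractions import Fraction
from itertools import product

# Complement rule on a die: P(not 6) = 1 - P(6)
p_six = Fraction(1, 6)
assert 1 - p_six == Fraction(5, 6)

# Addition rule for mutually exclusive outcomes: P(2 or 5) = P(2) + P(5)
die = range(1, 7)
p_2_or_5 = Fraction(sum(x in (2, 5) for x in die), 6)
assert p_2_or_5 == Fraction(1, 6) + Fraction(1, 6)

# Multiplication rule for independent tosses: P(H then H) = P(H) * P(H)
tosses = list(product("HT", repeat=2))                  # sample space of two coin tosses
p_hh = Fraction(sum(t == ("H", "H") for t in tosses), len(tosses))
assert p_hh == Fraction(1, 2) * Fraction(1, 2)

print(1 - p_six, p_2_or_5, p_hh)                        # 5/6 1/3 1/4
```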


4. Conditional Probability

4.1 Definition

Probability of event A occurring given that event B has already occurred.

Notation: P(A|B) (read as "probability of A given B")

4.2 Formula

P(A|B) = \frac{P(A \cap B)}{P(B)}, provided P(B) \neq 0

This can be rearranged as: P(A \cap B) = P(B) \times P(A|B)

4.3 Understanding with Example

In a class of 30 students: 18 girls, 12 boys. 6 girls and 4 boys wear glasses.

Let: A = student wears glasses, B = student is a girl

Find probability that a student wears glasses given they're a girl:

P(A|B) = \frac{P(A \cap B)}{P(B)} = \frac{6/30}{18/30} = \frac{6}{18} = \frac{1}{3}

Direct method: Among 18 girls, 6 wear glasses, so probability = \frac{6}{18} = \frac{1}{3}
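
The classroom example translates directly into code: compute P(A ∩ B) and P(B) from the counts and divide. A small Python sketch (counts taken from the example above, variable names assumed):

```python
from fractions import Fraction

total_students = 30
girls, girls_with_glasses = 18, 6

p_b = Fraction(girls, total_students)                       # P(B): student is a girl
p_a_and_b = Fraction(girls_with_glasses, total_students)    # P(A ∩ B): girl who wears glasses

p_a_given_b = p_a_and_b / p_b                               # definition of conditional probability
print(p_a_given_b)                                          # 1/3

# Direct method: restrict attention to the 18 girls and count who wears glasses
assert p_a_given_b == Fraction(girls_with_glasses, girls)
```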

4.4 Independent Events using Conditional Probability

Events A and B are independent if:

P(A|B) = P(A) or P(B|A) = P(B)

This means knowing B occurred doesn't change probability of A.

Test for independence: A and B are independent if and only if P(A \cap B) = P(A) \times P(B)


5. Bayes' Theorem

5.1 Statement

For events A and B with P(B) \neq 0:

P(A|B) = \frac{P(B|A) \times P(A)}{P(B)}

5.2 Extended Form

If B_1, B_2, \ldots, B_n form a partition of the sample space, then:

P(B_i|A) = \frac{P(A|B_i) \times P(B_i)}{\sum_{j=1}^{n} P(A|B_j) \times P(B_j)}

5.3 Application Example

Medical Test Problem:

  • Disease affects 1% of population (P(D) = 0.01)

  • Test is 95% accurate: P(positive|D) = 0.95, P(negative|no D) = 0.95

  • Find P(D|positive)

Solution using Bayes':

Let D = has disease, T+ = test positive

We want: P(D|T+) = \frac{P(T+|D) \times P(D)}{P(T+)}

P(T+|D) = 0.95, \quad P(D) = 0.01

P(T+) = P(T+|D)P(D) + P(T+|\text{no } D)P(\text{no } D) = 0.95 \times 0.01 + 0.05 \times 0.99 = 0.0095 + 0.0495 = 0.059

So: P(D|T+) = \frac{0.95 \times 0.01}{0.059} = \frac{0.0095}{0.059} \approx 0.161

There is only about a 16.1% chance of having the disease given a positive test!
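
The arithmetic behind this counter-intuitive result is easy to reproduce. A minimal Python sketch of the same calculation (numbers taken from the problem statement above):

```python
p_disease = 0.01          # P(D): prevalence of the disease
p_pos_given_d = 0.95      # sensitivity, P(T+ | D)
p_neg_given_no_d = 0.95   # specificity, P(T- | no D)

p_pos_given_no_d = 1 - p_neg_given_no_d               # false-positive rate
# Law of total probability: P(T+) over the two cases "disease" and "no disease"
p_pos = p_pos_given_d * p_disease + p_pos_given_no_d * (1 - p_disease)

# Bayes' theorem: P(D | T+)
p_d_given_pos = p_pos_given_d * p_disease / p_pos
print(round(p_pos, 4), round(p_d_given_pos, 3))       # 0.059 0.161
```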


6. Random Variables

6.1 Definition

A variable whose values are determined by outcomes of a random experiment.

Notation: Capital letters (X, Y, Z) for random variables, lowercase (x, y, z) for specific values.

6.2 Types of Random Variables

a) Discrete Random Variable

Takes countable number of distinct values.

Examples:

  • Number of heads in 3 coin tosses (0, 1, 2, 3)

  • Number of students absent in a class

b) Continuous Random Variable

Takes any value within an interval (uncountably many possible values), typically arising from measurement.

Examples:

  • Height of students

  • Time taken to complete a task

  • Temperature

6.3 Probability Distribution

For a discrete random variable X, lists all possible values and their probabilities.

Requirements:

  1. 0 \leq P(X = x_i) \leq 1 for all i

  2. \sum P(X = x_i) = 1

Example: Probability distribution of number of heads in 2 coin tosses:

X (number of heads) | P(X)
--- | ---
0 | 1/4
1 | 1/2
2 | 1/4
Total | 1
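
The table can also be built by brute-force enumeration of the sample space, which is a useful sanity check for small experiments. A hedged Python sketch that enumerates all four outcomes of two tosses:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

outcomes = list(product("HT", repeat=2))               # HH, HT, TH, TT
head_counts = Counter(o.count("H") for o in outcomes)  # number of heads in each outcome

dist = {x: Fraction(c, len(outcomes)) for x, c in sorted(head_counts.items())}
print(dist)                 # {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
assert sum(dist.values()) == 1                         # requirement 2: probabilities sum to 1
```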


7. Expectation (Mean) and Variance

7.1 Expected Value (Mean)

Average value we expect if experiment is repeated many times.

For discrete random variable X:

E(X) = \mu = \sum x_i \cdot P(X = x_i)

Example: Expected number of heads in 2 coin tosses:

E(X) = 0 \times \frac{1}{4} + 1 \times \frac{1}{2} + 2 \times \frac{1}{4} = 0 + 0.5 + 0.5 = 1

We expect 1 head on average.

7.2 Properties of Expectation

  1. E(c) = c, where c is a constant

  2. E(cX) = cE(X)

  3. E(X + Y) = E(X) + E(Y)

  4. E(X - Y) = E(X) - E(Y)

  5. For independent X, Y: E(XY) = E(X)E(Y)

7.3 Variance

Measures how spread out the values are from the mean.

Formula: \text{Var}(X) = E[(X - \mu)^2] = E(X^2) - [E(X)]^2

Calculation formula: \text{Var}(X) = \sum x_i^2 P(X = x_i) - \left[\sum x_i P(X = x_i)\right]^2

7.4 Standard Deviation

Square root of variance: \sigma = \sqrt{\text{Var}(X)}
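
For a discrete distribution all three quantities reduce to simple weighted sums. The sketch below, assuming the two-coin-toss distribution from Section 6.3, computes E(X), Var(X), and σ in Python:

```python
from math import sqrt

# Distribution of the number of heads in 2 coin tosses (Section 6.3)
dist = {0: 0.25, 1: 0.5, 2: 0.25}

mean = sum(x * p for x, p in dist.items())             # E(X) = sum of x * P(X = x)
e_x2 = sum(x**2 * p for x, p in dist.items())          # E(X^2)
variance = e_x2 - mean**2                              # Var(X) = E(X^2) - [E(X)]^2
std_dev = sqrt(variance)

print(mean, variance, round(std_dev, 4))               # 1.0 0.5 0.7071
```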

7.5 Properties of Variance

  1. \text{Var}(c) = 0, where c is a constant

  2. \text{Var}(cX) = c^2 \text{Var}(X)

  3. \text{Var}(X + c) = \text{Var}(X)

  4. For independent X, Y: \text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y)


8. Important Probability Distributions

8.1 Binomial Distribution

a) Conditions for Binomial Experiment

  1. Fixed number of trials (n)

  2. Each trial has exactly two outcomes: success or failure

  3. Probability of success (p) is constant for each trial

  4. Trials are independent

b) Probability Mass Function

For random variable X = number of successes in n trials:

P(X = r) = \binom{n}{r} p^r (1-p)^{n-r}, \quad r = 0, 1, 2, \ldots, n

where \binom{n}{r} = \frac{n!}{r!(n-r)!} (the binomial coefficient)

c) Mean and Variance

Mean: \mu = np

Variance: \sigma^2 = np(1-p)

Standard Deviation: \sigma = \sqrt{np(1-p)}

d) Example

Probability of getting exactly 3 heads in 5 coin tosses:

n = 5, p = 0.5, r = 3

P(X=3) = \binom{5}{3} (0.5)^3 (0.5)^2 = 10 \times 0.125 \times 0.25 = 10 \times 0.03125 = 0.3125
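
The same probability can be computed with a short helper built on Python's math.comb; this is an illustrative sketch, and binomial_pmf is a name chosen here, not a library function.

```python
from math import comb

def binomial_pmf(r, n, p):
    """P(X = r) for a Binomial(n, p) random variable."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

print(binomial_pmf(3, n=5, p=0.5))                     # 0.3125, matching the hand calculation
print(sum(binomial_pmf(r, 5, 0.5) for r in range(6)))  # 1.0: probabilities over r = 0..5 sum to 1
```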

8.2 Poisson Distribution

a) When to Use

Models the number of events occurring in a fixed interval of time or space, when events occur independently at a constant average rate.

Examples:

  • Number of calls at a call center per hour

  • Number of accidents at an intersection per day

b) Probability Mass Function

P(X = r) = \frac{e^{-\lambda} \lambda^r}{r!}, \quad r = 0, 1, 2, \ldots

where \lambda = average number of events in the interval

c) Mean and Variance

Mean = Variance = \lambda
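
A short Python sketch of the Poisson pmf using only the standard library; the call-centre rate λ = 4 is an assumed example value, not taken from the notes.

```python
from math import exp, factorial

def poisson_pmf(r, lam):
    """P(X = r) for a Poisson random variable with average rate lam."""
    return exp(-lam) * lam**r / factorial(r)

lam = 4                                                   # assumed: 4 calls per hour on average
print(round(poisson_pmf(2, lam), 4))                      # P(exactly 2 calls) ≈ 0.1465

# Numerical check that the mean equals lam (truncating the infinite sum at r = 100)
mean = sum(r * poisson_pmf(r, lam) for r in range(100))
print(round(mean, 4))                                     # ≈ 4.0
```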

8.3 Normal Distribution (Gaussian Distribution)

a) Characteristics

  • Bell-shaped curve

  • Symmetric about mean

  • Mean = median = mode

  • Total area under curve = 1

b) Probability Density Function

f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}, \quad -\infty < x < \infty

where \mu = mean, \sigma = standard deviation

c) Standard Normal Distribution

Special case with \mu = 0, \sigma = 1

Z-score: z = \frac{x - \mu}{\sigma}

Converts any normal distribution to standard normal.

d) Empirical Rule (68-95-99.7 Rule)

For normal distribution:

  • About 68% of data falls within 1σ of mean

  • About 95% of data falls within 2σ of mean

  • About 99.7% of data falls within 3σ of mean
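
Both the z-score transformation and the 68-95-99.7 percentages can be checked with the standard library's statistics.NormalDist. In the sketch below the height figures (mean 170 cm, standard deviation 10 cm) are assumed example values.

```python
from statistics import NormalDist

mu, sigma = 170, 10                        # assumed example: heights in cm
x = 185
z = (x - mu) / sigma
print(z)                                   # 1.5 standard deviations above the mean

std_normal = NormalDist(0, 1)
for k in (1, 2, 3):
    p_within = std_normal.cdf(k) - std_normal.cdf(-k)    # P(-k < Z < k)
    print(k, round(p_within, 4))           # ≈ 0.6827, 0.9545, 0.9973
```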


9. Permutations and Combinations in Probability

9.1 Fundamental Counting Principle

If event A can occur in m ways and event B can occur in n ways, then:

  • A AND B (performed together or in sequence) can occur in m × n ways

  • A OR B (when exactly one of the two is chosen, so they cannot both happen) can occur in m + n ways

9.2 Permutations

Arrangements where order matters.

Number of permutations of n distinct objects taken r at a time:

P(n, r) = \frac{n!}{(n-r)!}

Special case: All n objects: P(n, n) = n!

9.3 Combinations

Selections where order doesn't matter.

Number of combinations of n distinct objects taken r at a time:

C(n, r) = \binom{n}{r} = \frac{n!}{r!(n-r)!}
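
Python's standard library exposes both counts directly (math.perm and math.comb, available in Python 3.8+), so the formulas are easy to verify; the values n = 5, r = 3 below are just an example.

```python
from math import perm, comb, factorial

n, r = 5, 3
print(perm(n, r))        # 60 ordered arrangements: n! / (n - r)!
print(comb(n, r))        # 10 unordered selections: n! / (r! (n - r)!)

# Each combination of r objects can be ordered in r! ways, so perm = comb * r!
assert perm(n, r) == comb(n, r) * factorial(r)
```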

9.4 Application in Probability

Example: The probability of being dealt one specific 5-card poker hand is \frac{1}{\binom{52}{5}} = \frac{1}{2,598,960}, since all \binom{52}{5} hands are equally likely.


10. Solved Examples

Example 1: Basic Probability

A bag contains 5 red, 3 blue, and 2 green balls. One ball is drawn at random. Find: a) P(red) b) P(not blue) c) P(red or green)

Solution: Total balls = 5 + 3 + 2 = 10

a) P(\text{red}) = \frac{5}{10} = 0.5

b) P(\text{not blue}) = 1 - P(\text{blue}) = 1 - \frac{3}{10} = \frac{7}{10} = 0.7

c) P(\text{red or green}) = P(\text{red}) + P(\text{green}) = \frac{5}{10} + \frac{2}{10} = \frac{7}{10} = 0.7

Example 2: Conditional Probability

In a class, 60% are girls. 40% of girls and 30% of boys wear glasses. If a student wears glasses, what's the probability they're a girl?

Solution: Let: G = girl, B = boy, W = wears glasses

Given: P(G) = 0.6, P(B) = 0.4, P(W|G) = 0.4, P(W|B) = 0.3

We want: P(G|W)

By Bayes': P(G|W) = \frac{P(W|G)P(G)}{P(W)}

P(W) = P(W|G)P(G) + P(W|B)P(B) = 0.4 \times 0.6 + 0.3 \times 0.4 = 0.24 + 0.12 = 0.36

So: P(G|W) = \frac{0.4 \times 0.6}{0.36} = \frac{0.24}{0.36} = \frac{2}{3} \approx 0.667

Example 3: Binomial Distribution

A test has 10 multiple-choice questions, each with 4 options. If a student guesses randomly, find probability of: a) Exactly 5 correct b) At least 8 correct

Solution: n = 10, p = \frac{1}{4} = 0.25 (probability of correct guess)

a) P(X=5) = \binom{10}{5} (0.25)^5 (0.75)^5

= 252 \times 0.0009766 \times 0.2373 = 252 \times 0.0002318 = 0.0584

b) P(X \geq 8) = P(X=8) + P(X=9) + P(X=10)

P(X=8) = \binom{10}{8} (0.25)^8 (0.75)^2 = 45 \times 0.00001526 \times 0.5625 = 0.000386

P(X=9) = \binom{10}{9} (0.25)^9 (0.75)^1 = 10 \times 0.000003815 \times 0.75 = 0.0000286

P(X=10) = \binom{10}{10} (0.25)^{10} (0.75)^0 = 1 \times 0.000000954 = 0.000000954

Sum = 0.000386 + 0.0000286 + 0.000000954 = 0.0004155
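
Both parts of this example can be double-checked numerically. A hedged Python sketch reusing the binomial pmf idea from Section 8.1 (binomial_pmf is a local helper, not a library function); it prints about 0.000416 for part b), slightly above the 0.0004155 obtained by rounding the three terms before adding.

```python
from math import comb

def binomial_pmf(r, n, p):
    """P(X = r) for a Binomial(n, p) random variable."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

n, p = 10, 0.25
print(round(binomial_pmf(5, n, p), 4))                    # a) P(X = 5) ≈ 0.0584

tail = sum(binomial_pmf(r, n, p) for r in range(8, 11))   # b) P(X >= 8) = P(8) + P(9) + P(10)
print(round(tail, 6))                                     # ≈ 0.000416
```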

Example 4: Expected Value

A game: Roll a die. If you get 6, you win $10. If you get 4 or 5, you win $5. Otherwise, you lose $3. What is the expected value?

Solution: Let X = winnings

Outcome | X (winnings) | P(X)
--- | --- | ---
6 | $10 | 1/6
4 or 5 | $5 | 2/6
1, 2, 3 | -$3 | 3/6

E(X) = 10 \times \frac{1}{6} + 5 \times \frac{2}{6} + (-3) \times \frac{3}{6} = \frac{10}{6} + \frac{10}{6} - \frac{9}{6} = \frac{11}{6} \approx \$1.83

A positive expected value means the game is favorable to the player on average.


11. Important Formulas Summary

11.1 Basic Probability

  • P(E) = \frac{n(E)}{n(S)}

  • P(E') = 1 - P(E)

  • P(A \cup B) = P(A) + P(B) - P(A \cap B)

  • Mutually exclusive: P(A \cap B) = 0

11.2 Conditional Probability

  • P(A|B) = \frac{P(A \cap B)}{P(B)}

  • Independent events: P(A \cap B) = P(A)P(B)

11.3 Bayes' Theorem

  • P(A|B) = \frac{P(B|A)P(A)}{P(B)}

11.4 Random Variables

  • Expected value: E(X) = \sum x_i P(X = x_i)

  • Variance: \text{Var}(X) = E(X^2) - [E(X)]^2

11.5 Binomial Distribution

  • P(X=r) = \binom{n}{r} p^r (1-p)^{n-r}

  • Mean: np

  • Variance: np(1-p)

11.6 Poisson Distribution

  • P(X=r) = \frac{e^{-\lambda} \lambda^r}{r!}

  • Mean = Variance = \lambda

11.7 Counting

  • Permutations: P(n,r) = \frac{n!}{(n-r)!}

  • Combinations: C(n,r) = \frac{n!}{r!(n-r)!}


12. Exam Tips and Common Mistakes

12.1 Common Mistakes to Avoid

  1. Confusing "and" vs "or":

    • "A and B" means both occur (intersection)

    • "A or B" means at least one occurs (union)

  2. Misapplying addition rule: Remember to subtract intersection unless events are mutually exclusive

  3. Confusing independent vs mutually exclusive:

    • Independent: P(A \cap B) = P(A)P(B)

    • Mutually exclusive: P(A \cap B) = 0

  4. Forgetting to check conditions for binomial/Poisson distributions

  5. Probability > 1 or < 0: Impossible! Check calculations if this happens

12.2 Problem-Solving Strategy

  1. Define events clearly: Write what each event represents

  2. Identify what's asked: "Given that", "and", "or", etc.

  3. Choose correct formula: Based on conditions

  4. Check independence/mutual exclusivity

  5. Draw Venn diagrams or tree diagrams for visualization

12.3 Quick Checks

  1. Total probability always sums to 1

  2. Conditional probability: 0 \leq P(A|B) \leq 1

  3. Complement: P(A) + P(A') = 1

  4. Expected value interpretation: Long-run average

  5. Variance: Always non-negative


13. Real-World Applications

13.1 Everyday Life

  1. Weather forecasting: Probability of rain

  2. Games of chance: Cards, dice, lotteries

  3. Insurance: Calculating premiums based on risk

13.2 Science and Engineering

  1. Quality control: Probability of defective items

  2. Reliability engineering: Probability of system failure

  3. Medical testing: Sensitivity and specificity

13.3 Finance

  1. Stock market: Probability of price movements

  2. Risk assessment: Probability of loan default

  3. Portfolio management: Expected returns

This comprehensive theory covers all aspects of probability with detailed explanations and examples, making it easy to understand while being thorough enough for exam preparation.
