Probability

Events and their probabilities

\[\Omega = \text{sample space}\]
\[\emptyset = \text{empty event}\]
\[P\{E\} = \text{probability of event E}\]

Example 1

A tossed die can produce one of 6 possible outcomes: 1 dot through 6 dots.


Set Operations

\[A \cup B = \text{union} \quad A \cap B = \text{intersection}\]
\[\overline{A} \text{ or } A^c = \text{complement} \quad A \backslash B = \text{difference}\]

(Venn diagrams illustrating union, intersection, complement, and difference)


Events: Disjoint, mutually exclusive, exhaustive

\[A \cap B = \emptyset \quad \text{(disjoint events)}\]
\[A_i \cap A_j = \emptyset \text{ for any } i \neq j \quad \text{(mutually exclusive events)}\]
\[A \cup B \cup C \cup \cdots = \Omega \quad \text{(exhaustive events)}\]

Example 2

Receiving a grade of A, B, or C for some course are mutually exclusive events; unfortunately, they are not exhaustive, because one can also receive a D or an F.


De Morgan's laws

\[\overline{A \cup B} = \overline{A} \cap \overline{B}\]
\[\overline{A \cap B} = \overline{A} \cup \overline{B}\]
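De Morgan's laws can be verified numerically on a small sample space. The sample space and events below are illustrative choices, not taken from the text:

```python
# Numerical check of De Morgan's laws using Python sets.
omega = set(range(1, 7))   # outcomes of one die toss (assumed sample space)
A = {1, 2, 3}              # "three or fewer dots" (assumed event)
B = {2, 4, 6}              # "even number of dots" (assumed event)

def complement(E):
    return omega - E

# complement of a union = intersection of complements
assert complement(A | B) == complement(A) & complement(B)
# complement of an intersection = union of complements
assert complement(A & B) == complement(A) | complement(B)
print("De Morgan's laws hold on this sample space")
```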

Rules of Probability

A collection $\mathfrak{M}$ of events is a sigma-algebra on sample space $\Omega$ if

\[\Omega \in \mathfrak{M}\]
\[E \in \mathfrak{M} \Rightarrow \overline{E} \in \mathfrak{M}\]
\[E_1, E_2, \ldots \in \mathfrak{M} \Rightarrow E_1 \cup E_2 \cup \ldots \in \mathfrak{M}\]


Axioms of Probability

Assume a sample space $\Omega$ and a sigma-algebra of events $\mathfrak{M}$ on it. Probability is a function of events with the domain $\mathfrak{M}$ and the range $[0, 1]$,

\[P : \mathfrak{M} \rightarrow [0, 1]\]

that satisfies the following two conditions,

\[P\{\Omega\} = 1\]
\[P \{E_1 \cup E_2 \cup \ldots \} = P\{E_1\} + P\{E_2\} + \ldots \quad \text{for any mutually exclusive events } E_1, E_2, \ldots\]

Computing probabilities of events

A sample space $\Omega$ consists of all possible outcomes; therefore, it occurs for sure, $P \{\Omega\} = 1$.

On the contrary, the empty event $\emptyset$ never occurs, $P \{\emptyset\} = 0$.

Consider an event that consists of some finite or countable collection of mutually exclusive outcomes, $E=\{ \omega_1, \omega_2, \ldots \}$,

\[P \{E\} = \sum_{\omega_k \in E} P\{\omega_k\}\]

Computing probabilities of events (cont.)

Probability of a union

\[P \{A \cup B \} = P \{A\} + P \{B\} - P \{A \cap B \}\]

Complement

\[P \{\overline{A} \} = 1 - P \{A\}\]

Intersection of independent events

Events $E_1, E_2, \ldots, E_n$ are independent if they occur independently of each other, i.e., occurrence of one event does not affect the probabilities of others.

\[P \{E_1 \cap \ldots \cap E_n \} = P \{ E_1 \} \cdot \ldots \cdot P \{ E_n \}\]

Example 4

There is a 1% probability for a hard drive to crash. Therefore, it has two backups, each having a 2% probability to crash, and all three components are independent of each other. The stored information is lost only in an unfortunate situation when all three devices crash. What is the probability that the information is saved?
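Example 4 combines the intersection rule for independent events with the complement rule; a short numerical check:

```python
# Example 4: information is lost only if all three independent devices crash,
# so multiply the individual crash probabilities, then take the complement.
p_hd, p_b1, p_b2 = 0.01, 0.02, 0.02    # crash probabilities from the example

p_all_crash = p_hd * p_b1 * p_b2       # intersection of independent events
p_saved = 1 - p_all_crash              # complement rule
print(round(p_saved, 6))               # 0.999996
```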


Equally likely outcomes

Suppose the sample space $\Omega$ consists of $n$ possible outcomes, $\omega_1, \ldots, \omega_n$, each having the same probability $1/n$.

Then the probability of any event $E$ consisting of $t$ outcomes equals

\[P \{E\} = \sum_{\omega_k \in E} \left(\frac{1}{n}\right) = t\left(\frac{1}{n}\right) = \frac{\#\text{ of favorable outcomes}}{\# \text{ of total outcomes}}\]

Example 5

Tossing a die results in 6 equally likely possible outcomes, identified by the number of dots from 1 to 6.
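The counting formula can be checked on the die example; the event "an even number of dots" below is an assumed illustration, not part of the original example:

```python
from fractions import Fraction

# Die toss: P{E} = (# of favorable outcomes) / (# of total outcomes),
# illustrated for the assumed event "an even number of dots".
omega = list(range(1, 7))
E = [w for w in omega if w % 2 == 0]
p = Fraction(len(E), len(omega))
print(p)   # 1/2
```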


Combinatorics

Sampling with replacement means that every sampled item is replaced into the initial set, so that any of the objects can be selected with probability 1/n at any time. In particular, the same object may be sampled more than once.

\[P_r(n, k) = n^k\]

Permutations

Sampling without replacement means that every sampled item is removed from further sampling, so the set of possibilities reduces by 1 after each selection.

\[P(n, k) = \frac{n!}{(n-k)!}\]

Combinations

\[C_r(n, k) = \binom{k+n-1}{k} = \frac{(k+n-1)!}{k!(n-1)!}\]
\[C(n, k) = \binom{n}{k} = \frac{n!}{k!(n-k)!}\]
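The four counting formulas above can be cross-checked with Python's standard library; the values $n = 5$, $k = 3$ are arbitrary choices for the check:

```python
import math

# Check the four counting formulas for n = 5 objects, k = 3 selections.
n, k = 5, 3

assert n**k == 125                     # ordered, with replacement: n^k
assert math.perm(n, k) == 60           # ordered, without replacement: n!/(n-k)!
assert math.comb(n, k) == 10           # unordered, without replacement
assert math.comb(k + n - 1, k) == 35   # unordered, with replacement
print("all four counting formulas verified")
```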

Conditional probability

Conditional probability of event $A$ given event $B$ is the probability that $A$ occurs when $B$ is known to occur.

\[P \{A \;|\; B\} = \text{conditional probability of A given B}\]
\[P \{A \;|\; B\} = \frac{\#\text{ of outcomes in } A \cap B}{\# \text{ of outcomes in } B} = \frac{ P\{A \cap B\}}{ P\{B\}}\]

(the first equality holds for equally likely outcomes; the second is the general definition)

Independence

Events $A$ and $B$ are independent if occurrence of $B$ does not affect the probability of $A$, i.e.,

\[P \{A \;|\; B\} = P \{A\}\]

Example 6

Ninety percent of flights depart on time. Eighty percent of flights arrive on time. Seventy-five percent of flights depart on time and arrive on time.
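The numbers in Example 6 suffice to compute a conditional probability and test for independence; a sketch with D = "departs on time" and A = "arrives on time":

```python
# Example 6: D = "departs on time", A = "arrives on time".
p_D, p_A, p_DA = 0.90, 0.80, 0.75

# Conditional probability of arriving on time given on-time departure
p_A_given_D = p_DA / p_D
print(round(p_A_given_D, 4))   # 0.8333

# Independence check: P{A|D} differs from P{A}, so D and A are dependent.
assert abs(p_A_given_D - p_A) > 1e-9
```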


Bayes Rule

Since $A \cap B = B \cap A$, we have $P\{B \;|\; A\} P\{A\} = P\{A \cap B\} = P\{A \;|\; B\} P\{B\}$, and therefore

\[P \{B \;|\; A\} = \frac{ P\{A \;|\; B\}P\{B\}}{ P\{A\} }\]

Example 7

On a midterm exam, students X, Y, and Z forgot to sign their papers. The professor knows that they can write a good exam with probabilities 0.8, 0.7, and 0.5, respectively. After the grading, he notices that two unsigned exams are good and one is bad. Given this information, and assuming that the students worked independently of each other, what is the probability that the bad exam belongs to student Z?
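Example 7 can be solved by conditioning on which student wrote the bad exam and applying Bayes' rule; a sketch of that computation:

```python
# Example 7: Bayes rule, conditioning on whose exam is the bad one.
p_good = {"X": 0.8, "Y": 0.7, "Z": 0.5}   # probabilities of a good exam

def p_bad_is(student):
    """P{exactly this student's exam is bad and the other two are good}."""
    p = 1.0
    for s, pg in p_good.items():
        p *= (1 - pg) if s == student else pg
    return p

total = sum(p_bad_is(s) for s in p_good)   # law of total probability
p_z = p_bad_is("Z") / total                # Bayes rule
print(round(p_z, 4))   # 0.5957
```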


Law of Total Probability

Consider some partition of the sample space $\Omega$ with mutually exclusive and exhaustive events $B_1 , \ldots, B_k$. It means that

\[B_i\cap B_j = \emptyset, \; \forall i\neq j \text{ and } B_1 \cup \ldots \cup B_k = \Omega\]

These events also partition the event $A$,

\[A = (A \cap B_1) \cup \ldots \cup (A \cap B_k)\]

Hence,

\[P\{A\} = \sum_{j=1}^k P \{ A \;|\; B_j\} P\{B_j\}\]

Example 8

There exists a test for a certain viral infection (including a virus attack on a computer network). It is 95% reliable for infected patients and 99% reliable for the healthy ones. Suppose that 4% of all the patients are infected with the virus. If the test shows positive results, what is the probability that a patient has the virus?
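Example 8 combines the law of total probability with Bayes' rule; a numerical sketch (interpreting "99% reliable for the healthy" as a 1% false-positive rate):

```python
# Example 8: P{virus | positive test} via Bayes rule.
p_virus = 0.04
p_pos_given_virus = 0.95           # test sensitivity for infected patients
p_pos_given_healthy = 1 - 0.99     # false-positive rate for healthy patients

# P{positive} by the law of total probability,
# partitioning on {virus, no virus}
p_pos = (p_pos_given_virus * p_virus
         + p_pos_given_healthy * (1 - p_virus))

p_virus_given_pos = p_pos_given_virus * p_virus / p_pos
print(round(p_virus_given_pos, 4))   # 0.7983
```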