\[\newcommand{\E}{\mathrm{E}} \newcommand{\Var}{\mathrm{Var}} \newcommand{\Cov}{\mathrm{Cov}}\]
A random variable \(X\) is a function from the sample space to the real numbers:
\[ \begin{eqnarray} X:\text{Sample Space} \rightarrow \mathbb{R} \end{eqnarray} \]
Example: the number of casualties in a war (a single numerical summary, rather than the full set of possible outcomes)
Example: three units are each assigned to treatment (\(T\)) or control (\(C\)), and \(X\) counts the number of treated units. Defining the function
\[ \begin{equation} X = \left \{ \begin{array} {ll} 0 \text{ if } (C, C, C) \\ 1 \text{ if } (T, C, C) \text{ or } (C, T, C) \text{ or } (C, C, T) \\ 2 \text{ if } (T, T, C) \text{ or } (T, C, T) \text{ or } (C, T, T) \\ 3 \text{ if } (T, T, T) \end{array} \right. \end{equation} \]In other words:
\[ \begin{eqnarray} X( (C, C, C) ) & = & 0 \\ X( (T, C, C)) & = & 1 \\ X((T, C, T)) & = & 2 \\ X((T, T, T)) & = & 3 \end{eqnarray} \]
Doing this for all eight equally likely outcomes gives
\[ \begin{eqnarray} p(X = 0) & = & P(C, C, C) = \frac{1}{8}\\ p(X = 1) & = & P(T, C, C) + P(C, T, C) + P(C, C, T) = \frac{3}{8} \\ p(X = 2) & = & P(T, T, C) + P(T, C, T) + P(C, T, T) = \frac{3}{8} \\ p(X = 3) & = & P(T, T, T) = \frac{1}{8} \end{eqnarray} \]\(p(X = a) = 0\), for all \(a \notin \{0, 1, 2, 3\}\)
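To check these numbers, a minimal Python sketch (the enumeration approach and all names are ours, not part of the notes):

```python
from itertools import product
from collections import Counter

# All 2^3 = 8 equally likely assignments of three units to T or C
outcomes = list(product("TC", repeat=3))

# X maps each outcome to the number of treated units
counts = Counter(outcome.count("T") for outcome in outcomes)

for x in sorted(counts):
    print(f"P(X = {x}) = {counts[x]}/{len(outcomes)}")
# P(X = 0) = 1/8, P(X = 1) = 3/8, P(X = 2) = 3/8, P(X = 3) = 1/8
```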
Probability Mass Function: For a discrete random variable \(X\), define the probability mass function \(p_X(x)\) as
\[ \begin{eqnarray} p_X(x) & = & P(X = x) \end{eqnarray} \]Note that
\[\sum_x p_{X}(x) = 1\]
Can also add probabilities over smaller sets \(S\) of possible values of \(X\)
\[\Pr(X \in S) = \sum_{x \in S} p_X (x)\]For example, if \(X\) is the number of heads obtained in two independent tosses of a fair coin, the probability of at least one head is
\[\Pr (X > 0) = \sum_{x=1}^2 p_X (x) = \frac{1}{2} + \frac{1}{4} = \frac{3}{4}\]
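The same event probability in Python, summing a hard-coded PMF over the event \(S = \{1, 2\}\) (a sketch, not part of the original notes):

```python
# PMF of X = number of heads in two fair coin tosses
pmf = {0: 1/4, 1: 1/2, 2: 1/4}

# Pr(X in S) is the sum of the PMF over S; here S = {1, 2}, i.e. at least one head
prob_at_least_one = sum(p for x, p in pmf.items() if x > 0)
print(prob_at_least_one)  # 0.75
```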
Suppose \(X\) is a random variable, with \(X \in \{0, 1\}\) and \(P(X = 1) = \pi\). Then we will say that \(X\) is a Bernoulli random variable,
\[ \begin{eqnarray} p_X(k) & = & \pi^{k} (1- \pi)^{1 - k} \nonumber \end{eqnarray} \]
for \(k \in \{0,1\}\), and \(p_X(k) = 0\) otherwise.
We will (equivalently) say that
\[ \begin{eqnarray} X & \sim & \text{Bernoulli}(\pi) \nonumber \end{eqnarray} \]
Suppose we flip a fair coin and \(Y = 1\) if the outcome is Heads.
\[ \begin{eqnarray} Y & \sim & \text{Bernoulli}(1/2) \nonumber \\ p(1) & = & (1/2)^{1} (1- 1/2)^{ 1- 1} = 1/2 \nonumber \\ p(0) & = & (1/2)^{0} (1- 1/2)^{1 - 0} = (1- 1/2) \nonumber \end{eqnarray} \]
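A direct transcription of the Bernoulli PMF into Python (the function name `bernoulli_pmf` is our own):

```python
def bernoulli_pmf(k: int, pi: float) -> float:
    """PMF of a Bernoulli(pi) random variable, zero off {0, 1}."""
    if k not in (0, 1):
        return 0.0
    return pi**k * (1 - pi)**(1 - k)

print(bernoulli_pmf(1, 0.5), bernoulli_pmf(0, 0.5))  # 0.5 0.5
```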
Suppose \(X\) is a random variable that counts the number of successes in \(N\) independent and identically distributed Bernoulli trials, each with success probability \(\pi\). Then \(X\) is a Binomial random variable,
\[ \begin{eqnarray} p_X(k) & = & {{N}\choose{k}}\pi^{k} (1- \pi)^{N-k} \nonumber \end{eqnarray} \]
for \(k \in \{0, 1, 2, \ldots, N\}\) and \(p_X(k) = 0\) otherwise, where \(\binom{N}{k} = \frac{N!}{(N-k)! k!}\). Equivalently,
\[ \begin{eqnarray} X & \sim & \text{Binomial}(N, \pi) \nonumber \end{eqnarray} \]
Returning to our example: let \(Z\) be the number of units assigned to treatment among \(N = 3\) units
\[ \begin{eqnarray} Z & \sim & \text{Binomial}(3, 1/2)\\ p(0) & = & {{3}\choose{0}} (1/2)^{0} (1- 1/2)^{3-0} = 1 \times \frac{1}{8}\\ p(1) & = & {{3}\choose{1}} (1/2)^{1} (1 - 1/2)^{2} = 3 \times \frac{1}{8} \\ p(2) & = & {{3}\choose{2}} (1/2)^{2} (1- 1/2)^1 = 3 \times \frac{1}{8} \\ p(3) & = & {{3}\choose{3}} (1/2)^{3} (1 - 1/2)^{0} = 1 \times \frac{1}{8} \end{eqnarray} \]
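A sketch verifying these four probabilities with Python's `math.comb` (the helper name is ours):

```python
from math import comb

def binomial_pmf(k: int, n: int, pi: float) -> float:
    """PMF of a Binomial(n, pi) random variable, zero off {0, ..., n}."""
    if not 0 <= k <= n:
        return 0.0
    return comb(n, k) * pi**k * (1 - pi)**(n - k)

# Z = number of treated units out of N = 3, each treated with probability 1/2
for k in range(4):
    print(f"p({k}) = {binomial_pmf(k, 3, 0.5)}")  # 0.125, 0.375, 0.375, 0.125
```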
Suppose \(X\) is a random variable that counts the number of tosses of a coin, with head probability \(p\), needed for a head to come up the first time. Then \(X\) is a geometric random variable, and its PMF is
\[ \begin{eqnarray} p_X(k) & = & (1 - p)^{k-1}p, \quad k = 1, 2, \ldots \end{eqnarray} \]\((1 - p)^{k-1}p\) is the probability of the sequence consisting of \(k-1\) successive tails followed by a head. This is a valid PMF because
\[ \begin{align} \sum_{k=1}^{\infty} p_X(k) &= \sum_{k=1}^{\infty} (1 - p)^{k-1}p \\ &= p \sum_{k=1}^{\infty} (1 - p)^{k-1} \\& = p \times \frac{1}{1 - (1-p)} \\ &= 1 \end{align} \]
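A quick numerical check of this geometric series in Python, truncating the infinite sum at a large \(k\) (an illustration; the choice \(p = 0.3\) is ours):

```python
# Numerically confirm the geometric PMF sums to 1 (truncated at k = 199)
p = 0.3
total = sum((1 - p)**(k - 1) * p for k in range(1, 200))
print(total)  # ~1.0 (the tail beyond k = 199 is negligible)
```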
Suppose \(X\) is a random variable that takes on values \(X \in \{0, 1, 2, \ldots\}\) and that \(\Pr(X = k) = p_X(k)\) is,
\[ \begin{eqnarray} p_X(k) & = & e^{-\lambda} \frac{\lambda^{k}}{k!}, \quad k = 0,1,2,\ldots \end{eqnarray} \]
for \(k \in \{0, 1, 2, \ldots\}\) and \(0\) otherwise. \(X\) follows a Poisson distribution with rate parameter \(\lambda\):
\[ \begin{eqnarray} X & \sim & \text{Poisson}(\lambda) \nonumber \end{eqnarray} \]
Suppose the number of threats a president makes follows a Poisson distribution with \(\lambda = 5\). What is the probability the president makes exactly ten threats?
\[ \begin{eqnarray} P(X = 10) & = & e^{-5} \frac{5^{10}}{10!} \approx 0.018 \end{eqnarray} \]
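Evaluating this in Python (a sketch; `poisson_pmf` is our own helper):

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """PMF of a Poisson(lam) random variable."""
    return exp(-lam) * lam**k / factorial(k)

print(poisson_pmf(10, 5))  # ~0.0181
```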
The Poisson PMF with parameter \(\lambda\) is a good approximation for a binomial PMF with parameters \(n\) and \(p\), provided \(\lambda = np\), \(n\) is very large, and \(p\) is very small:
\[e^{-\lambda} \frac{\lambda^{k}}{k!} \approx {{n}\choose{k}}p^{k} (1- p)^{n-k}, \quad \text{if } k \ll n\]
Sometimes using the Poisson PMF results in simpler models and easier calculations
Using the binomial PMF with \(n = 100\) trials, success probability \(p = 0.01\), and \(k = 5\) successes:
\[\frac{100!}{95! 5!} \times 0.01^5 (1 - 0.01)^{95} = 0.00290\]
Using the Poisson PMF with \(\lambda = np = 100 \times 0.01 = 1\)
\[e^{-1} \frac{1}{5!} = 0.00306\]
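Both calculations side by side in Python (a sketch reproducing the numbers above):

```python
from math import comb, exp, factorial

n, p, k = 100, 0.01, 5
lam = n * p  # = 1

binom = comb(n, k) * p**k * (1 - p)**(n - k)
poisson = exp(-lam) * lam**k / factorial(k)

print(binom)    # ~0.002898
print(poisson)  # ~0.003066
```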
Consider spinning a wheel of fortune many times. At each spin, one of the numbers \(m_1, m_2, \ldots, m_n\) comes up with corresponding probability \(p_1, p_2, \ldots, p_n\), and this is your monetary reward from that spin. What is the amount of money that you expect to get per spin?
The expected value (also known as the expectation or the mean) of a random variable \(X\) with PMF \(p_X\) is
\[ \begin{eqnarray} \E[X] & = & \sum_{x:p_X(x)>0} x \, p_X(x) \end{eqnarray} \]
where the sum ranges over all values of \(X\) that have probability greater than 0
Suppose \(X\) is the number of units assigned to treatment, as in our earlier example.
\[ \begin{equation} X = \left \{ \begin{array} {ll} 0 \text{ if } (C, C, C) \\ 1 \text{ if } (T, C, C) \text{ or } (C, T, C) \text{ or } (C, C, T) \\ 2 \text{ if } (T, T, C) \text{ or } (T, C, T) \text{ or } (C, T, T) \\ 3 \text{ if } (T, T, T) \end{array} \right. \end{equation} \]
What is \(\E[X]\)?
\[ \begin{eqnarray} \E[X] & = & 0\times \frac{1}{8} + 1 \times \frac{3}{8} + 2 \times \frac{3}{8} + 3 \times \frac{1}{8} \\ & = & 1.5 \end{eqnarray} \]
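The same PMF-weighted sum in Python (a sketch with the PMF hard-coded as a dict):

```python
# Expected value of X as a PMF-weighted sum
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # 1.5
```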
The expected value gives us a measure of central tendency.
The variance of \(X\) is defined as the expected value of the random variable \((X - \E[X])^2\):
\[ \begin{align} \Var(X) &= \E[(X - \E[X])^2] \end{align} \]
We define the standard deviation of \(X\) as \(\sigma_X = \sqrt{\Var(X)}\).
Let \(X\) be a random variable with PMF \(p_X\), and let \(g(X)\) be a function of \(X\). Then, the expected value of the random variable \(g(X)\) is given by
\[\E[g(X)] = \sum_{x} g(x) p_X(x)\]This rule lets us rewrite our variance formula:
\[ \begin{align} \Var(X) &= \E[(X - \E[X])^2] \\ &= \E[X^2 - 2X\E[X] + \E[X]^2] \\ &= \E[X^2] - 2\E[X]^2 + \E[X]^2 \\ &= \E[X^2] - \E[X]^2 \end{align} \]
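A numerical check that both forms agree, using the treatment-count PMF from above (a sketch; names are ours):

```python
# Verify Var(X) = E[X^2] - E[X]^2 on the treatment-count PMF
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
EX = sum(x * p for x, p in pmf.items())       # E[X] = 1.5
EX2 = sum(x**2 * p for x, p in pmf.items())   # E[X^2] = 3, by the expected value rule
var_direct = sum((x - EX)**2 * p for x, p in pmf.items())
print(var_direct, EX2 - EX**2)  # both 0.75
```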
Suppose \(Y \sim \text{Bernoulli}(\pi)\)
\[ \begin{eqnarray} \E[Y] & = & 1 \times P(Y = 1) + 0 \times P(Y = 0) \nonumber \\ & = & \pi + 0 (1 - \pi) = \pi \nonumber \\ \Var(Y) & = & \E[Y^2] - \E[Y]^2 \nonumber \\ \E[Y^2] & = & 1^{2} P(Y = 1) + 0^{2} P(Y = 0) \nonumber \\ & = & \pi \nonumber \\ \Var(Y) & = & \pi - \pi^{2} \nonumber \\ & = & \pi(1 - \pi) \nonumber \end{eqnarray} \]
What is the maximum variance? Since \(\pi - \pi^2\) is maximized where its derivative \(1 - 2\pi\) vanishes, the maximum occurs at \(\pi = 0.5\):
\[ \begin{eqnarray} \Var(Y) & = & \pi - \pi^{2} \nonumber \\ & = & 0.5(1 - 0.5) \nonumber \\ & = & 0.25 \nonumber \end{eqnarray} \]
A Binomial random variable is a sum of \(N\) i.i.d. Bernoulli random variables:
\[Z = \sum_{i=1}^{N} Y_{i} \text{ where } Y_{i} \sim \text{Bernoulli}(\pi)\]
\[ \begin{eqnarray} \E[Z] & = & \E[Y_{1} + Y_{2} + Y_{3} + \ldots + Y_{N}] \\ & = & \sum_{i=1}^{N} \E[Y_{i}] \\ & = & N \pi \\ \Var(Z) & = & \sum_{i=1}^{N} \Var(Y_{i}) \quad \text{(by independence)} \\ & = & N \pi (1-\pi) \end{eqnarray} \]
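A simulation sketch confirming these moments (the values \(N = 3\), \(\pi = 0.5\) and all names are our own choices):

```python
import random

random.seed(0)
N, pi, reps = 3, 0.5, 100_000

# Z is a sum of N independent Bernoulli(pi) draws
draws = [sum(random.random() < pi for _ in range(N)) for _ in range(reps)]

mean = sum(draws) / reps
var = sum((z - mean)**2 for z in draws) / reps
print(mean, var)  # close to N*pi = 1.5 and N*pi*(1-pi) = 0.75
```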
Suppose \(1\) wins a war with probability \(\pi\); let \(Y = 1\) if \(1\) wins and \(Y = 0\) otherwise. Then
\[ \begin{eqnarray} Y &\sim & \text{Bernoulli}(\pi) \end{eqnarray} \]
Suppose \(1\) receives a benefit \(B\) if it wins and pays a cost \(c\) for fighting, win or lose. What is \(1\)’s expected utility from fighting a war?
\[ \begin{eqnarray} \E[U(\text{war})] & = & (\text{Utility}|\text{win})\times P(\text{win}) + (\text{Utility}| \text{lose})\times P(\text{lose}) \\ &= & (B - c) P(Y = 1) + (- c) P(Y = 0 ) \\ & = & B \times P(Y = 1) - c(P(Y = 1) + P(Y = 0)) \\ & = & B \pi - c \end{eqnarray} \]
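A minimal sketch of this calculation (the function name and the example values \(B = 10\), \(c = 2\), \(\pi = 0.6\) are our own):

```python
def expected_war_utility(B: float, c: float, pi: float) -> float:
    """E[U(war)] = (B - c) * pi + (-c) * (1 - pi) = B * pi - c."""
    return (B - c) * pi + (-c) * (1 - pi)

print(expected_war_utility(B=10, c=2, pi=0.6))  # 4.0 == 10 * 0.6 - 2
```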
This expected utility is what \(1\) weighs against its alternatives when deciding whether to go to war.
For a discrete random variable \(X\), the cumulative distribution function (CDF) \(F_X\) gives the probability \(\Pr (X \leq x)\). For every \(x\),
\[F_X(x) = \Pr (X \leq x) = \sum_{k \leq x} p_X(k)\]
Every random variable with a PMF also has a CDF.
If \(X\) is discrete and takes integer values, the PMF and the CDF can be obtained from each other by summing or differencing:
\[F_X(k) = \sum_{i = -\infty}^k p_X(i),\] \[p_X(k) = \Pr (X \leq k) - \Pr (X \leq k-1) = F_X(k) - F_X(k-1)\]
for all integers \(k\)
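A sketch of this summing/differencing relationship on the treatment-count PMF (names ours):

```python
# Build the CDF by summing the PMF, then recover the PMF by differencing
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

cdf, running = {}, 0.0
for k in sorted(pmf):
    running += pmf[k]
    cdf[k] = running  # F_X(k) = Pr(X <= k)

recovered = {k: cdf[k] - cdf.get(k - 1, 0.0) for k in sorted(cdf)}
print(cdf)        # {0: 0.125, 1: 0.5, 2: 0.875, 3: 1.0}
print(recovered)  # matches the original PMF
```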