The ASTA team
Consider an experiment.
The state space \(S\) is the set of all possible outcomes.
Example: We roll a die. The possible outcomes are \(S=\{1,2,3,4,5,6\}\).
Example: We measure wind speed (in m/s). The state space is \([0,\infty)\).
An event is a subset \(A\subseteq S\) of the state space.
Example: Rolling a die and getting an even number is the event \(A=\{2,4,6\}\).
Example: Measuring a wind speed of at least 5 m/s is the event \([5,\infty)\).
Consider two events \(A\) and \(B\).
The union \(A\cup B\) is the event that either \(A\) or \(B\) occurs.
The intersection \(A\cap B\) is the event that both \(A\) and \(B\) occur.
The complement \(A^c\) is the event that \(A\) does not occur.
Example: We roll a die and consider the events \(A=\{2,4,6\}\) that we get an even number and \(B=\{4,5,6\}\) that we get at least 4. Then
\(A\cup B = \{2,4,5,6\}\)
\(A\cap B = \{4,6\}\)
\(A^c = \{1,3,5\}\)
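These operations can be checked directly with Python's built-in sets; a minimal sketch of the die example above:

```python
S = {1, 2, 3, 4, 5, 6}   # state space for a die roll
A = {2, 4, 6}            # even number
B = {4, 5, 6}            # at least 4

print(A | B)   # union:        {2, 4, 5, 6}
print(A & B)   # intersection: {4, 6}
print(S - A)   # complement:   {1, 3, 5}
```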
The probability of an event \(A\) is the proportion of times \(A\) would occur if the experiment were repeated many times.
The probability of the event \(A\) is denoted \(P(A)\).
Example: We toss a coin and consider the event \(A=\{Head\}\). We expect to see heads half of the time, so \(P(A)=\tfrac{1}{2}\).
Example: We roll a die and consider the event \(A=\{4\}\). Then \(P(A)=\tfrac{1}{6}\).
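The frequency interpretation can be illustrated by simulation; a small Python sketch (the seed and sample size are arbitrary choices):

```python
import random

random.seed(1)                # reproducible runs
n = 100_000
# Estimate P(A) for A = {2, 4, 6} by the proportion of even rolls.
rolls = [random.randint(1, 6) for _ in range(n)]
p_hat = sum(r in {2, 4, 6} for r in rolls) / n
print(p_hat)                  # close to 1/2
```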
Properties:
\(P(S)=1\)
\(P(\emptyset)=0\)
\(0\leq P(A) \leq 1\) for all events \(A\)
Consider two events \(A\) and \(B\).
If \(A\) and \(B\) are mutually exclusive (never occur at the same time, i.e. \(A\cap B=\emptyset\)), then
\[ P(A\cup B) = P(A) + P(B). \]
Example: Roll a die and let \(A=\{1\}\) and \(B=\{2\}\). These events are mutually exclusive, so \[P(A\cup B) = P(A) + P(B) = \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{1}{3}. \]
If \(A\) and \(B\) are not mutually exclusive, we have the general addition rule \[ P(A\cup B) = P(A) + P(B) - P(A\cap B).\]
Example: Roll a die and let \(A=\{1,2\}\) and \(B=\{2,3\}\). Then \(A\cap B=\{2\}\) and \[P(A\cup B) = P(A) + P(B) - P(A\cap B) = \tfrac{1}{3} + \tfrac{1}{3} - \tfrac{1}{6} = \tfrac{1}{2}.\]
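The addition rule can be verified by enumeration; a minimal Python sketch using exact fractions, with the events \(A=\{1,2\}\) and \(B=\{2,3\}\) from the example above:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {1, 2}
B = {2, 3}

def P(event):
    # All 6 outcomes are equally likely.
    return Fraction(len(event & S), len(S))

print(P(A | B))                  # 1/2
print(P(A) + P(B) - P(A & B))    # 1/2, matching the addition rule
```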
Consider events \(A\) and \(B\).
The conditional probability of \(A\) given \(B\) is defined by \[ P(A|B) = \frac{P(A\cap B)}{P(B)}\] if \(P(B)>0\).
Example: We toss a coin two times. The possible outcomes are \(S=\{HH,HT,TH,TT\}\). Each outcome has probability \(\tfrac{1}{4}\). What is the probability of at least one head if we know there was at least one tail?
Solution: Let \(A=\{HH,HT,TH\}\) (at least one head) and \(B=\{HT,TH,TT\}\) (at least one tail). Then \(A\cap B=\{HT,TH\}\), so \[P(A|B) = \frac{P(A\cap B)}{P(B)} = \frac{2/4}{3/4} = \tfrac{2}{3}.\]
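The same answer comes out of a direct enumeration in Python (a small sketch):

```python
from fractions import Fraction
from itertools import product

S = ["".join(t) for t in product("HT", repeat=2)]   # HH, HT, TH, TT
A = {s for s in S if "H" in s}                      # at least one head
B = {s for s in S if "T" in s}                      # at least one tail

P = lambda E: Fraction(len(E), len(S))              # equally likely outcomes
print(P(A & B) / P(B))                              # 2/3
```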
Two events \(A\) and \(B\) are said to be independent if \[ P(A|B) = P(A).\]
Example: Consider again a coin tossed two times with possible outcomes \(HH,HT,TH,TT\).
Let \(A=\{\text{at least one H}\}\) and \(B=\{\text{at least one T}\}\).
We found that \(P(A|B) = \tfrac{2}{3}\) while \(P(A) = \tfrac{3}{4}\), so \(A\) and \(B\) are not independent.
Two events \(A\) and \(B\) are independent if and only if \[ P(A\cap B) = P(A)P(B).\]
Proof: \(A\) and \(B\) are independent if and only if \[ P(A)=P(A| B) = \frac{P(A\cap B)}{P(B)}. \] Multiplying by \(P(B)\) we get \(P(A)P(B)=P(A\cap B)\).
Example: Roll a die and let \(A=\{2,4,6\}\) be the event that we get an even number and \(B=\{1,2\}\) the event that we get at most 2. Then \(A\cap B=\{2\}\), so \[P(A\cap B) = \tfrac{1}{6} = \tfrac{1}{2}\cdot \tfrac{1}{3} = P(A)P(B),\] and hence \(A\) and \(B\) are independent.
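Again this is easy to confirm by enumeration; a minimal Python sketch:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # even number
B = {1, 2}      # at most 2

P = lambda E: Fraction(len(E), len(S))
print(P(A & B))      # 1/6
print(P(A) * P(B))   # 1/6, so A and B are independent
```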
A stochastic variable is a function that assigns a real number to every element of the state space.
Example: Toss a coin three times. The possible outcomes are \[S=\{HHH,HHT,HTH,HTT,THH,THT,TTH,TTT\}.\] Let \(X\) be the number of heads, so e.g. \(X(HHT)=2\) and \(X(TTT)=0\).
Example: Consider the question of whether a certain machine is defective. Define \(X=1\) if the machine is defective and \(X=0\) otherwise.
Example: \(X\) is the temperature in the lecture room.
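In code, a stochastic variable is literally a mapping from outcomes to numbers; a small Python sketch for the three coin tosses:

```python
from itertools import product

# X assigns to each outcome in S the number of heads it contains.
S = ["".join(t) for t in product("HT", repeat=3)]
X = {s: s.count("H") for s in S}

print(X["HHT"])   # 2
print(X["TTT"])   # 0
```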
A stochastic variable \(X\) may be
Discrete: \(X\) can take a finite or countably infinite list of values.
Examples:
Number of heads in 3 coin tosses (can take values \(0,1,2,3\))
Number of machines that break down over a year (can take values \(0,1,2,3,\ldots\))
Continuous: \(X\) takes values on a continuous scale.
Examples:
Wind speed measured in m/s (takes values in \([0,\infty)\))
The temperature in the lecture room
Let \(X\) be a discrete stochastic variable which can take the values \(x_1,x_2,\ldots\).
The distribution of \(X\) is given by the probability function, which is given by \[f(x_i)=P(X=x_i), \quad i=1,2,\ldots\]
Example: We toss a coin three times and let \(X\) be the number of heads. The possible outcomes are \[S=\{HHH,HHT,HTH,HTT,THH,THT,TTH,TTT\}.\] The probability function is \[f(0)=\tfrac{1}{8}, \quad f(1)=\tfrac{3}{8}, \quad f(2)=\tfrac{3}{8}, \quad f(3)=\tfrac{1}{8}.\]
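The probability function can be tabulated by listing the 8 equally likely outcomes; a short Python sketch:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

S = ["".join(t) for t in product("HT", repeat=3)]   # 8 equally likely outcomes
counts = Counter(s.count("H") for s in S)           # heads per outcome
for x in sorted(counts):
    print(x, Fraction(counts[x], len(S)))           # 0 1/8, 1 3/8, 2 3/8, 3 1/8
```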
Let \(X\) be a discrete random variable with probability function \(f\). The distribution function of \(X\) is given by \[F(x)=P(X\leq x) = \sum_{x_i \leq x} f(x_i), \quad x\in \mathbb{R}.\]
Example: For the three coin tosses, we have \[F(x)= \begin{cases} 0 & x<0 \\ \tfrac{1}{8} & 0\leq x<1 \\ \tfrac{4}{8} & 1\leq x<2 \\ \tfrac{7}{8} & 2\leq x<3 \\ 1 & x\geq 3. \end{cases}\]
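Since \(F\) just accumulates the probability function, it can be computed with a running sum; a minimal Python sketch:

```python
from itertools import accumulate

xs = [0, 1, 2, 3]
fs = [1/8, 3/8, 3/8, 1/8]   # probability function from above
Fs = list(accumulate(fs))   # F(0), F(1), F(2), F(3)
print(Fs)                   # [0.125, 0.5, 0.875, 1.0]
```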
The mean or expected value of a discrete random variable \(X\) with values \(x_1,x_2,\ldots\) and probability function \(f(x_i)\) is \[\mu = E(X) = \sum_{i} x_iP(X=x_i) = \sum_{i} x_if(x_i).\]
Interpretation: A weighted average of the possible values of \(X\), where each value is weighted by its probability. A sort of “center” value for the distribution.
The variance is the mean squared distance between the values of the variable and the mean value. More precisely, \[\sigma^2 = \sum_{i} (x_i-\mu)^2P(X=x_i) = \sum_{i} (x_i-\mu)^2f(x_i).\]
A high variance indicates that the values of \(X\) have a high probability of being far from the mean value.
The standard deviation is the square root of the variance \[\sigma = \sqrt{\sigma^2}.\]
The advantage of the standard deviation over the variance is that it is measured in the same units as \(X\).
Example: Let \(X\) be the number of heads in 3 coin tosses. What is the variance and standard deviation?
Solution: The mean is \(\mu = 0\cdot\tfrac{1}{8} + 1\cdot\tfrac{3}{8} + 2\cdot\tfrac{3}{8} + 3\cdot\tfrac{1}{8} = \tfrac{3}{2}\). The variance is \[\sigma^2 = \big(0-\tfrac{3}{2}\big)^2\tfrac{1}{8} + \big(1-\tfrac{3}{2}\big)^2\tfrac{3}{8} + \big(2-\tfrac{3}{2}\big)^2\tfrac{3}{8} + \big(3-\tfrac{3}{2}\big)^2\tfrac{1}{8} = \tfrac{3}{4},\] and the standard deviation is \(\sigma = \sqrt{3/4} \approx 0.87\).
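The same numbers fall out of a few lines of Python:

```python
from math import sqrt

xs = [0, 1, 2, 3]
fs = [1/8, 3/8, 3/8, 1/8]

mu = sum(x * f for x, f in zip(xs, fs))                # 1.5
var = sum((x - mu) ** 2 * f for x, f in zip(xs, fs))   # 0.75
print(mu, var, sqrt(var))                              # 1.5 0.75 0.866...
```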
The distribution of a continuous random variable \(X\) is given by a probability density function \(f\), which is a function satisfying
\(f(x)\) is defined for all \(x\) in \(\mathbb{R}\),
\(f(x)\geq 0\) for all \(x\) in \(\mathbb{R}\),
\(\int_{-\infty}^{\infty} f(x)dx = 1\).
The probability that \(X\) lies between the values \(a\) and \(b\) is given by
\[P(a<X<b) = \int_a^b f(x) dx.\]
Notes:
Condition 3 ensures that \(P(-\infty < X < \infty) = 1\).
The probability of \(X\) assuming a specific value \(a\) is zero, i.e. \(P(X=a)=0\).
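These conditions are easy to check numerically for a given density; a minimal sketch, assuming SciPy is available and using a made-up density \(f(x)=2x\) on \((0,1)\) purely for illustration:

```python
from scipy.integrate import quad

# Hypothetical density for illustration: f(x) = 2x on (0, 1), 0 elsewhere.
f = lambda x: 2 * x if 0 <= x <= 1 else 0.0

total, _ = quad(f, 0, 1)    # condition 3: integrates to 1
p, _ = quad(f, 0.5, 1)      # P(0.5 < X < 1)
print(total, p)             # 1.0  0.75
```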
The uniform distribution on the interval \((A,B)\) has density \[ f(x)= \begin{cases} \frac{1}{B-A} & A \leq x \leq B \\ 0 & \text{otherwise} \end{cases} \]
Example: If \(X\) has a uniform distribution on \((0,1)\), find \(P(\tfrac{1}{3}<X\leq \tfrac{2}{3})\).
\[P\left(\tfrac{1}{3}<X\leq \tfrac{2}{3}\right) =P\left(\tfrac{1}{3}<X < \tfrac{2}{3}\right) + P\left(X = \tfrac{2}{3}\right)\\ = \int_{1/3}^{2/3}f(x)dx + 0 =\int_{1/3}^{2/3}1dx = \frac{1}{3}.\]
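A quick simulation check of this probability (a NumPy sketch; the seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, size=100_000)
print(np.mean((1/3 < x) & (x <= 2/3)))   # close to 1/3
```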
Let \(X\) be a continuous random variable with probability density \(f\). The distribution function of \(X\) is given by \[F(x)=P(X\leq x) = \int_{-\infty}^{x} f(y) dy, \quad x\in \mathbb{R}.\]
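For instance, for the uniform distribution on \((0,1)\) the distribution function becomes \[F(x)= \int_{-\infty}^{x} f(y)\,dy = \begin{cases} 0 & x<0 \\ x & 0\leq x\leq 1 \\ 1 & x>1. \end{cases}\]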
Consider again the uniform distribution on the interval \((0,1)\) with density \[ f(x)= \begin{cases} 1 & 0 \leq x \leq 1 \\ 0 & \text{otherwise} \end{cases} \] Find the mean and variance.
Solution: The mean is \[\mu = E(X) =\int_{-\infty}^\infty xf(x) dx = \int_{0}^1 x \cdot 1 dx = \left[\tfrac{1}{2}x^2\right]_0^1 = \tfrac{1}{2},\] and the variance is computed using the formula \[\sigma^2 = E(X^2) - E(X)^2 = \int_{-\infty}^\infty x^2 f(x) dx-\mu^2 = \int_{0}^1 x^2dx-\mu^2 \\ = \left[\tfrac{1}{3}x^3\right]_0^1-\Big(\tfrac{1}{2}\Big)^2 = \tfrac{1}{3} - \tfrac{1}{4} = \tfrac{1}{12}.\]
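A simulation check of these values (a NumPy sketch; sample size arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, size=1_000_000)
print(x.mean(), x.var())   # close to 1/2 = 0.5 and 1/12 ≈ 0.0833
```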
Let \(X\) be a random variable and \(a,b\) be constants. Then,
\(E(aX+b) = aE(X) + b\)
\(\mathrm{Var}(aX+b) = a^2\mathrm{Var}(X)\)
Example: If \(X\) has mean \(\mu\) and variance \(\sigma^2\), then the standardized variable \(Z = \frac{X-\mu}{\sigma}\) has mean \(E(Z) = \frac{1}{\sigma}(E(X)-\mu) = 0\) and variance \(\mathrm{Var}(Z) = \frac{1}{\sigma^2}\mathrm{Var}(X) = 1\).
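These rules can be illustrated by simulation; a small NumPy sketch using a uniform \(X\) (so \(E(X)=\tfrac{1}{2}\), \(\mathrm{Var}(X)=\tfrac{1}{12}\)) and arbitrary constants \(a=3\), \(b=2\):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, size=1_000_000)   # E(X) = 1/2, Var(X) = 1/12
a, b = 3.0, 2.0
y = a * x + b
print(y.mean(), y.var())   # close to a/2 + b = 3.5 and a**2/12 = 0.75
```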